CN108763514B - Information display method and mobile terminal - Google Patents


Info

Publication number
CN108763514B
CN108763514B
Authority
CN
China
Prior art keywords
video image
mobile terminal
display screen
target
acquiring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810542452.0A
Other languages
Chinese (zh)
Other versions
CN108763514A (en)
Inventor
邱靖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810542452.0A priority Critical patent/CN108763514B/en
Publication of CN108763514A publication Critical patent/CN108763514A/en
Application granted granted Critical
Publication of CN108763514B publication Critical patent/CN108763514B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M1/00: Substation equipment, e.g. for use by subscribers
    • H04M1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724: User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448: User interfaces with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454: User interfaces adapting functionality according to context-related or environment-related conditions

Landscapes

  • Engineering & Computer Science (AREA)
  • Environmental & Geological Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

The invention relates to the field of communications technology and provides an information display method and a mobile terminal, aiming to solve the problem that existing mobile terminals support only a single interaction mode. The method includes the following steps: acquiring a first scene feature of a first object; obtaining, from a pre-stored database and based on the first scene feature, a target object matching the first object, and obtaining a target video image containing the target object; and displaying the target video image on at least one display screen. In this way, the user does not need to operate on the screen of the mobile terminal; instead, the user can make an interactive action toward the mobile terminal, and the mobile terminal displays a video image matching that action, so the interaction mode is flexible.

Description

Information display method and mobile terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to an information display method and a mobile terminal.
Background
With the rapid development of mobile terminals, users no longer use them only for communication but also for social and entertainment purposes. In daily use, a user can control the mobile terminal to display corresponding information by performing touch operations on the screen. However, as users' demands on mobile terminals grow, controlling the displayed information solely through touch operations on the screen is a single interaction mode that can hardly satisfy users' needs for interacting with the terminal.
Disclosure of Invention
The embodiments of the present invention provide an information display method and a mobile terminal, aiming to solve the problem that existing mobile terminals support only a single interaction mode.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an information display method, including:
acquiring a first scene feature of a first object;
acquiring a target object matched with the first object from a pre-stored database based on the first scene characteristic, and acquiring a target video image containing the target object;
displaying the target video image on the at least one display screen.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including:
the first acquisition module is used for acquiring first scene characteristics of a first object;
the acquisition module is used for acquiring a target object matched with the first object from a pre-stored database based on the first scene characteristic and acquiring a target video image containing the target object;
and the display module is used for displaying the target video image on the at least one display screen.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps in the information display method as described above when executing the computer program.
In a fourth aspect, the embodiment of the present invention further provides a readable storage medium, where a computer program is stored on the readable storage medium, and when the computer program is executed by a processor, the computer program implements the steps in the information display method described above.
In the embodiments of the present invention, a first scene feature of a first object is collected; a target object matching the first object is obtained from a pre-stored database based on the first scene feature, and a target video image containing the target object is obtained; and the target video image is displayed on at least one display screen. In this way, the user does not need to operate on the screen of the mobile terminal; instead, the user can make an interactive action toward the mobile terminal, and the mobile terminal displays a video image matching that action, so the interaction mode is flexible.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
FIG. 1 is a flow chart of an information display method according to an embodiment of the present invention;
FIG. 2 is a second flowchart of an information display method according to an embodiment of the present invention;
FIG. 3 is a third flowchart of an information displaying method according to an embodiment of the present invention;
fig. 4 is one of the structural diagrams of a mobile terminal according to an embodiment of the present invention;
fig. 5 is a second block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 6 is a structural diagram of a display module in a mobile terminal according to an embodiment of the present invention;
fig. 7 is a third block diagram of a mobile terminal according to an embodiment of the present invention;
fig. 8 is a fourth structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of an information display method according to an embodiment of the present invention, and as shown in fig. 1, the method includes the following steps:
step 101, collecting a first scene characteristic of a first object.
The first object may be a person or an object, and the first scene feature may be biometric information or a shape feature of the first object.
For example, when the first object is a human hand, the first scene feature may be a gesture feature, such as a "heart" gesture; when the first object is an inanimate object, the first scene feature may be its shape feature; when the first object is a person's lips, the first scene feature may be a lip feature.
The mobile terminal may acquire the first scene feature of the first object through a camera.
Step 102, acquiring a target object matched with the first object from a pre-stored database based on the first scene feature, and acquiring a target video image containing the target object.
The database may include video images for presentation pre-stored by the mobile terminal.
In this step, the mobile terminal compares the first scene feature with the features of objects in the pre-stored video images and determines a target object whose features match the first object, that is, an object whose feature similarity to the first scene feature of the first object is greater than a preset value; it then acquires a target video image containing that target object.
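The matching step above can be sketched as a nearest-neighbor lookup over feature vectors. This is a minimal illustration, not the patent's implementation: the cosine metric, the 0.8 threshold, and the database entries are all assumptions.

```python
import math

SIMILARITY_THRESHOLD = 0.8  # stand-in for the patent's "preset value"

def cosine_similarity(a, b):
    # similarity between two feature vectors, in [0, 1] for non-negative features
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def match_target(scene_feature, database):
    """Return the database entry whose object feature best matches, or None."""
    best, best_sim = None, SIMILARITY_THRESHOLD
    for entry in database:
        sim = cosine_similarity(scene_feature, entry["feature"])
        if sim > best_sim:
            best, best_sim = entry, sim
    return best

# hypothetical pre-stored database of objects and the videos containing them
database = [
    {"object": "heart", "feature": [0.9, 0.1, 0.0], "video": "heart.mp4"},
    {"object": "star",  "feature": [0.1, 0.9, 0.2], "video": "star.mp4"},
]
match = match_target([0.88, 0.12, 0.05], database)
```

Here the gesture feature is close to the "heart" entry, so that entry and its video are selected; any entry below the threshold is ignored.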
Step 103, displaying the target video image on the at least one display screen.
In this step, the target video image is displayed on any one display screen, or on all display screens, of the mobile terminal. For example, when the mobile terminal detects that the user makes a "heart" gesture, it determines, based on the features of that gesture, that the matching image is a heart-shaped image, and displays a video image containing the heart-shaped image on the screen.
Because the target object matches the features of the first object, the target video image can simulate the first object, thereby realizing interaction with the mobile terminal.
In the embodiment of the present invention, the information display method may be applied to a mobile terminal having at least one display screen, for example: a Mobile phone, a Tablet Personal Computer (Tablet Personal Computer), a Laptop Computer (Laptop Computer), a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), a Wearable Device (Wearable Device), or the like.
The information display method of the embodiment of the present invention includes: collecting a first scene feature of a first object; obtaining, from a pre-stored database and based on the first scene feature, a target object matching the first object, and obtaining a target video image containing the target object; and displaying the target video image on at least one display screen. In this way, the user does not need to operate on the screen of the mobile terminal; instead, the user can make an interactive action toward the mobile terminal, and the mobile terminal displays a video image matching that action, so the interaction mode is flexible.
Referring to fig. 2, the present embodiment is mainly different from the above-described embodiments in that a target video image is determined based on a first scene characteristic and a second scene characteristic. Fig. 2 is a flowchart of an information display method according to an embodiment of the present invention, and as shown in fig. 2, the method includes the following steps:
step 201, a first scene feature of a first object is acquired.
The implementation manner of this step may refer to the description in step 101, and is not described herein again to avoid repetition.
Step 202, collecting a second scene characteristic of the first object.
The second scene feature may be a feature associated with the first object, such as an action performed by the first object or the environment in which it is located. For example, when the first object is a hand making a gesture, the second scene feature may be the moving direction of the hand; when the first object is the lips, the second scene feature may be a smoke ring blown from the lips.
The mobile terminal may acquire the second scene feature of the first object through a distance sensor, a sound sensor, or the like.
Optionally, the second scene feature includes the moving direction of the first object. When the movement of the first object includes a component perpendicular to any display screen of the mobile terminal, the size variation trend of the target object in the target video image is positively or negatively correlated with the variation trend of the distance from the first object to the mobile terminal.
In this embodiment, the first object may gradually approach or move away from a display screen of the mobile terminal, which may be any display screen of the terminal. When approaching or receding, the first object may move along the direction perpendicular to the display screen, or its movement may merely have a component in that direction. The mobile terminal may detect the approach or departure of the first object through the distance sensor to determine its moving direction, and thereby determine the size variation trend of the target object.
A positive correlation between the size variation trend of the target object and the variation trend of the distance from the first object to the mobile terminal means that as the distance decreases, the target object shrinks, and as the distance increases, the target object grows. A negative correlation means the opposite: as the distance decreases, the target object grows, and as the distance increases, it shrinks. As the target object grows or shrinks, it presents a three-dimensional effect of approaching or receding from the mobile terminal, restoring the real scene.
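The two correlations can be sketched as a simple size mapping. The linear scaling and the 300 mm reference distance are illustrative assumptions; the patent only requires that the displayed size track the distance monotonically, in the same or the opposite direction.

```python
def target_size(distance_mm, base_size=100.0, correlation="positive",
                reference_mm=300.0):
    """Map the object-to-screen distance to a displayed target size.

    correlation="positive": size shrinks as the object approaches.
    correlation="negative": size grows as the object approaches.
    """
    scale = distance_mm / reference_mm
    if correlation == "negative":
        scale = 1.0 / scale  # invert the trend
    return base_size * scale

# the hand approaches the screen: distance goes from 300 mm to 150 mm
far_pos, near_pos = target_size(300), target_size(150)
far_neg = target_size(300, correlation="negative")
near_neg = target_size(150, correlation="negative")
```

With the positive correlation the size halves as the hand halves its distance; with the negative correlation it doubles, giving the impression of the object coming toward the viewer.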
For example, when the user's hand moves toward the display screen along the direction perpendicular to it, the mobile terminal determines a heart-shaped image matching the gesture features and, according to the moving direction of the hand, selects a video image in which the heart-shaped image shrinks from large to small.
It should be noted that the above is merely illustrative. When the mobile terminal includes multiple display screens, different screens may display the same or different video content (that is, the size variation trends of the target object on different screens may be the same or different), or the content may be displayed on a single screen only.
In this embodiment, the user can determine the size variation trend of the target object by changing the distance between himself and the screen of the mobile terminal, realizing interaction with the terminal. The interaction modes are flexible and diverse, and the target image can present a multi-dimensional stereoscopic effect.
Step 203, based on the first scene feature, obtaining the object to be selected matched with the first object from a pre-stored database.
The database may include video images for presentation pre-stored by the mobile terminal.
In this step, the mobile terminal compares the first scene feature with the features of objects in the pre-stored video images and determines the object to be selected whose features match the first object, that is, whose feature similarity to the first scene feature of the first object is greater than a preset value.
Step 204, acquiring a target video image matched with the second scene feature from the video images containing the object to be selected.
After the object to be selected is determined, this step further identifies the features of the video images containing the object to be selected, so as to find, among them, the target video image whose features have a similarity to the second scene feature greater than a preset value; that target video image is the video image matching the second scene feature.
For example, if the first object is the lips, the first scene feature is a feature of the lips, and the second scene feature is a smoke ring blown from the mouth, the mobile terminal first obtains, from the pre-stored database, the video images matching the lip features, and then further selects, among those, the video images containing a smoke ring.
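Steps 203 and 204 amount to a two-stage filter: first select candidate objects by the first scene feature, then narrow the candidate videos by the second scene feature. A minimal sketch using string tags in place of real feature matching; all tags and database entries are hypothetical.

```python
def select_candidates(first_tag, database):
    # step 203: keep entries whose object matches the first scene feature
    return [e for e in database if e["object_tag"] == first_tag]

def select_target_videos(candidates, second_tag):
    # step 204: among videos containing a candidate object, keep those
    # whose content also matches the second scene feature
    return [e["video"] for e in candidates if second_tag in e["video_tags"]]

# hypothetical pre-stored database
database = [
    {"object_tag": "lips", "video_tags": {"smoke_ring"}, "video": "smoke.mp4"},
    {"object_tag": "lips", "video_tags": {"smile"},      "video": "smile.mp4"},
    {"object_tag": "hand", "video_tags": {"heart"},      "video": "heart.mp4"},
]
videos = select_target_videos(select_candidates("lips", database), "smoke_ring")
```

For the lips-and-smoke-ring example, only the video that both contains lips and shows a smoke ring survives the two filters.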
Step 205, displaying the target video image on the at least one display screen.
In this step, the target video image is displayed on any one display screen, or on all display screens, of the mobile terminal. For example, the mobile terminal detects that the user makes a "heart" gesture (the first scene feature) and determines, based on the gesture features, that the matching image is a heart-shaped image; further, the mobile terminal detects that the user's hand approaches the screen from far to near (the second scene feature), determines a video image in which the heart-shaped image grows from small to large, and displays that video image on the screen.
Because the target video image matches both the first scene feature and the second scene feature, it can simulate the first object, thereby realizing interaction with the mobile terminal.
Optionally, the mobile terminal includes a first display screen and a second display screen, and the target video image includes a first video image and a second video image; the step of displaying the target video image on the at least one display screen comprises: displaying the first video image on the first display screen, wherein the size change trend of a target object in the first video image is positively correlated with the distance change trend from the first object to the mobile terminal; after the step of displaying the first video image on the first display screen, the method further comprises: displaying the second video image on the second display screen, wherein the size change trend of the target object in the second video image is opposite to the size change trend of the target object in the first video image.
The size variation trends of the target object in the first video image and the second video image may be opposite. When the mobile terminal includes a first display screen and a second display screen, the first video image is displayed on the first display screen, where the size of the target object decreases as the distance from the first object to the mobile terminal decreases and increases as that distance increases; that is, the size variation of the target object is positively correlated with the variation of the distance from the first object to the mobile terminal.
When the target object grows or shrinks to a preset value, or when the magnitude of its growth or shrinkage reaches a preset value, the mobile terminal may display the second video image on the second display screen, where the size variation trend of the target object in the second video image is opposite to that in the first video image.
In this way, the target object exhibits a large-to-small trend in the first video image and a small-to-large trend in the second, which can present the effect of the target object passing from the first display screen through to the second, or, on the second display screen, the effect of the target object gradually receding as the first object moves away. The first display screen and the second display screen may be any display screens of the mobile terminal.
For example, the user makes a "heart" gesture on the side of the first display screen and gradually approaches it, and the mobile terminal detects the gesture features and the moving direction. As the gesture approaches, the heart-shaped image displayed on the first display screen shrinks from large to small, fades from dark to light, and may further lose definition, so that the first display screen presents the effect of the heart-shaped image receding from near to far. The second display screen then displays the heart-shaped image growing from small to large, deepening from light to dark, and gaining definition, so that the second display screen presents the effect of the heart-shaped image approaching from far to near. By gradually changing these attribute features, the heart-shaped image fades out of the first display screen and slowly emerges on the second, realizing a three-dimensional transition effect.
In the embodiment, the effect of simulating a real scene is achieved by changing the characteristics of the target object, the three-dimensional and multi-dimensional feeling is provided for a user, and the interestingness of the interaction mode is enhanced. This embodiment can also be applied to the corresponding embodiment of fig. 1 and achieve the same advantageous effects.
Optionally, the moving direction of the first object is a direction toward the mobile terminal; the step of displaying the second video image on the second display screen includes: when the image size of the target object in the first display screen is smaller than a preset value, displaying the second video image on the second display screen; and the initial size of the target object on the second display screen is the preset value.
When the target object on the first display screen shrinks to the preset value, the target object on the second display screen starts growing from that preset value; that is, the size at which the target object is initially displayed on the second display screen is the preset value. In this way, the target object appears to fade from the first display screen into the second, improving the simulation of a real scene and presenting a three-dimensional, multi-dimensional effect.
For example, the user blows out a smoke ring on the side of the first display screen; the first display screen displays the smoke ring, which gradually shrinks. When it shrinks to the preset value, the second display screen displays the smoke ring with an initial size equal to the preset value, after which the smoke ring gradually grows. The smoke ring thus appears to merge into the first display screen and emerge from the second, enhancing the interest of the interaction.
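The handoff between screens can be sketched as a small state machine. This is a sketch under stated assumptions: `handoff_size` stands in for the patent's preset value, and the mirrored growth formula for the second screen is an illustrative choice, not the patent's.

```python
class CrossScreenDisplay:
    """Route a shrinking target object from the first screen to the second."""

    def __init__(self, handoff_size=20.0):
        self.handoff = handoff_size  # the "preset value"
        self.screen = "first"

    def update(self, size):
        # `size` is what the first screen would show as the object approaches
        if self.screen == "first" and size <= self.handoff:
            self.screen = "second"  # handoff: second screen takes over
        if self.screen == "first":
            return ("first", size)  # still shrinking toward the preset value
        # second screen starts at exactly the preset value, then grows
        return ("second", 2 * self.handoff - size)

d = CrossScreenDisplay(handoff_size=20.0)
frames = [d.update(s) for s in (50.0, 30.0, 20.0, 10.0)]
```

When the first-screen size reaches the preset value of 20, the second screen begins displaying the object at exactly that size and grows it as the shrinkage continues, matching the "fade through the screen" effect described above.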
Optionally, after the step of displaying the target video image on the at least one display screen, the method further includes: and outputting preset five-sense characteristic information matched with the first scene characteristic and/or the second scene characteristic.
In this embodiment, to further improve the realism of the scene simulation, five-sense feature information is additionally output after the target video image is displayed. The five-sense feature information may be information for presenting a multi-dimensional effect and may be preset according to the attribute features of the first object; for example, the mobile terminal may output a smell, a puff of air, a vibration, and the like. The five-sense feature information may match the first scene feature of the first object, the second scene feature, or both.
For example, the user makes a "heart" gesture on the side of the first display screen and moves it closer; the mobile terminal displays a heart-shaped image on the first display screen that shrinks from large to small while its color fades from dark to light, and then displays it on the second display screen growing from small to large while its color deepens from light to dark. Meanwhile, the mobile terminal vibrates at a frequency resembling a heartbeat, so that by touching the screen the user can feel the effect of touching a beating heart; further, the sound of a heartbeat may be output.
In the smoke-ring scene described above, after the smoke ring is displayed on the screen, the smell of smoke may further be output, or air may be blown out, with the strength of the airflow matching the force of the user's blowing, thereby simulating the real scene and presenting a multi-dimensional stereoscopic effect and visual impact.
In the embodiment, by outputting the five-sense characteristic information, the real effect of the scene is further improved, the reality sense of the multi-dimensional stereo is improved, and a more interesting interaction mode is presented.
It is to be noted that this embodiment can be applied to any of the above embodiments and achieves the same advantageous effects.
To facilitate understanding of this scheme, a specific example is given below.
Three databases are preset on the mobile terminal: database A, a scene feature library, which assigns scene feature tags to the behavior of the terminal user; database B, an interactive action library, which matches the feature tags in the scene feature library to the corresponding interactive actions in the enhanced display simulation process; and database C, a five-sense effect library, which matches the final effect content to the scene feature library and the interactive action library.
The following is a specific implementation process, and as shown in fig. 3, the information display method includes:
and 301, detecting the label in real time by the terminal.
The terminal detects information such as current environment, user biological characteristic information, user real-time action and the like in real time through a sensor, and the information represents various labels used as scene trigger conditions.
Step 302, trigger scenario.
Using the obtained tags, database a is searched to determine scene features matching the tags obtained in step 301.
Step 303, interactive action matching.
The tags are matched to the corresponding interactive actions in database B.
Step 304, acquiring the five-sense features matched with the scene features and the interactive actions from database C, and displaying the corresponding effects.
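The pipeline of steps 301 to 304 can be sketched as three chained lookups over databases A, B, and C. All database contents and tag names here are hypothetical placeholders for the libraries the patent describes.

```python
# Hypothetical contents for the three preset databases.
DB_A = {"heart_gesture": "heart_scene"}  # detected tag -> scene feature
DB_B = {"heart_scene": "heart_zoom"}     # scene feature -> interactive action
DB_C = {                                 # (scene, action) -> five-sense effect
    ("heart_scene", "heart_zoom"): {"vibration": "heartbeat",
                                    "sound": "heartbeat.wav"},
}

def display_pipeline(detected_tag):
    scene = DB_A.get(detected_tag)      # step 302: trigger the scene
    if scene is None:
        return None                     # no scene matched; nothing to display
    action = DB_B.get(scene)            # step 303: match the interactive action
    effect = DB_C.get((scene, action))  # step 304: look up the five-sense effect
    return {"scene": scene, "action": action, "effect": effect}

result = display_pipeline("heart_gesture")
```

A tag detected in step 301 (here "heart_gesture") flows through the three libraries and yields the scene, action, and effect to render; an unrecognized tag simply triggers nothing.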
According to the information display method, the first scene characteristic and the second scene characteristic of the first object are collected, and the target video image matched with the scene characteristic of the first object is displayed, so that a real scene is simulated, and a multi-dimensional stereo effect is presented.
Referring to fig. 4, fig. 4 is a structural diagram of a mobile terminal according to an embodiment of the present invention, and as shown in fig. 4, the mobile terminal 400 includes: a first acquisition module 401, an acquisition module 402 and a display module 403.
A first acquiring module 401, configured to acquire a first scene feature of a first object;
an obtaining module 402, configured to obtain, based on the first scene feature, a target object matching the first object from a pre-stored database, and obtain a target video image including the target object;
a display module 403, configured to display the target video image on the at least one display screen.
Optionally, as shown in fig. 5, the mobile terminal further includes:
a second acquisition module 404, configured to acquire a second scene characteristic of the first object;
the obtaining module 402 includes:
the first obtaining sub-module 4021 is configured to obtain an object to be selected, which is matched with the first object, from a pre-stored database based on the first scene feature;
the second obtaining sub-module 4022 is configured to obtain, from the video image including the object to be selected, a target video image matched with the second scene feature.
Optionally, the second scene characteristic includes: a direction of movement of the first object;
when the movement of the first object includes a component perpendicular to any display screen of the mobile terminal, the size variation trend of the target object in the target video image is positively or negatively correlated with the variation trend of the distance from the first object to the mobile terminal.
Optionally, as shown in fig. 6, the mobile terminal includes a first display screen and a second display screen, and the target video image includes a first video image and a second video image;
the display module 403 includes:
a first display sub-module 4031 configured to display the first video image on the first display screen, where a size variation trend of a target object in the first video image is positively correlated to a distance variation trend from the first object to the mobile terminal;
a second display sub-module 4032 configured to display the second video image on the second display screen, where a trend of a size change of a target object in the second video image is opposite to a trend of a size change of a target object in the first video image.
Optionally, the moving direction of the first object is a direction toward the mobile terminal;
the second display sub-module 4032 is specifically configured to, when the image size of the target object in the first display screen is smaller than a preset value, display the second video image on the second display screen;
and the initial size of the target object on the second display screen is the preset value.
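The hand-off between the two screens can be sketched as follows. This is a minimal illustration assuming sizes in pixels and a hypothetical preset threshold; the names (`route_frame`, `PRESET_SIZE_PX`) do not come from the patent.

```python
PRESET_SIZE_PX = 80.0  # hypothetical stand-in for the "preset value" above

def route_frame(first_screen_size_px):
    """Decide which screen shows the target while the first object approaches.

    Once the target shrinks below the preset value on the first screen, the
    second screen takes over, starting the target at exactly the preset size.
    """
    if first_screen_size_px >= PRESET_SIZE_PX:
        return ("first_screen", first_screen_size_px)
    return ("second_screen", PRESET_SIZE_PX)  # initial size on the second screen

# The target shrinks on the first screen, then hands off to the second screen.
frames = [route_frame(s) for s in (200.0, 120.0, 79.0)]
```

Starting the second screen at exactly the preset value keeps the apparent size continuous at the moment of hand-off.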
Optionally, as shown in fig. 7, the mobile terminal further includes:
an output module 405, configured to output preset five-sense feature information matched with the first scene feature and/or the second scene feature.
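One way the output module 405 might map scene features to preset five-sense feature information is sketched below. The channels and values (audio file, vibration duration) are purely illustrative assumptions; the patent only specifies that matched preset information is output.

```python
# Hypothetical presets keyed by (first scene feature, second scene feature);
# a key with None as the second element matches on the first feature alone.
SENSORY_PRESETS = {
    ("hand_wave", "approach"): {"audio": "bark.wav", "vibration_ms": 120},
    ("hand_wave", None): {"audio": "chime.wav"},
}

def output_sensory_info(first_feature, second_feature=None):
    """Return the preset matched to both features, else fall back to the first only."""
    preset = SENSORY_PRESETS.get((first_feature, second_feature))
    if preset is None:
        preset = SENSORY_PRESETS.get((first_feature, None), {})
    return preset

cue = output_sensory_info("hand_wave", "approach")
# cue is {"audio": "bark.wav", "vibration_ms": 120}
```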
The mobile terminal 400 can implement each process implemented by the mobile terminal in the above method embodiments, and is not described here again to avoid repetition.
According to the mobile terminal 400 of this embodiment of the present invention, the user need not operate on the screen of the mobile terminal; instead, the user can make an interactive action toward the mobile terminal, and the mobile terminal displays a video image matched with the user according to that interactive action, so the interaction mode is flexible.
Fig. 8 is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, where the mobile terminal 800 includes, but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811. Those skilled in the art will appreciate that the mobile terminal architecture illustrated in fig. 8 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted mobile terminal, a wearable device, a pedometer, and the like.
The processor 810 is configured to: acquire a first scene characteristic of a first object; acquire, based on the first scene characteristic, a target object matched with the first object from a pre-stored database, and acquire a target video image containing the target object; and display the target video image on the at least one display screen. In this way, the user need not operate on the screen of the mobile terminal; instead, the user can make an interactive action toward the mobile terminal, and the mobile terminal displays a video image matched with the user according to that interactive action, so the interaction mode is flexible.
Optionally, after the processor 810 performs the step of acquiring the first scene characteristic of the first object, the method further includes: acquiring a second scene feature of the first object; the step of acquiring a target object matched with the first object from a pre-stored database based on the first scene feature and acquiring a target video image containing the target object comprises the following steps: acquiring an object to be selected matched with the first object from a pre-stored database based on the first scene characteristic; and acquiring a target video image matched with the second scene characteristic from the video image containing the object to be selected.
Optionally, the second scene characteristic includes: a direction of movement of the first object; when the movement of the first object includes a distance component perpendicular to any display screen of the mobile terminal, the size variation trend of the target object in the target video image is positively or negatively correlated with the variation trend of the distance from the first object to the mobile terminal.
Optionally, the mobile terminal includes a first display screen and a second display screen, and the target video image includes a first video image and a second video image; the processor 810 performs the step of displaying the target video image on the at least one display screen, including: displaying the first video image on the first display screen, wherein the size change trend of a target object in the first video image is positively correlated with the distance change trend from the first object to the mobile terminal; after the step of displaying the first video image on the first display screen, the method further comprises: displaying the second video image on the second display screen, wherein the size change trend of the target object in the second video image is opposite to the size change trend of the target object in the first video image.
Optionally, the moving direction of the first object is a direction toward the mobile terminal; processor 810 performs the step of displaying the second video image on the second display screen, including: when the image size of the target object in the first display screen is smaller than a preset value, displaying the second video image on the second display screen; and the initial size of the target object on the second display screen is the preset value.
Optionally, the processor 810 is further configured to output preset five-sense feature information matched with the first scene feature and/or the second scene feature.
It should be understood that, in this embodiment of the present invention, the radio frequency unit 801 may be used to receive and send signals during message transmission and reception or during a call. Specifically, it receives downlink data from a base station and forwards the data to the processor 810 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. Furthermore, the radio frequency unit 801 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband Internet access through the network module 802, for example helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802 or stored in the memory 809 into an audio signal and output as sound. Also, the audio output unit 803 may also provide audio output related to a specific function performed by the mobile terminal 800 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is used to receive an audio or video signal. The input unit 804 may include a graphics processing unit (GPU) 8041 and a microphone 8042. The graphics processor 8041 processes image data of still pictures or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or another storage medium) or transmitted via the radio frequency unit 801 or the network module 802. The microphone 8042 can receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 801.
The mobile terminal 800 also includes at least one sensor 805, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 8061 according to the brightness of ambient light, and a proximity sensor, which can turn off the display panel 8061 and/or the backlight when the mobile terminal 800 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the mobile terminal (for horizontal/vertical screen switching, related games, and magnetometer posture calibration) and for vibration-recognition functions (such as a pedometer or tapping). The sensors 805 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail here.
The display unit 806 is used to display information input by the user or information provided to the user. The Display unit 806 may include a Display panel 8061, and the Display panel 8061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 807 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, can collect touch operations performed by a user on or near it (for example, operations using a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 810, and receives and executes commands from the processor 810. The touch panel 8071 can be implemented as a resistive, capacitive, infrared, or surface-acoustic-wave panel. In addition to the touch panel 8071, the user input unit 807 can include other input devices 8072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick; these are not described in detail here.
Further, the touch panel 8071 can be overlaid on the display panel 8061, and when the touch panel 8071 detects a touch operation on or near the touch panel 8071, the touch operation is transmitted to the processor 810 to determine the type of the touch event, and then the processor 810 provides a corresponding visual output on the display panel 8061 according to the type of the touch event. Although in fig. 8, the touch panel 8071 and the display panel 8061 are two independent components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 808 is an interface through which an external device is connected to the mobile terminal 800. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 808 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 800 or may be used to transmit data between the mobile terminal 800 and external devices.
The memory 809 may be used to store software programs as well as various data. The memory 809 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data or a phonebook). Furthermore, the memory 809 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 810 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal using various interfaces and lines, and it performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 809 and calling the data stored in the memory 809, thereby monitoring the mobile terminal as a whole. The processor 810 may include one or more processing units; preferably, the processor 810 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 810.
The mobile terminal 800 may also include a power supply 811 (e.g., a battery) for powering the various components, and the power supply 811 may be logically coupled to the processor 810 via a power management system that may be used to manage charging, discharging, and power consumption.
In addition, the mobile terminal 800 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, including a processor 810, a memory 809, and a computer program stored in the memory 809 and executable on the processor 810. When executed by the processor 810, the computer program implements each process of the above information display method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program. When executed by a processor, the computer program implements each process of the above information display method embodiment and can achieve the same technical effect; to avoid repetition, details are not described here again. The computer-readable storage medium may be, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," and any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware; in many cases, however, the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes instructions for enabling a mobile terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to those embodiments, which are illustrative rather than restrictive; it will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (9)

1. An information display method applied to a mobile terminal having at least one display screen, comprising:
acquiring a first scene feature of a first object;
acquiring a target object matched with the first object from a pre-stored database based on the first scene characteristic, and acquiring a target video image containing the target object;
displaying the target video image on the at least one display screen;
after the step of acquiring a first scene feature of a first object, the method further comprises:
acquiring a second scene feature of the first object;
the step of acquiring a target object matched with the first object from a pre-stored database based on the first scene feature and acquiring a target video image containing the target object comprises the following steps:
acquiring an object to be selected matched with the first object from a pre-stored database based on the first scene characteristic;
and acquiring a target video image matched with the second scene characteristic from the video image containing the object to be selected.
2. The information display method according to claim 1, wherein the second scene feature includes: a direction of movement of the first object;
when the movement of the first object includes a distance component perpendicular to any display screen of the mobile terminal, the size variation trend of the target object in the target video image is positively or negatively correlated with the variation trend of the distance from the first object to the mobile terminal.
3. The information display method according to claim 2, wherein the mobile terminal includes a first display screen and a second display screen, and the target video image includes a first video image and a second video image;
the step of displaying the target video image on the at least one display screen comprises:
displaying the first video image on the first display screen, wherein the size change trend of a target object in the first video image is positively correlated with the distance change trend from the first object to the mobile terminal;
after the step of displaying the first video image on the first display screen, the method further comprises:
displaying the second video image on the second display screen, wherein the size change trend of the target object in the second video image is opposite to the size change trend of the target object in the first video image.
4. The information display method according to claim 3, wherein the moving direction of the first object is a direction toward the mobile terminal;
the step of displaying the second video image on the second display screen includes:
when the image size of the target object in the first display screen is smaller than a preset value, displaying the second video image on the second display screen;
and the initial size of the target object on the second display screen is the preset value.
5. A mobile terminal having at least one display screen, comprising:
the first acquisition module is used for acquiring first scene characteristics of a first object;
the acquisition module is used for acquiring a target object matched with the first object from a pre-stored database based on the first scene characteristic and acquiring a target video image containing the target object;
a display module for displaying the target video image on the at least one display screen;
a second acquisition module for acquiring a second scene characteristic of the first object;
the acquisition module includes:
the first obtaining submodule is used for obtaining an object to be selected matched with the first object from a pre-stored database based on the first scene characteristic;
and the second obtaining submodule is used for obtaining a target video image matched with the second scene characteristic in the video image containing the object to be selected.
6. The mobile terminal of claim 5, wherein the second scene characteristic comprises: a direction of movement of the first object;
when the movement of the first object includes a distance component perpendicular to any display screen of the mobile terminal, the size variation trend of the target object in the target video image is positively or negatively correlated with the variation trend of the distance from the first object to the mobile terminal.
7. The mobile terminal of claim 6, wherein the mobile terminal comprises a first display screen and a second display screen, and wherein the target video image comprises a first video image and a second video image;
the display module includes:
the first display sub-module is used for displaying the first video image on the first display screen, wherein the size change trend of a target object in the first video image is positively correlated with the distance change trend from the first object to the mobile terminal;
and the second display sub-module is used for displaying the second video image on the second display screen, wherein the size change trend of the target object in the second video image is opposite to the size change trend of the target object in the first video image.
8. The mobile terminal of claim 7, wherein the moving direction of the first object is a direction toward the mobile terminal;
the second display sub-module is specifically configured to display the second video image on the second display screen when the image size of the target object in the first display screen is smaller than a preset value;
and the initial size of the target object on the second display screen is the preset value.
9. A mobile terminal, comprising: memory, processor and computer program stored on the memory and executable on the processor, which when executed by the processor implements the steps in the information display method according to any one of claims 1 to 4.
CN201810542452.0A 2018-05-30 2018-05-30 Information display method and mobile terminal Active CN108763514B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810542452.0A CN108763514B (en) 2018-05-30 2018-05-30 Information display method and mobile terminal


Publications (2)

Publication Number Publication Date
CN108763514A CN108763514A (en) 2018-11-06
CN108763514B true CN108763514B (en) 2021-01-26

Family

ID=64004417

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810542452.0A Active CN108763514B (en) 2018-05-30 2018-05-30 Information display method and mobile terminal

Country Status (1)

Country Link
CN (1) CN108763514B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110609921B (en) * 2019-08-30 2022-08-19 联想(北京)有限公司 Information processing method and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104898972A (en) * 2015-05-19 2015-09-09 青岛海信移动通信技术股份有限公司 Method and equipment for regulating electronic image
CN105334962A (en) * 2015-11-02 2016-02-17 深圳奥比中光科技有限公司 Method and system for zooming screen image by gesture
CN106502515A (en) * 2016-09-30 2017-03-15 维沃移动通信有限公司 A kind of picture input method and mobile terminal
CN107885317A (en) * 2016-09-29 2018-04-06 阿里巴巴集团控股有限公司 A kind of exchange method and device based on gesture

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9111135B2 (en) * 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant