CN107370660B - Information perception method and mobile terminal - Google Patents


Publication number
CN107370660B
CN107370660B CN201710423564.XA
Authority
CN
China
Prior art keywords
information
touch
motor
mobile terminal
display
Prior art date
Legal status
Active
Application number
CN201710423564.XA
Other languages
Chinese (zh)
Other versions
CN107370660A (en)
Inventor
韩旭
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201710423564.XA priority Critical patent/CN107370660B/en
Publication of CN107370660A publication Critical patent/CN107370660A/en
Application granted granted Critical
Publication of CN107370660B publication Critical patent/CN107370660B/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 51/00: User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L 51/04: Real-time or near real-time messaging, e.g. instant messaging [IM]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/0416: Control or interface arrangements specially adapted for digitisers
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 51/00: User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L 51/07: User-to-user messaging in packet-switching networks, characterised by the inclusion of specific contents
    • H04L 51/10: Multimedia information

Abstract

The embodiments of the invention provide an information perception method and a mobile terminal, and relate to the field of communications. First information sent by a sending end is received, the first information comprising one or more of touch information, text information and odor information, and a simulated display is performed according to the first information. On the basis of existing visual and auditory communication, tactile and olfactory information is added: a simulated display is performed according to the touch information and odor information sent by the sending end, so that the touch of the other party's limbs and the odor of the other party's environment can be perceived, which improves the realism of the exchange and enriches the user's communication experience. Meanwhile, a simulated display is performed according to the text information sent by the sending end, so that a blind user can perceive the text sent by the other party when voice is inconvenient and can therefore communicate effectively.

Description

Information perception method and mobile terminal
Technical Field
The embodiment of the invention relates to the field of communication, in particular to an information perception method and a mobile terminal.
Background
With the rapid development of computer and communication technology, life has become more modern and comfortable; when people cannot communicate face to face, they can chat through video or text.
Current video chat technology enables a video call between two parties, transmitting each party's video and audio so that they can communicate visually and aurally; blind users, however, can communicate only by voice.
In applying the prior art, the inventor found that current video chat technology cannot transmit touch or odor information, so the touch of the other party's limbs and the odor of the other party's environment cannot be perceived, which reduces the realism of the exchange; furthermore, when voice is inconvenient, a blind user cannot perceive text sent by the other party and thus cannot communicate effectively.
Disclosure of Invention
The embodiments of the invention provide an information perception method and a mobile terminal, aiming to solve the problems that the prior art cannot transmit touch and odor information and that a blind user cannot perceive text sent by the other party.
In a first aspect, an embodiment of the present invention provides an information perception method applied to a mobile terminal, the method comprising:
receiving first information sent by a sending end, the first information comprising one or more of touch information, text information and odor information; and
performing a simulated display according to the first information.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, comprising:
a receiving module, configured to receive first information sent by a sending end, the first information comprising one or more of touch information, text information and odor information; and
a display module, configured to perform a simulated display according to the first information.
In this way, in the embodiments of the present invention, first information sent by the sending end is received, the first information comprising one or more of touch information, text information and odor information, and a simulated display is performed according to the first information. On the basis of existing visual and auditory communication, tactile and olfactory information is added: a simulated display is performed according to the touch information and odor information sent by the sending end, so that the touch of the other party's limbs and the odor of the other party's environment can be perceived, which improves the realism of the exchange and enriches the user's communication experience. Meanwhile, a simulated display is performed according to the text information sent by the sending end, so that a blind user can perceive the text sent by the other party when voice is inconvenient and can therefore communicate effectively.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flow chart of a method of information awareness according to a first embodiment of the present invention;
FIG. 2 is a flow chart of a method of information awareness according to a second embodiment of the present invention;
FIG. 3 is a flow chart of a method of information awareness according to a third embodiment of the present invention;
FIG. 4 is a flow chart of a method of information awareness according to a fourth embodiment of the present invention;
fig. 5 is a first block diagram of a mobile terminal according to a fifth embodiment of the present invention;
fig. 6 is a second block diagram of a mobile terminal according to the fifth embodiment of the present invention;
fig. 7 is a third block diagram of a mobile terminal according to the fifth embodiment of the present invention;
fig. 8 is a fourth block diagram of a mobile terminal according to the fifth embodiment of the present invention;
fig. 9 is a block diagram showing a mobile terminal according to a sixth embodiment of the present invention;
fig. 10 shows a schematic structural diagram of a mobile terminal according to a seventh embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Referring to fig. 1, a flowchart of an information sensing method according to a first embodiment of the present invention is shown, which may specifically include the following steps:
Step 101, receiving first information sent by a sending end; the first information comprises one or more of touch information, text information and odor information.
In the embodiment of the invention, the sending-end user operates the mobile terminal of the sending end; the sending end's mobile terminal acquires the user's first information and sends it to the mobile terminal of the receiving end, which receives it. The first information comprises one or more of touch information, text information and odor information.
For example, during a video call between two users: the sending-end user performs a touch operation on the sending end's mobile terminal, which acquires the touch information of that operation and sends it to the receiving end's mobile terminal, which receives it; the sending-end user operates the sending end's mobile terminal so that it acquires odor information of the user's environment and sends it to the receiving end's mobile terminal, which receives it; or the sending-end user writes text on the sending end's mobile terminal, which acquires the text information of that writing operation and sends it to the receiving end's mobile terminal, which receives it.
Step 102, performing a simulated display according to the first information.
In the embodiment of the invention, after receiving the first information sent by the mobile terminal of the sending end, the mobile terminal of the receiving end carries out simulation display according to the first information, so that a user of the receiving end can perceive the first information.
For example, after receiving the touch information, the receiving end's mobile terminal performs a simulated display according to the touch information to reproduce the sending-end user's touch operation, and the receiving-end user can sense the simulated touch operation by touching the receiving end's mobile terminal. After receiving the odor information, the receiving end's mobile terminal performs a simulated display according to the odor information to reproduce the odor of the sending-end user's environment, which the receiving-end user can then perceive. After receiving the text information, the receiving end's mobile terminal performs a simulated display according to the text information; the displayed text can be sensed directly by touch, so the receiving-end user can perceive the simulated text by touching the receiving end's mobile terminal.
In the embodiment of the invention, first information sent by the sending end is received, the first information comprising one or more of touch information, text information and odor information, and a simulated display is performed according to the first information. On the basis of existing visual and auditory communication, tactile and olfactory information is added: a simulated display is performed according to the touch information and odor information sent by the sending end, so that the touch of the other party's limbs and the odor of the other party's environment can be perceived, which improves the realism of the exchange and enriches the user's communication experience. Meanwhile, a simulated display is performed according to the text information sent by the sending end, so that a blind user can perceive the text sent by the other party when voice is inconvenient and can therefore communicate effectively.
Example two
Referring to fig. 2, a flowchart of an information sensing method according to a second embodiment of the present invention is shown, which may specifically include the following steps:
step 201, receiving first touch information and second touch information sent by a sending end; the first touch information comprises touch speed information collected by the touch screen, and the second touch information comprises touch position information and touch pressure information collected by the pressure sensor.
In the embodiment of the invention, the sending-end user performs a touch operation on the sending end's mobile terminal, which acquires the first and second touch information of the operation and sends them to the receiving end's mobile terminal, which receives them. The first touch information comprises touch speed information collected by the touch screen, and the second touch information comprises touch position information and touch pressure information collected by the pressure sensor; the touch speed information is calculated from the touch displacement information and the touch time information of the finger on the touch screen.
When the sending-end user performs a touch operation on the sending end's mobile terminal, the touch screen acquires the finger's touch displacement information and touch time information, from which the touch speed information is calculated. The sending end's mobile terminal is provided with a pressure sensor, which collects the touch position information and touch pressure information of the finger on the touch screen during the operation.
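As a minimal sketch of the speed calculation described above (all names are illustrative, not from the patent), the touch speed can be derived from successive (x, y, t) touch samples as displacement divided by elapsed time:

```python
import math

def touch_speed(samples):
    """Estimate touch speed from (x, y, t) samples collected by the
    touch screen: displacement between first and last sample divided
    by elapsed time. Returns distance units per second."""
    (x0, y0, t0), (x1, y1, t1) = samples[0], samples[-1]
    displacement = math.hypot(x1 - x0, y1 - y0)
    elapsed = t1 - t0
    if elapsed <= 0:
        return 0.0
    return displacement / elapsed

# A swipe of 300 px over 0.5 s gives 600 px/s.
print(touch_speed([(0, 0, 0.0), (300, 0, 0.5)]))  # 600.0
```

In practice the terminal would sample many intermediate points and could compute instantaneous speeds per segment; the end-to-end estimate above is the simplest reading of "displacement and time".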
Step 202, determining a corresponding sensing position according to the touch position information.
In the embodiment of the invention, after the receiving end's mobile terminal receives the first and second touch information sent by the sending end, the corresponding sensing position on the receiving end's mobile terminal is determined according to the touch position information.
Step 203, adjusting the amplitude and frequency of the motor according to the distance between the sensing position and the motor, the touch speed information and the touch pressure information.
In the embodiment of the invention, at least one motor is arranged in the receiving end's mobile terminal, and each motor's amplitude and frequency are determined, and then adjusted, according to the distance between the sensing position and that motor, the touch speed information and the touch pressure information. Because the sensing position lies at some distance from each motor, the amplitude perceived at the sensing position is attenuated; the amplitude and frequency of each motor must therefore be determined not only from the touch speed and touch pressure information but also, precisely, from the distance between the sensing position and that motor.
When there is one motor, its amplitude and frequency are determined directly from the distance between the sensing position and the motor, the touch speed information and the touch pressure information: the motor's frequency simulates the touch speed information, and its amplitude simulates the touch pressure information. The distance between the sensing position and the motor affects the perceived amplitude: the greater the distance, the greater the attenuation, and the smaller the distance, the smaller the attenuation. Therefore, to reproduce the touch operation accurately at the sensing position, after the amplitude and frequency are determined from the touch speed and pressure information, the motor's amplitude is increased according to the distance between the sensing position and the motor, compensating for the attenuation that distance causes.
When there is more than one motor, each motor's amplitude and frequency are determined from the distance between the sensing position and that motor, the touch speed information and the touch pressure information; the motors' frequencies are synthesized to simulate the touch speed information and their amplitudes to simulate the touch pressure information, and finally each motor's amplitude is adjusted according to its distance from the sensing position.
For example, when there is one motor, it may be placed at the center of the mobile terminal; when there are four motors, they may be placed at the four corners. The number and placement of the motors are not limited here.
Step 204, performing a simulated display through the motor at the sensing position.
In the embodiment of the invention, the amplitude and frequency of each motor are adjusted to simulate the touch speed and touch pressure information, so that the motors reproduce the sending-end user's touch operation at the sensing position; the receiving-end user can sense the simulated touch operation by touching the sensing position.
When there is one motor, its amplitude and frequency are adjusted directly, and the touch operation simulated by that single motor's amplitude and frequency is sensed at the sensing position. When there is more than one motor, the amplitude and frequency of each motor are adjusted and synthesized, and the touch operation simulated by the synthesized amplitudes and frequencies is sensed at the sensing position; the perceived amplitude is greatest at the sensing position.
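The steps above can be sketched as follows. This is a hypothetical illustration, not the patent's actual control law: the mapping coefficients (`k_freq`, `k_amp`) and the linear attenuation model are assumptions chosen only to show the structure of mapping speed to frequency, pressure to amplitude, and boosting amplitude with distance:

```python
import math

def motor_settings(sense_pos, motors, speed, pressure,
                   k_freq=1.0, k_amp=1.0, attenuation=0.01):
    """Sketch of steps 203-204: each motor's frequency simulates the
    touch speed and its amplitude simulates the touch pressure; the
    amplitude is then increased with the motor's distance from the
    sensing position to compensate for attenuation.

    Returns a (frequency, amplitude) pair per motor."""
    settings = []
    for mx, my in motors:
        distance = math.hypot(sense_pos[0] - mx, sense_pos[1] - my)
        frequency = k_freq * speed                         # speed -> frequency
        amplitude = k_amp * pressure * (1.0 + attenuation * distance)
        settings.append((frequency, amplitude))
    return settings

# One centered motor vs. four corner motors, as in the example above:
print(motor_settings((30, 60), [(35, 75)], speed=2.0, pressure=3.0))
print(motor_settings((30, 60), [(0, 0), (70, 0), (0, 150), (70, 150)],
                     speed=2.0, pressure=3.0))
```

With several motors, the farther corners receive larger amplitudes, so that the synthesized vibration peaks at the sensing position.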
In the embodiment of the invention, the first and second touch information sent by the sending end are received, the corresponding sensing position is determined according to the touch position information, the amplitude and frequency of the motor are adjusted according to the distance between the sensing position and the motor, the touch speed information and the touch pressure information, and a simulated display is performed through the motor at the sensing position. On the basis of existing visual and auditory communication, touch information is added and displayed in simulation through the motor according to the touch information sent by the sending end, so that the touch of the other party's limbs can be perceived, which improves the realism of the exchange and enriches the user's communication experience.
Example three
Referring to fig. 3, a flowchart of an information sensing method according to a third embodiment of the present invention is shown, which may specifically include the following steps:
Step 301, receiving the text information sent by the sending end.
In the embodiment of the invention, the sending-end user writes text on the sending end's mobile terminal, which acquires the text information of the writing operation and sends it to the receiving end's mobile terminal, which receives it.
Step 302, converting the text information into braille characters.
In the embodiment of the invention, after the receiving end's mobile terminal receives the text information, it converts the text information into braille characters. A braille character generally consists of six dots arranged in a left column and a right column of three dots each; the left column's dots are numbered 1, 2 and 3 from top to bottom, and the right column's dots are numbered 4, 5 and 6 from top to bottom.
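The conversion of step 302 can be illustrated with a small lookup table. The dot patterns below are standard six-dot braille for the letters shown (using the dot numbering just described); the table and function names are illustrative, and a real implementation would cover the full character set:

```python
# Dot numbers follow the convention above: left column 1-3 (top to
# bottom), right column 4-6. The patterns are standard braille.
BRAILLE_DOTS = {
    "a": {1},
    "b": {1, 2},
    "c": {1, 4},
    "d": {1, 4, 5},
}

def to_braille(text):
    """Convert text to a list of six-dot braille patterns (step 302)."""
    return [BRAILLE_DOTS[ch] for ch in text.lower() if ch in BRAILLE_DOTS]

print(to_braille("bc"))  # [{1, 2}, {1, 4}]
```

These two patterns for "b" and "c" match the worked example given later in this embodiment.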
Step 303, detecting a hand touch position of the user.
In the embodiment of the invention, the receiving end user places a hand on the touch screen, and the touch screen detects the hand touch position of the receiving end user.
The touch screen of the mobile terminal is provided with a plurality of touch sensors. When a hand is placed on the touch screen, some of these sensors are triggered, and the receiving-end user's hand touch position can be determined from the positions of the triggered sensors.
Step 304, determining a plurality of sensing positions corresponding to the hand touch position according to the hand touch position.
In the embodiment of the invention, the hand contour is determined from the receiving-end user's hand touch position, and a plurality of sensing positions corresponding to the hand touch position are determined within that contour. Generally there are six sensing positions; determining them within the hand contour allows the receiving-end user to perceive the braille characters completely, without missing any dot.
The six sensing positions are arranged in the same way as a braille character: a left column and a right column of three sensing positions each.
Step 305, sequentially displaying the braille characters at the plurality of sensing positions by adjusting the vibration mode of the motor according to a preset rule.
In the embodiment of the invention, for a single braille character, the vibration mode of the motor is adjusted dot by dot at the sensing positions, in top-to-bottom, left-to-right order, until the character has been displayed in full; during this, the motor vibrates at high frequency. When a braille character has been fully displayed, the motor is switched to low-frequency vibration to indicate its completion and to prompt the receiving-end user that the next braille character will follow.
The receiving end perceives the displayed text information at the touched sensing positions through the vibration of the motors.
So that the receiving-end user senses the correct sensing position, the amplitudes and frequencies of the motors are adjusted such that the synthesized amplitude is greatest at the intended sensing position, where the perceived amplitude is therefore largest; at positions other than that sensing position, the synthesized amplitude is small.
For example, for the character "b" the corresponding braille dots are 1 and 2. The motor is first driven at high frequency at the sensing position for dot 1, where the perceived amplitude is then greatest; next it is driven at high frequency at the sensing position for dot 2, where the perceived amplitude is then greatest; the motor is then switched to low-frequency vibration to prompt the receiving-end user that the next braille character follows. When the next character is "c", whose braille dots are 1 and 4, the motor is driven at high frequency at the sensing position for dot 1, where the perceived amplitude is then greatest, and then at high frequency at the sensing position for dot 4, where the perceived amplitude is then greatest.
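The display sequence of step 305 can be sketched as a schedule of (dot position, vibration mode) events. This is a hypothetical illustration: the "high"/"low" labels stand in for the high- and low-frequency vibration modes described above, and sorting by dot number realizes the top-to-bottom, left-to-right order (dots 1 to 6):

```python
HIGH, LOW = "high", "low"  # vibration modes

def vibration_schedule(braille_chars):
    """For each braille character (a set of raised dot numbers),
    vibrate at high frequency at each raised dot's sensing position
    in dot order, then emit one low-frequency pulse to mark the end
    of the character and announce the next one."""
    schedule = []
    for dots in braille_chars:
        for dot in sorted(dots):
            schedule.append((dot, HIGH))
        schedule.append((None, LOW))  # character separator
    return schedule

# "b" (dots 1, 2) then "c" (dots 1, 4), as in the example above:
print(vibration_schedule([{1, 2}, {1, 4}]))
# [(1, 'high'), (2, 'high'), (None, 'low'), (1, 'high'), (4, 'high'), (None, 'low')]
```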
In the embodiment of the invention, the text information sent by the sending end is received and converted into braille characters, the user's hand touch position is detected, a plurality of sensing positions corresponding to the hand touch position are determined, and the braille characters are sequentially displayed at those sensing positions by adjusting the vibration mode of the motor according to a preset rule. A simulated display is performed according to the text information sent by the sending end, so that a blind user can perceive the text sent by the other party when voice is inconvenient and can therefore communicate effectively.
Example four
Referring to fig. 4, a flowchart of an information sensing method according to a fourth embodiment of the present invention is shown, which may specifically include the following steps:
Step 401, receiving the odor information sent by the sending end.
In the embodiment of the invention, the sending end's mobile terminal is provided with an odor sensor. When the sending-end user operates the terminal, the odor sensor captures odor information of the user's environment; the terminal then encodes this information and sends it to the receiving end's mobile terminal, which receives it.
Specifically, the odor information of the sending-end user's environment is captured by the odor sensor and encoded into corresponding digital symbol information, which is sent over the network to the receiving end's mobile terminal; by receiving the digital symbol information, the receiving end's mobile terminal correspondingly receives the odor information sent by the sending end.
Step 402, decoding the odor information.
In the embodiment of the invention, after the receiving end's mobile terminal receives the odor information, a decoding operation is required to recover the odor information of the sending-end user's environment.
Specifically, what the receiving end's mobile terminal actually receives is digital symbol information containing the odor information; decoding this digital symbol information yields the odor information of the sending-end user's environment.
Step 403, performing simulation display on the decoded odor information through basic odor elements in the odor generator.
In the embodiment of the invention, the mobile terminal at the receiving end is provided with an odor generator, the odor generator comprises a plurality of basic odor elements, and different odor information can be synthesized by adjusting the types and proportions of the basic odor elements. The decoded odor information is displayed in a simulated manner through one or more basic odor elements in the odor generator, restoring the odor information of the environment where the sending-end user is located, so that the receiving-end user can perceive the restored odor information.
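Choosing the types and proportions of basic odor elements can be sketched as a simple normalization over the elements the generator actually has; the element names and the proportional-mixing rule are illustrative assumptions, not the patent's synthesis method:

```python
def mix_elements(target, available):
    """Return release proportions (summing to 1) for the basic odor
    elements needed to approximate the decoded target scent.

    target:    component -> decoded intensity in [0, 1]
    available: set of basic odor elements present in the generator
    """
    # Keep only components the generator can actually emit.
    usable = {k: v for k, v in target.items() if k in available and v > 0}
    total = sum(usable.values())
    if total == 0:
        return {}
    # Normalize so proportions sum to 1 for the release schedule.
    return {name: level / total for name, level in usable.items()}

recipe = mix_elements({"floral": 0.75, "citrus": 0.25, "musk": 0.0},
                      available={"floral", "citrus", "mint"})
print(recipe)  # {'floral': 0.75, 'citrus': 0.25}
```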
In the embodiment of the invention, the odor information sent by the sending end is received, the odor information is decoded, and the decoded odor information is displayed in a simulated manner through basic odor elements in the odor generator. Olfactory information is thus added on top of existing visual and auditory communication: a simulated display is performed according to the odor information sent by the sending end, so that the odor of the environment where the other party is located can be perceived, which improves the authenticity of communication and enriches the user's communication experience.
EXAMPLE five
Referring to fig. 5, a block diagram of a mobile terminal according to a fifth embodiment of the present invention is shown.
The mobile terminal 500 includes: a receiving module 501 and a display module 502.
A receiving module 501, configured to receive first information sent by a sending end; the first information comprises one or more of touch information, character information and smell information;
a display module 502, configured to perform a simulation display according to the first information.
Referring to fig. 6, on the basis of fig. 5, when the first information includes touch information, the receiving module 501 further includes: the touch information receiving submodule 5011 is configured to receive first touch information and second touch information sent by a sending end; the first touch information comprises touch speed information acquired by a touch screen, and the second touch information comprises touch position information and touch pressure information acquired by a pressure sensor; and calculating the touch speed information according to the touch displacement information and the touch time information of the finger on the touch screen. The display module 502 further comprises: the sensing position first determining submodule 5021 is used for determining a corresponding sensing position according to the touch position information; the motor adjusting submodule 5022 is used for adjusting the amplitude and the frequency of the motor according to the distance between the induction position and the motor, the touch speed information and the touch pressure information; the first display submodule 5023 is used for performing simulation display on the motor at the induction position.
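The rule implemented by the motor adjusting submodule 5022 — amplitude compensating for the distance between the sensing position and the motor and scaling with touch pressure, frequency tracking touch speed — might be sketched as follows. All scaling constants and clamping ranges here are assumptions for illustration, not values given in the patent:

```python
def adjust_motor(distance_mm, speed_mm_s, pressure_n):
    """Return (amplitude, frequency_hz) for the vibration motor."""
    # Drive harder for sensing positions farther from the motor, and
    # scale with touch pressure; clamp to the motor's full-scale drive.
    amplitude = min(1.0, 0.2 * pressure_n * (1.0 + distance_mm / 20.0))
    # Faster strokes map to higher vibration frequency, clamped to an
    # assumed usable band of 30-250 Hz.
    frequency = max(30.0, min(250.0, 50.0 + speed_mm_s))
    return amplitude, frequency

amp, freq = adjust_motor(distance_mm=20.0, speed_mm_s=100.0, pressure_n=2.0)
print(amp, freq)  # 0.8 150.0
```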
Referring to fig. 7, on the basis of fig. 5, when the first information includes text information, the presentation module 502 further includes: the conversion submodule 5024 is used for converting the text information into Braille characters; a touch position detection submodule 5025 for detecting a hand touch position of a user; the sensing position second determining submodule 5026 is used for determining a plurality of sensing positions corresponding to the hand touch position according to the hand touch position; and the second display submodule 5027 is used for sequentially displaying the Braille characters at the plurality of sensing positions according to a preset rule by adjusting the vibration mode of the motor.
Referring to fig. 8, on the basis of fig. 5, when the first information includes scent information, the display module 502 further includes: a decoding sub-module 5028 for decoding the scent information; a third displaying sub-module 5029, configured to display the decoded scent information in a simulated manner through basic scent elements in the scent generator.
In the embodiment of the invention, first information sent by a sending end is received, the first information comprising one or more of touch information, text information, and smell information, and a simulated display is performed according to the first information. Tactile and olfactory information is thus added on top of existing visual and auditory communication: a simulated display is performed according to the touch information and odor information sent by the sending end, so that the touch of the other party's limbs and the odor of the environment where the other party is located can be perceived, which improves the authenticity of communication and enriches the user's communication experience. Meanwhile, a simulated display is performed according to the text information sent by the sending end, so that a blind user can perceive the text information sent by the other party even when voice communication is inconvenient, and effective communication can be carried out.
EXAMPLE six
Referring to fig. 9, a block diagram of a mobile terminal according to a sixth embodiment of the present invention is shown.
The mobile terminal 600 of the embodiment of the present invention includes: at least one processor 601, memory 602, at least one network interface 604, and other user interfaces 603. The various components in the mobile terminal 600 are coupled together by a bus system 605. It is understood that the bus system 605 is used to enable communications among the components. The bus system 605 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 605 in fig. 9.
The user interface 603 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, trackball, touch pad, or touch screen).
It will be appreciated that the memory 602 in embodiments of the invention may be either volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable PROM (EPROM), an Electrically Erasable PROM (EEPROM), or a flash memory. The volatile memory may be a Random Access Memory (RAM), which acts as an external cache. By way of example, but not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 602 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 602 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof: an operating system 6021 and application programs 6022.
The operating system 6021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application program 6022 includes various application programs such as a Media Player (Media Player), a Browser (Browser), and the like, and is used to implement various application services. A program implementing the method of an embodiment of the invention can be included in the application program 6022.
In this embodiment of the present invention, the processor 601 is configured, by calling a program or an instruction stored in the memory 602 (specifically, a program or an instruction stored in the application program 6022), to receive the first information sent by the sending end, the first information comprising one or more of touch information, text information, and smell information, and to perform a simulated display according to the first information. The method disclosed in the above-mentioned embodiments of the present invention can be applied to the processor 601, or implemented by the processor 601. The processor 601 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 601. The processor 601 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or registers. The storage medium is located in the memory 602, and the processor 601 reads the information in the memory 602 and completes the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described in this disclosure may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described in this disclosure. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, when the first information includes touch information, the processor 601, when receiving the first information sent by the sending end, is further configured to: receiving first touch information and second touch information sent by a sending end; the first touch information comprises touch speed information acquired by a touch screen, and the second touch information comprises touch position information and touch pressure information acquired by a pressure sensor; and calculating the touch speed information according to the touch displacement information and the touch time information of the finger on the touch screen.
Optionally, when performing the simulated display according to the first information, the processor 601 is further configured to: determining a corresponding sensing position according to the touch position information; adjusting the amplitude and frequency of the motor according to the distance between the induction position and the motor, the touch speed information and the touch pressure information; and in the induction position, performing simulation display through the motor.
Optionally, when the first information includes text information, the processor 601 is further configured to, when performing the simulated display according to the first information: converting the text information into braille characters; detecting a hand touch position of a user; determining a plurality of sensing positions corresponding to the hand touch positions according to the hand touch positions; and displaying the braille characters at the plurality of induction positions in sequence by adjusting the vibration mode of the motor according to a preset rule.
Optionally, when the first information includes smell information, the processor 601, when performing the simulated display according to the first information, is further configured to: decoding the scent information; and performing simulation display on the decoded odor information through basic odor elements in the odor generator.
The mobile terminal 600 can implement each process implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition.
In the embodiment of the invention, first information sent by a sending end is received, the first information comprising one or more of touch information, text information, and smell information, and a simulated display is performed according to the first information. Tactile and olfactory information is thus added on top of existing visual and auditory communication: a simulated display is performed according to the touch information and odor information sent by the sending end, so that the touch of the other party's limbs and the odor of the environment where the other party is located can be perceived, which improves the authenticity of communication and enriches the user's communication experience. Meanwhile, a simulated display is performed according to the text information sent by the sending end, so that a blind user can perceive the text information sent by the other party even when voice communication is inconvenient, and effective communication can be carried out.
EXAMPLE seven
Referring to fig. 10, a schematic structural diagram of a mobile terminal according to a seventh embodiment of the present invention is shown.
The mobile terminal of the embodiment of the invention can be a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), or a vehicle-mounted computer.
The mobile terminal of fig. 10 includes a Radio Frequency (RF) circuit 710, a memory 720, an input unit 730, a display unit 740, a processor 760, an audio circuit 770, a Wireless Fidelity (Wi-Fi) module 780, and a power supply 790.
The input unit 730 may be used, among other things, to receive numeric or character information input by a user and to generate signal inputs related to user settings and function control of the mobile terminal. Specifically, in the embodiment of the present invention, the input unit 730 may include a touch panel 731. The touch panel 731, also referred to as a touch screen, can collect touch operations of a user on or near it (e.g., operations performed on the touch panel 731 with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 731 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, and sends the coordinates to the processor 760, and can also receive and execute commands sent from the processor 760. In addition, the touch panel 731 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 731, the input unit 730 may include other input devices 732, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and the like.
Among them, the display unit 740 may be used to display information input by a user or information provided to the user and various menu interfaces of the mobile terminal. The display unit 740 may include a display panel 741, and optionally, the display panel 741 may be configured in the form of an LCD or an Organic Light-Emitting Diode (OLED).
It should be noted that the touch panel 731 can cover the display panel 741 to form a touch display screen. When the touch display screen detects a touch operation on or near it, the operation is transmitted to the processor 760 to determine the type of the touch event, and the processor 760 then provides a corresponding visual output on the touch display screen according to the type of the touch event.
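This detect-classify-respond loop can be sketched as below; the event-type names, thresholds, and handlers are illustrative assumptions, not values from the patent:

```python
def classify_touch(event):
    """Classify a raw touch event (duration and displacement) into a
    touch-event type, as the processor would on receiving coordinates."""
    if event["duration_ms"] >= 500:
        return "long_press"
    if abs(event["dx"]) + abs(event["dy"]) > 10:
        return "swipe"
    return "tap"

# Each touch-event type maps to a corresponding visual output.
HANDLERS = {
    "tap": lambda e: "highlight item",
    "swipe": lambda e: "scroll view",
    "long_press": lambda e: "open context menu",
}

event = {"duration_ms": 80, "dx": 0, "dy": 25}
kind = classify_touch(event)
print(kind, "->", HANDLERS[kind](event))  # swipe -> scroll view
```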
The touch display screen comprises an application program interface display area and a common control display area. The arrangement modes of the application program interface display area and the common control display area are not limited, and can be an arrangement mode which can distinguish two display areas, such as vertical arrangement, left-right arrangement and the like. The application interface display area may be used to display an interface of an application. Each interface may contain at least one interface element such as an icon and/or widget desktop control for an application. The application interface display area may also be an empty interface that does not contain any content. The common control display area is used for displaying controls with high utilization rate, such as application icons like setting buttons, interface numbers, scroll bars, phone book icons and the like.
The processor 760 is a control center of the mobile terminal, connects various parts of the entire mobile phone using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the first memory 721 and calling data stored in the second memory 722, thereby integrally monitoring the mobile terminal. Alternatively, processor 760 may include one or more processing units.
In the embodiment of the present invention, the processor 760 is configured to receive the first information sent by the sending end by calling the software program and/or module stored in the first memory 721 and/or the data stored in the second memory 722; the first information comprises one or more of touch information, character information and smell information; and performing simulation display according to the first information.
Optionally, when the first information includes touch information, the processor 760, when receiving the first information sent by the sending end, is further configured to: receiving first touch information and second touch information sent by a sending end; the first touch information comprises touch speed information acquired by a touch screen, and the second touch information comprises touch position information and touch pressure information acquired by a pressure sensor; and calculating the touch speed information according to the touch displacement information and the touch time information of the finger on the touch screen.
Optionally, the processor 760, when performing the simulated display according to the first information, is further configured to: determining a corresponding sensing position according to the touch position information; adjusting the amplitude and frequency of the motor according to the distance between the induction position and the motor, the touch speed information and the touch pressure information; and in the induction position, performing simulation display through the motor.
Optionally, when the first information includes text information, the processor 760 is further configured to, when performing the simulated display according to the first information: converting the text information into braille characters; detecting a hand touch position of a user; determining a plurality of sensing positions corresponding to the hand touch positions according to the hand touch positions; and displaying the braille characters at the plurality of induction positions in sequence by adjusting the vibration mode of the motor according to a preset rule.
Optionally, when the first information includes smell information, the processor 760, when performing the simulated display according to the first information, is further configured to: decoding the scent information; and performing simulation display on the decoded odor information through basic odor elements in the odor generator.
In the embodiment of the invention, first information sent by a sending end is received, the first information comprising one or more of touch information, text information, and smell information, and a simulated display is performed according to the first information. Tactile and olfactory information is thus added on top of existing visual and auditory communication: a simulated display is performed according to the touch information and odor information sent by the sending end, so that the touch of the other party's limbs and the odor of the environment where the other party is located can be perceived, which improves the authenticity of communication and enriches the user's communication experience. Meanwhile, a simulated display is performed according to the text information sent by the sending end, so that a blind user can perceive the text information sent by the other party even when voice communication is inconvenient, and effective communication can be carried out.
For the apparatus embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and for relevant points, reference may be made to the partial description of the method embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (8)

1. An information perception method is applied to a mobile terminal, and is characterized by comprising the following steps:
receiving first information sent by a sending end; the first information comprises one or more of touch information, character information and smell information;
performing simulation display according to the first information;
wherein, the first information includes text information, the step of performing simulated display according to the first information includes:
converting the text information into braille characters; detecting a hand touch position of a user; determining a plurality of sensing positions corresponding to the hand touch positions according to the hand touch positions;
sequentially displaying the braille characters at the plurality of induction positions according to a preset rule by adjusting the vibration mode of a motor;
wherein determining a plurality of sensing locations corresponding to the hand touch location based on the hand touch location comprises: determining a hand outline position according to the hand touch position, and determining, within the hand outline position, 6 induction positions corresponding to the hand touch position and arranged in the same manner as the dots of the Braille characters;
be in according to preset rule a plurality of induction position, it is right in proper order through the vibrations mode of adjustment motor braille character demonstrates, include: confirming the corresponding induction positions of the currently displayed braille characters in the 6 induction positions, adjusting the vibration mode of the motor point by point according to the sequence from top to bottom and from left to right, and automatically displaying the induction positions corresponding to the currently displayed braille characters in sequence.
2. The method of claim 1, wherein when the first information comprises touch information, the step of receiving the first information sent by the sending end comprises:
receiving first touch information and second touch information sent by a sending end; the first touch information comprises touch speed information acquired by a touch screen, and the second touch information comprises touch position information and touch pressure information acquired by a pressure sensor;
and calculating the touch speed information according to the touch displacement information and the touch time information of the finger on the touch screen.
3. The method of claim 2, wherein the step of performing the simulated display according to the first information comprises:
determining a corresponding sensing position according to the touch position information;
adjusting the amplitude and frequency of the motor according to the distance between the induction position and the motor, the touch speed information and the touch pressure information;
and in the induction position, performing simulation display through the motor.
4. The method of claim 1, wherein when the first message includes scent information, the step of performing a simulated presentation based on the first message further comprises:
decoding the scent information;
and performing simulation display on the decoded odor information through basic odor elements in the odor generator.
5. A mobile terminal, comprising:
the receiving module is used for receiving first information sent by the sending end; the first information comprises one or more of touch information, character information and smell information;
the display module is used for performing simulation display according to the first information;
wherein the first information includes text information, and the display module includes:
the conversion submodule is used for converting the text information into braille characters;
the touch position detection submodule is used for detecting the hand touch position of a user;
the sensing position second determining submodule is used for determining a plurality of sensing positions corresponding to the hand touch positions according to the hand touch positions;
the second display submodule is used for sequentially displaying the braille characters at the plurality of induction positions by adjusting the vibration mode of the motor according to a preset rule;
the sensing position second determining submodule is specifically used for determining a hand outline position according to the hand touch position, and determining 6 sensing positions which correspond to the hand touch position and are arranged in the same manner as the arrangement of the braille characters in the hand outline position;
the second display submodule is specifically used for confirming the corresponding sensing positions of the currently displayed braille characters in the 6 sensing positions, adjusting the vibration mode of the motor point by point according to the sequence from top to bottom and from left to right, and automatically displaying the sensing positions corresponding to the currently displayed braille characters in sequence.
6. The mobile terminal of claim 5, wherein when the first information comprises touch information, the receiving module comprises:
the touch information receiving submodule is used for receiving the first touch information and the second touch information sent by the sending end; the first touch information comprises touch speed information acquired by a touch screen, and the second touch information comprises touch position information and touch pressure information acquired by a pressure sensor;
and calculating the touch speed information according to the touch displacement information and the touch time information of the finger on the touch screen.
7. The mobile terminal of claim 6, wherein the presentation module further comprises:
the sensing position first determining submodule is used for determining a corresponding sensing position according to the touch position information;
the motor adjusting submodule is used for adjusting the amplitude and the frequency of the motor according to the distance between the induction position and the motor, the touch speed information and the touch pressure information;
and the first display submodule is used for performing simulation display on the induction position through the motor.
8. The mobile terminal of claim 5, wherein when the first information comprises scent information, the presentation module further comprises:
the decoding submodule is configured to decode the scent information;
and the third display submodule is configured to present the decoded scent information by simulation through basic odor elements in a scent generator.
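Claim 8 does not specify an encoding for the scent information, only that it decodes into the generator's basic odor elements. A hypothetical decoder, assuming one intensity byte per basic element (the element names and byte layout are illustrative inventions for this sketch):

```python
def decode_scent(scent_code):
    """Decode a scent message into per-element intensities in [0, 1].

    scent_code: bytes, one byte (0-255) per basic odor element.
    """
    # Assumed basic odor elements of the generator; the patent does not name them.
    elements = ["floral", "fruity", "woody", "minty"]
    if len(scent_code) != len(elements):
        raise ValueError("expected one intensity byte per basic element")
    return {name: b / 255.0 for name, b in zip(elements, scent_code)}

print(decode_scent(bytes([255, 0, 51, 102])))
```

The scent generator would then drive each element at its decoded intensity.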
CN201710423564.XA 2017-06-07 2017-06-07 Information perception method and mobile terminal Active CN107370660B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710423564.XA CN107370660B (en) 2017-06-07 2017-06-07 Information perception method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710423564.XA CN107370660B (en) 2017-06-07 2017-06-07 Information perception method and mobile terminal

Publications (2)

Publication Number Publication Date
CN107370660A CN107370660A (en) 2017-11-21
CN107370660B true CN107370660B (en) 2020-09-01

Family

ID=60306509

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710423564.XA Active CN107370660B (en) 2017-06-07 2017-06-07 Information perception method and mobile terminal

Country Status (1)

Country Link
CN (1) CN107370660B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108032302A (en) * 2017-12-22 2018-05-15 胡明建 The design method that a kind of computer vision tactile smell is mutually mapped with the time
CN107891448A (en) * 2017-12-25 2018-04-10 胡明建 The design method that a kind of computer vision sense of hearing tactile is mutually mapped with the time
CN109979286A (en) * 2019-02-19 2019-07-05 维沃移动通信有限公司 A kind of information demonstrating method and terminal device
CN113055422A (en) * 2019-12-27 2021-06-29 中兴通讯股份有限公司 Smell transmission method, device, equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1546290A (en) * 2003-12-01 2004-11-17 东南大学 Space-time double channel robot tactility rendition current stimulation method and apparatus
CN101521693A (en) * 2008-12-22 2009-09-02 康佳集团股份有限公司 Method and terminal for helping blind people read short messages
CN102262476A (en) * 2010-03-12 2011-11-30 美国博通公司 Tactile Communication System And Method
CN103297591A (en) * 2012-02-24 2013-09-11 联想(北京)有限公司 Scent delivery and emission method and device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101907922B (en) * 2009-06-04 2015-02-04 新励科技(深圳)有限公司 Touch and touch control system

Also Published As

Publication number Publication date
CN107370660A (en) 2017-11-21

Similar Documents

Publication Publication Date Title
CN107370660B (en) Information perception method and mobile terminal
US8766786B2 (en) Device and method for providing tactile information
US9958944B2 (en) Encoding dynamic haptic effects
US9030428B2 (en) Generating haptic effects for dynamic events
JP6479148B2 (en) Enhanced dynamic haptic effect
CN107562345B (en) Information storage method and mobile terminal
EP2165251A1 (en) Method, apparatus and computer program product for providing a scrolling mechanism for touch screen devices
CN106168894B (en) Content display method and mobile terminal
US9329686B2 (en) Haptic feedback method, haptic feedback apparatus, electronic device and stylus
EP2075671A1 (en) User interface of portable device and operating method thereof
US20190324539A1 (en) Systems and methods for providing dynamic haptic playback for an augmented or virtual reality environments
CN106775378B (en) Method for determining candidate words of input method and mobile terminal
CN108073380B (en) Electronic device, display control method and related product
JPWO2016038675A1 (en) Tactile sensation control system and tactile sensation control method
CN106407027B (en) Information display method of mobile terminal and mobile terminal
KR20110076283A (en) Method and apparatus for providing feedback according to user input patten
CN107728898B (en) Information processing method and mobile terminal
WO2020258074A1 (en) Method and device for generating haptic feedback
CN111444494B (en) Verification method, electronic device and computer readable storage medium
KR20110055096A (en) Apparatus and method for setting stereoscopic effect in a portable terminal
CN107294571B (en) Data transmission method, base station and mobile terminal
CN111443859A (en) Touch interaction method and electronic equipment
JP6483379B2 (en) Tactile sensation control system and tactile sensation control method
JP6314715B2 (en) Information processing device
KR101426791B1 (en) Apparatas and method of inputting selected symbol for detecting input gesture in a electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant