CN108268835A - sign language interpretation method, mobile terminal and computer readable storage medium - Google Patents

sign language interpretation method, mobile terminal and computer readable storage medium Download PDF

Info

Publication number
CN108268835A
CN108268835A CN201711459455.XA
Authority
CN
China
Prior art keywords
sign language
mobile terminal
voice
action
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711459455.XA
Other languages
Chinese (zh)
Inventor
张佳博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201711459455.XA priority Critical patent/CN108268835A/en
Publication of CN108268835A publication Critical patent/CN108268835A/en
Pending legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G06V40/28 Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/60 Information retrieval; Database structures therefor; File system structures therefor of audio data

Abstract

The invention discloses a sign language interpretation method, which includes: collecting a gesture action of a user; matching a standard sign language action from a database according to the gesture action; searching the database for the voice corresponding to the standard sign language action; and outputting the found voice. Embodiments of the invention also disclose a mobile terminal and a computer readable storage medium. Translation between sign language and voice or text can thereby be realized on the mobile terminal, facilitating communication between deaf-mute people and hearing people.

Description

Sign language interpretation method, mobile terminal and computer readable storage medium
Technical field
The present invention relates to the technical field of image and voice recognition, and more particularly to a sign language interpretation method, a mobile terminal, and a computer readable storage medium.
Background technology
In real life, deaf-mute people generally communicate by sign language. However, when a deaf-mute person needs to communicate with a hearing person, communication is often inconvenient because most hearing people cannot understand sign language. Existing mobile terminals can realize translation between voice and text, but no mobile terminal product that can translate sign language has yet appeared.
Invention content
The main objective of the present invention is to propose a sign language interpretation method and a corresponding mobile terminal, aiming to solve the problem of how to realize translation between sign language and voice or text.
To achieve the above objective, the present invention provides a sign language interpretation method applied to a mobile terminal, the method including the steps of:
collecting a gesture action of a user;
matching a standard sign language action from a database according to the gesture action;
searching the database for the voice corresponding to the standard sign language action; and
outputting the found voice.
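The four steps above amount to a match-and-lookup pipeline. The sketch below illustrates it under heavy assumptions: the in-memory database, every function name, and the idea that the recognizer already yields a candidate label are all hypothetical, since the patent does not specify data structures or APIs.

```python
# Hypothetical sketch of the claimed pipeline: collect -> match -> look up voice -> output.
# SIGN_DATABASE and all names here are invented for illustration only.

SIGN_DATABASE = {
    # standard sign language action label -> corresponding voice clip
    "thanks": "thanks.wav",
    "hello": "hello.wav",
}

def match_standard_action(gesture_label, database):
    """Match the collected gesture against the standard actions in the database."""
    # A real system would compare pose/trajectory features; here we assume the
    # recognizer already produced a candidate label for the gesture action.
    return gesture_label if gesture_label in database else None

def translate_gesture(gesture_label):
    """Match a standard sign language action and return the voice to output."""
    action = match_standard_action(gesture_label, SIGN_DATABASE)
    if action is None:
        return None               # no matching standard sign language action
    return SIGN_DATABASE[action]  # the voice the terminal would play

print(translate_gesture("thanks"))   # -> thanks.wav
print(translate_gesture("unknown"))  # -> None
```

In a real implementation the returned value would be audio data handed to the voice playing unit rather than a file name.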
Optionally, the method further includes the step of:
transmitting the found voice to another mobile terminal for output.
Optionally, after the step of matching a standard sign language action from the database according to the gesture action, the method further includes the steps of:
searching the database for the text corresponding to the standard sign language action;
outputting the found text, or sending the found text to another mobile terminal for output.
Optionally, the step of collecting a gesture action of the user specifically includes:
shooting a gesture image;
dividing the gesture image into action segments;
identifying a gesture action from each action segment;
performing judgement and correction on the identified gesture actions.
Optionally, in the step of dividing the gesture image into action segments, the division may be performed according to a set time interval, and after a previous action segment is divided off, the start time of the next action segment is rolled back by a preset period.
Optionally, the method further includes the steps of:
collecting a voice input by the user;
searching the database for the sign language action corresponding to the voice;
outputting the found sign language action.
Optionally, after the step of searching the database for the sign language action corresponding to the voice, the method further includes the step of:
sending the found sign language action to another mobile terminal for output.
Optionally, in the step of searching the database for the sign language action corresponding to the voice, the database stores various sign language actions and their corresponding text; after the voice is collected, it is first converted into corresponding text by speech recognition technology, and the database is then searched according to the converted text to obtain the corresponding sign language action.
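The reverse direction just described (voice to text to sign language action) can be sketched as a two-stage lookup. The stub recognizer and the toy database below are assumptions for illustration; the patent specifies only the two stages, not any concrete speech engine.

```python
# Hypothetical sketch: a collected voice is first converted to text by speech
# recognition, then the text is used as the key to find the stored sign action.

SIGN_ACTION_BY_TEXT = {
    "thank you": "sign_clip_thanks",
    "hello": "sign_clip_hello",
}

def recognize_speech(audio):
    """Stand-in for a real speech recognition engine."""
    return audio.get("transcript", "")

def voice_to_sign(audio, database):
    text = recognize_speech(audio)  # stage 1: voice -> text
    return database.get(text)       # stage 2: text -> sign language action

print(voice_to_sign({"transcript": "thank you"}, SIGN_ACTION_BY_TEXT))  # -> sign_clip_thanks
```

The returned sign language action would then be displayed on the screen or sent to another terminal, as the optional steps above describe.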
In addition, to achieve the above objective, the present invention also proposes a mobile terminal, which includes: a memory, a processor, a voice collecting unit, an image collecting unit, a voice playing unit, a screen, and a sign language interpretation program stored on the memory and executable on the processor, wherein the steps of the sign language interpretation method described above are realized when the sign language interpretation program is executed by the processor.
Further, to achieve the above objective, the present invention also provides a computer readable storage medium on which a sign language interpretation program is stored, wherein the steps of the sign language interpretation method described above are realized when the sign language interpretation program is executed by a processor.
With the sign language interpretation method, mobile terminal, and computer readable storage medium proposed by the present invention, translation between sign language and voice or text can be realized through the mobile terminal, facilitating communication between deaf-mute and hearing people.
Description of the drawings
Fig. 1 is a hardware architecture diagram of a mobile terminal for realizing each embodiment of the present invention;
Fig. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a flowchart of the sign language interpretation method proposed by the first embodiment of the invention;
Fig. 4 is a refined flowchart of step S300 in Fig. 3;
Fig. 5 is a schematic diagram of collecting a gesture image in the present invention;
Fig. 6 is a flowchart of the sign language interpretation method proposed by the second embodiment of the invention;
Fig. 7 is a flowchart of the sign language interpretation method proposed by the third embodiment of the invention;
Fig. 8 is a flowchart of the sign language interpretation method proposed by the fourth embodiment of the invention;
Fig. 9 is a schematic diagram of displaying a sign language image in the present invention;
Fig. 10 is a flowchart of the sign language interpretation method proposed by the fifth embodiment of the invention;
Fig. 11 is a module diagram of the mobile terminal proposed by the sixth embodiment of the invention;
Fig. 12 is a module diagram of the sign language interpretation system proposed by the seventh to eleventh embodiments of the invention.
The realization of the objectives, functional features, and advantages of the present invention will be further described with reference to the accompanying drawings in conjunction with the embodiments.
Specific embodiment
It should be appreciated that the specific embodiments described herein are merely illustrative of the present invention and are not intended to limit it.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements merely facilitate the explanation of the present invention and have no specific meaning in themselves; therefore, "module", "component", and "unit" can be used interchangeably.
Terminals can be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, tablet computers, laptop computers, palmtop computers, personal digital assistants (PDA), portable media players (PMP), navigation devices, wearable devices, smart bracelets, and pedometers, as well as fixed terminals such as digital TVs and desktop computers.
The following description takes a mobile terminal as an example; those skilled in the art will appreciate that, apart from elements used specifically for mobile purposes, the construction according to the embodiments of the present invention can also be applied to fixed-type terminals.
Referring to Fig. 1, a hardware architecture diagram of a mobile terminal for realizing each embodiment of the present invention, the mobile terminal 100 may include components such as an RF (Radio Frequency) unit 101, a WiFi module 102, an audio output unit 103, an A/V (audio/video) input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111. Those skilled in the art will understand that the mobile terminal structure shown in Fig. 1 does not constitute a limitation on the mobile terminal; the mobile terminal may include more or fewer components than illustrated, combine certain components, or arrange the components differently.
The components of the mobile terminal are described in detail below with reference to Fig. 1:
The radio frequency unit 101 can be used for receiving and sending signals during messaging or a call; specifically, downlink information from a base station is received and then handed to the processor 110 for processing, and uplink data is sent to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplexing-Long Term Evolution), and TDD-LTE (Time Division Duplexing-Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse web pages, and access streaming media, providing the user with wireless broadband Internet access. Although Fig. 1 shows the WiFi module 102, it is understood that it is not an essential part of the mobile terminal and can be omitted as needed without changing the essence of the invention.
The audio output unit 103 can convert audio data received by the radio frequency unit 101 or the WiFi module 102, or stored in the memory 109, into an audio signal and output it as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a speech recognition mode, a broadcast reception mode, or the like. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the mobile terminal 100 (for example, a call signal reception sound or a message reception sound). The audio output unit 103 may include a loudspeaker, a buzzer, and the like.
The A/V input unit 104 is used to receive audio or video signals. The A/V input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of static pictures or video obtained by an image capture apparatus (such as a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or another storage medium), or sent via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 can receive sound (audio data) in operational modes such as a phone call mode, a recording mode, or a speech recognition mode, and can process such sound into audio data. In the case of the phone call mode, the processed audio (voice) data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101. The microphone 1042 can implement various types of noise elimination (or suppression) algorithms to eliminate (or suppress) noise or interference generated while sending and receiving audio signals.
The mobile terminal 100 further includes at least one sensor 105, such as an optical sensor, a motion sensor, and other sensors. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in all directions (generally three axes) and can detect the magnitude and direction of gravity when stationary; it can be used for applications that identify the posture of the mobile phone (such as horizontal/vertical screen switching, related games, and magnetometer pose calibration) and for vibration-identification functions (such as pedometer and tapping). The mobile phone can also be configured with other sensors such as a fingerprint sensor, a pressure sensor, an iris sensor, a molecule sensor, a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which will not be described in detail here.
The display unit 106 is used to display information input by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, or the like.
The user input unit 107 can be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, collects the user's touch operations on or near it (for example, operations by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory) and drives the corresponding connecting device according to a preset program. The touch panel 1071 may include two parts: a touch detecting apparatus and a touch controller. The touch detecting apparatus detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detecting apparatus, converts it into contact coordinates, sends them to the processor 110, and can receive and execute commands sent by the processor 110. In addition, the touch panel 1071 may be realized in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072, which may specifically include but are not limited to one or more of a physical keyboard, function keys (such as volume control buttons and a switch key), a trackball, a mouse, and a joystick; no specific limitation is made here.
Further, the touch panel 1071 can cover the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event. Although in Fig. 1 the touch panel 1071 and the display panel 1061 realize the input and output functions of the mobile terminal as two independent components, in certain embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the mobile terminal; no specific limitation is made here.
The interface unit 108 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 can be used to receive input (for example, data information or electric power) from an external device and transfer the received input to one or more elements within the mobile terminal 100, or can be used to transmit data between the mobile terminal 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area: the program storage area can store an operating system, application programs required by at least one function (such as a sound playing function and an image playing function), and the like; the data storage area can store data created according to the use of the mobile phone (such as audio data and a phone book), and the like. In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, for example at least one magnetic disk storage device, a flash memory device, or other solid-state storage components.
The processor 110 is the control center of the mobile terminal. It connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby monitoring the mobile terminal as a whole. The processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interfaces, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
The mobile terminal 100 may also include a power supply 111 (such as a battery) that supplies power to the various components. Preferably, the power supply 111 can be logically connected with the processor 110 through a power management system, so that functions such as charging management, discharging management, and power consumption management are realized through the power management system.
Although not shown in Fig. 1, the mobile terminal 100 may also include a Bluetooth module and the like, which will not be described in detail here.
To facilitate understanding of the embodiments of the present invention, the communication network system on which the mobile terminal of the present invention is based is described below.
Referring to Fig. 2, Fig. 2 is an architecture diagram of a communication network system provided by an embodiment of the present invention. The communication network system is an LTE system of universal mobile communication technology, and the LTE system includes, connected in communication in sequence, a UE (User Equipment) 201, an E-UTRAN (Evolved UMTS Terrestrial Radio Access Network) 202, an EPC (Evolved Packet Core) 203, and an operator's IP services 204.
Specifically, the UE 201 can be the above-mentioned terminal 100, which will not be described again here.
The E-UTRAN 202 includes an eNodeB 2021 and other eNodeBs 2022. The eNodeB 2021 can be connected with the other eNodeBs 2022 through a backhaul (for example, an X2 interface), and the eNodeB 2021 is connected to the EPC 203; the eNodeB 2021 can provide access of the UE 201 to the EPC 203.
The EPC 203 may include an MME (Mobility Management Entity) 2031, an HSS (Home Subscriber Server) 2032, other MMEs 2033, an SGW (Serving Gateway) 2034, a PGW (PDN Gateway) 2035, a PCRF (Policy and Charging Rules Function) 2036, and so on. The MME 2031 is a control node that handles signaling between the UE 201 and the EPC 203, providing bearer and connection management. The HSS 2032 is used to provide registers for managing functions such as a home location register (not shown) and to store user-specific information about service features, data rates, and the like. All user data can be sent through the SGW 2034; the PGW 2035 can provide IP address allocation and other functions for the UE 201; and the PCRF 2036 is the policy and charging control decision point for service data flows and IP bearer resources, selecting and providing available policy and charging control decisions for the policy and charging enforcement function unit (not shown).
The IP services 204 may include the Internet, intranets, an IMS (IP Multimedia Subsystem), or other IP services.
Although the above description takes the LTE system as an example, those skilled in the art should understand that the present invention is applicable not only to the LTE system but also to other wireless communication systems, such as GSM, CDMA2000, WCDMA, TD-SCDMA, and future new network systems, which are not limited here.
Based on the above mobile terminal hardware structure and communication network system, the method embodiments of the present invention are proposed.
The sign language interpretation method proposed by the present invention is used to translate a user's gesture image into voice or text for output, or to translate a voice input by the user into a sign language output.
Embodiment one
As shown in Fig. 3, the first embodiment of the invention proposes a sign language interpretation method, which includes the following steps:
S300: collect a gesture action of the user.
Specifically, when a deaf-mute person needs to communicate with others through sign language gestures, the mobile terminal 2 collects a gesture image of the user through the image collecting unit 24 and identifies the gesture action from it.
Refer to Fig. 4, the refined flowchart of step S300. In the present embodiment, step S300 specifically includes:
S3000: shoot a gesture image.
Specifically, when the user makes a sign language gesture, a gesture image of the user (see Fig. 5) is shot by the image collecting unit 24 of the mobile terminal 2. The gesture image is a piece of video.
S3002: divide the gesture image into action segments.
Specifically, action segment division refers to dividing a series of gesture actions in a piece of video into multiple action segments; the division may be performed according to a set time interval. To ensure that no gesture action is missed, a smaller time interval can be set. Also, after a previous action segment is divided off, the start time of the next action segment can be rolled back by a preset period, ensuring that adjacent action segments overlap and avoiding a gesture action being cut into two segments and thereby omitted.
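The interval-plus-rollback division just described can be illustrated with a short sketch. The concrete interval and rollback values below are illustrative assumptions; the patent leaves both as configurable preset quantities.

```python
# Sketch of the overlapping segmentation: cut the video at a fixed interval,
# and start each segment after the first a preset period before the point
# where the previous one ended, so adjacent segments overlap and no gesture
# action is cut in two. Times are in seconds; values are illustrative.

def split_segments(video_length, interval=2.0, rollback=0.5):
    segments = []
    start = 0.0
    while start < video_length:
        end = min(start + interval, video_length)
        segments.append((start, end))
        if end >= video_length:
            break
        start = end - rollback  # roll back so consecutive segments overlap
    return segments

print(split_segments(5.0))  # -> [(0.0, 2.0), (1.5, 3.5), (3.0, 5.0)]
```

Note that every segment after the first begins before the previous one ends, which is exactly the overlap the paragraph above relies on to avoid omissions.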
S3004: identify a gesture action from each action segment.
Specifically, after the gesture image has been divided, a valid gesture action is identified from each action segment: each complete gesture action is identified, and the incomplete, cut-off gesture actions that may appear at the beginning and end of a segment are removed.
S3006: perform judgement and correction on the identified gesture actions.
Specifically, judgement and correction means that the user's gesture actions may deviate from the standard sign language actions stored in the database, for example in movement amplitude; therefore, the gesture actions identified from the action segments can first be corrected so as to better match the standard sign language actions in the database.
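One plausible, purely illustrative reading of this judgement-and-correction step is a nearest-neighbor snap: an identified gesture whose features deviate from the stored standards is corrected to the closest standard action. The feature vectors and the distance measure below are invented; the patent does not specify how the comparison is performed.

```python
# Hypothetical correction step: snap an identified gesture's feature vector to
# the nearest standard sign language action, so differences in movement
# amplitude between users do not prevent a match. All data here is made up.

STANDARD_ACTIONS = {
    "thanks": (1.0, 0.8),  # made-up feature vectors for standard actions
    "hello": (0.2, 0.1),
}

def correct_gesture(features):
    """Return the standard action label closest to the observed features."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(STANDARD_ACTIONS, key=lambda k: distance(features, STANDARD_ACTIONS[k]))

print(correct_gesture((0.9, 0.7)))  # -> thanks
```

A production system would more likely correct the gesture trajectory itself before matching, but the snap-to-nearest idea captures why correction improves the match rate against the database.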
Returning to Fig. 3, step S302: match a standard sign language action from the database according to the gesture action.
Specifically, the gesture action is compared with the standard sign language actions stored in the database to find the standard sign language action that matches it. For example, the gesture action collected in Fig. 5 is compared with each standard sign language action in the database, and the matching standard sign language action expressing "thanks" is found.
S304: search the database for the voice corresponding to the standard sign language action.
Specifically, the database also stores the voice data corresponding to each standard sign language action. After the standard sign language action matching the gesture action is found, the database is further searched for the voice corresponding to that standard sign language action, so that the gesture action is translated into voice. For example, according to the standard sign language action expressing "thanks", the voice "thanks" is found in the database.
S306: output the found voice.
Specifically, the mobile terminal 2 plays the voice found according to the standard sign language action through the voice playing unit 25, so that a user who does not understand sign language can understand the meaning of the gesture action.
In the present embodiment, the sign language interpretation method can be applied to a scene in which a deaf-mute user makes gesture actions that are translated into voice and played for others to hear, and also to a scene in which a hearing user shoots the gesture actions of a deaf-mute person he or she needs to communicate with, which are translated into voice and played for the user to hear. In both scenes, communication between deaf-mute and hearing people can be realized very conveniently through the mobile terminal 2.
Embodiment two
As shown in Fig. 6, the second embodiment of the invention proposes a sign language interpretation method. In the second embodiment, steps S600-S604 of the method are similar to steps S300-S304 of the first embodiment; the difference is that the method further includes step S606.
This method includes the following steps:
S600: collect a gesture action of the user.
Specifically, when a deaf-mute person needs to communicate with others through sign language gestures, the mobile terminal 2 collects a gesture image of the user through the image collecting unit 24 and identifies the gesture action from it. For the refined flow of this step, refer to Fig. 4 and the first embodiment; details are not repeated here.
S602: match a standard sign language action from the database according to the gesture action.
Specifically, the gesture action is compared with the standard sign language actions stored in the database to find the standard sign language action that matches it. For example, the gesture action collected in Fig. 5 is compared with each standard sign language action in the database, and the matching standard sign language action expressing "thanks" is found.
S604: search the database for the voice corresponding to the standard sign language action.
Specifically, the database also stores the voice data corresponding to each standard sign language action. After the standard sign language action matching the gesture action is found, the database is further searched for the voice corresponding to that standard sign language action, so that the gesture action is translated into voice. For example, according to the standard sign language action expressing "thanks", the voice "thanks" is found in the database.
S606: transmit the found voice to another mobile terminal 2 for output.
Specifically, the mobile terminal 2 (the first mobile terminal) transmits the voice found according to the standard sign language action to another mobile terminal 2 (the second mobile terminal) through the network 6, so that the second mobile terminal can play the voice and its user can understand the meaning of the gesture action.
In the present embodiment, the sign language interpretation method can be applied to a scene in which a deaf-mute user (of the first mobile terminal) makes gesture actions that are translated into voice, sent to the second mobile terminal, and played for others to hear, so that communication between deaf-mute and hearing people is realized easily.
It is worth noting that, in other embodiments, the first mobile terminal may also send the collected gesture action directly to the second mobile terminal, which then performs steps S602 and S604 and plays the translated voice.
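The transmission in S606 could, under the assumption of a plain TCP channel (the patent says only "through the network 6"), look like the loopback sketch below, where the first terminal sends the looked-up voice identifier and the second terminal receives it for playback. Both ends run in one process here purely for demonstration.

```python
# Hypothetical sketch of S606: the first terminal sends the found voice to a
# second terminal over a socket. A real system would stream the audio data
# itself; this demo sends only a clip identifier over loopback.
import socket
import threading

def second_terminal(server_sock, received):
    """Receiving end: accept one connection and record the voice to play."""
    conn, _ = server_sock.accept()
    with conn:
        received.append(conn.recv(1024).decode())

server = socket.socket()
server.bind(("127.0.0.1", 0))   # any free port
server.listen(1)
port = server.getsockname()[1]

received = []
t = threading.Thread(target=second_terminal, args=(server, received))
t.start()

# First terminal: transmit the voice found for the matched standard sign action.
with socket.create_connection(("127.0.0.1", port)) as c:
    c.sendall(b"thanks.wav")

t.join()
server.close()
print(received)  # -> ['thanks.wav']
```

The design point is simply that matching and lookup happen on the sender, so the receiver only needs playback capability, which mirrors the division of work described in this embodiment.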
Embodiment Three
As shown in Fig. 7, the third embodiment of the present invention provides a sign language translation method. The steps of this method are similar to those of the first and second embodiments; the difference is that this method can also translate the gesture motion into text.
This method includes the following steps:
S700: capture the user's gesture motion.
Specifically, when a deaf-mute needs to communicate with others through sign language gestures, the mobile terminal 2 captures the user's gesture images through the image capture unit 24 and identifies the gesture motions from them. For the detailed flow of this step, refer to Fig. 4 and the first embodiment; the details are not repeated here.
S702: match a standard sign language action from the database according to the gesture motion.
Specifically, the gesture motion is compared with the standard sign language actions stored in the database to find the standard action that matches it. For example, the gesture motion captured in Fig. 5 is compared against each standard sign language action in the database, and the matching standard action meaning "thanks" is found.
S704: look up, in the database, the text corresponding to the standard sign language action.
Specifically, the database also stores the text data corresponding to each standard sign language action. After the standard action matching the gesture motion is found, the corresponding text is further retrieved from the database, and the gesture motion is thereby translated into text. For example, for the standard action meaning "thanks", the text "thanks" is retrieved from the database.
S706: output the found text.
Specifically, the mobile terminal 2 displays the text found for the standard sign language action on the screen 26, so that a user who does not know sign language can understand the meaning of the gesture motion.
In this embodiment, the sign language translation method applies both to the scenario in which a deaf-mute user makes gesture motions that are translated into text and displayed to others, and to the scenario in which a hearing user films the gestures of a deaf-mute he or she needs to communicate with and views the translated text. In both scenarios, the mobile terminal 2 conveniently enables communication between deaf-mute and hearing users.
Further, step S706 may be replaced with: sending the found text to another mobile terminal 2 for output.
Specifically, the mobile terminal 2 (the first mobile terminal) sends the text found for the standard sign language action to another mobile terminal 2 (the second mobile terminal) via the network 6, so that the second mobile terminal can display the text and its user can understand the meaning of the gesture motion.
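The patent specifies the transfer only as "via the network 6". A minimal sketch of the idea, using a plain TCP loopback socket to stand in for the two terminals; the port number and JSON framing are illustrative assumptions:

```python
import json
import socket
import threading
import time

PORT = 50007  # arbitrary port chosen for this sketch

def serve_once(received):
    """Second terminal: accept one connection and store the received text."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            payload = json.loads(conn.recv(4096).decode("utf-8"))
            received.append(payload["text"])

def send_translation(text, retries=100):
    """First terminal: send the translated text, retrying until the
    second terminal is listening."""
    for _ in range(retries):
        try:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
                cli.connect(("127.0.0.1", PORT))
                cli.sendall(json.dumps({"text": text}).encode("utf-8"))
                return
        except ConnectionRefusedError:
            time.sleep(0.05)
    raise RuntimeError("second terminal never became reachable")

received = []
listener = threading.Thread(target=serve_once, args=(received,))
listener.start()
send_translation("thanks")
listener.join()
```

After the exchange, the "second terminal" holds the text `"thanks"` and can display it; in a real deployment the two endpoints would be separate devices rather than threads in one process.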
In this embodiment, the sign language translation method applies to the scenario in which a deaf-mute user at the first mobile terminal makes gesture motions that are translated into text, sent to the second mobile terminal, and displayed to the other party, thus conveniently enabling communication between deaf-mute and hearing users.
It is worth noting that, in other embodiments, the first mobile terminal may instead send the captured gesture motion directly to the second mobile terminal, which then performs steps S702 and S704 and displays the translated text.
Embodiment Four
As shown in Fig. 8, the fourth embodiment of the present invention provides a sign language translation method, which includes the following steps:
S800: capture the voice input by the user.
Specifically, when a hearing person who does not know sign language needs to communicate with a deaf-mute, he or she can first input voice through the mobile terminal 2, which then translates it into the corresponding sign language actions. The mobile terminal 2 captures the user's voice through the voice capture unit 23, for example the phrase "thanks" spoken by the user.
S802: look up, in the database, the sign language action corresponding to the voice.
Specifically, the database stores various sign language actions and their meanings (which may be represented in text form). After the user's voice is captured, it is first converted into the corresponding text by speech recognition, and the database is then searched with the converted text to obtain the corresponding sign language action. For example, when the captured voice is "thanks", it is first converted into the text "thanks" by speech recognition, and the sign language action meaning "thanks" is then retrieved from the database.
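A sketch of this recognize-then-look-up step. The `recognize_speech` stub stands in for a real speech recognition engine, which the patent does not name, and the meaning-to-action table and clip filenames are assumptions for illustration:

```python
# Hypothetical meaning -> action table (the patent stores meanings
# "in text form" alongside each sign language action).
ACTION_DB = {
    "thanks": "sign_clip_thanks.mp4",
    "hello": "sign_clip_hello.mp4",
}

def recognize_speech(audio):
    """Stub for a speech recognition engine: audio in, text out.

    A real implementation would call an on-device or cloud recognizer;
    here we assume the audio object already carries its transcript.
    """
    return audio["transcript"]

def voice_to_sign(audio):
    """Step S802: convert voice to text, then look up the sign action."""
    text = recognize_speech(audio).strip().lower()
    return ACTION_DB.get(text)
```

Normalizing the transcript before the lookup (`strip().lower()`) keeps minor recognizer variations from missing an exact-match key.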
S804: output the found sign language action.
Specifically, the mobile terminal 2 plays the sign language action found for the voice on the screen 26, so that a deaf-mute can understand the meaning of the voice. For example, the mobile terminal 2 displays the sign language action meaning "thanks" on the screen 26 (refer to Fig. 9), and by watching the action the deaf-mute can understand that the voice means "thanks".
In this embodiment, the sign language translation method applies both to the scenario in which a hearing user inputs voice that is translated into sign language actions and played for a deaf-mute, and to the scenario in which a deaf-mute user captures the voice of a hearing person he or she needs to communicate with and views the translated sign language actions. In both scenarios, the mobile terminal 2 conveniently enables communication between deaf-mute and hearing users.
Embodiment Five
As shown in Fig. 10, the fifth embodiment of the present invention provides a sign language translation method. Steps S900-S902 of this method are similar to steps S800-S802 of the fourth embodiment; the difference is that this method further includes step S904.
This method includes the following steps:
S900: capture the voice input by the user.
Specifically, when a hearing person who does not know sign language needs to communicate with a deaf-mute, he or she can first input voice through the mobile terminal 2, which then translates it into the corresponding sign language actions. The mobile terminal 2 captures the user's voice through the voice capture unit 23, for example the phrase "thanks" spoken by the user.
S902: look up, in the database, the sign language action corresponding to the voice.
Specifically, the database stores various sign language actions and their meanings (which may be represented in text form). After the user's voice is captured, it is first converted into the corresponding text by speech recognition, and the database is then searched with the converted text to obtain the corresponding sign language action. For example, when the captured voice is "thanks", it is first converted into the text "thanks" by speech recognition, and the sign language action meaning "thanks" is then retrieved from the database.
S904: send the found sign language action to another mobile terminal 2 for output.
Specifically, the mobile terminal 2 (the first mobile terminal) sends the sign language action found for the voice to another mobile terminal 2 (the second mobile terminal) via the network 6, so that the second mobile terminal can play the action and its user can understand the meaning of the voice.
In this embodiment, the sign language translation method applies to the scenario in which a hearing user at the first mobile terminal inputs voice that is translated into sign language actions, sent to the second mobile terminal, and played for a deaf-mute, thus conveniently enabling communication between deaf-mute and hearing users.
It is worth noting that, in other embodiments, the first mobile terminal may instead send the captured voice directly to the second mobile terminal, which then performs step S902 and plays the translated sign language action.
The present invention further provides a mobile terminal comprising a memory, a processor, a voice capture unit, an image capture unit, a voice playback unit, a screen, and a sign language translation system. The sign language translation system translates the user's gesture images into voice or text for output, or translates the user's voice input into sign language for output.
Embodiment Six
As shown in Fig. 11, the sixth embodiment of the present invention provides a mobile terminal 2. The mobile terminal 2 includes a memory 20, a processor 22, a voice capture unit 23, an image capture unit 24, a voice playback unit 25, a screen 26, and a sign language translation system 28.
The memory 20 includes at least one type of readable storage medium and stores the operating system and various application software installed on the mobile terminal 2, such as the program code of the sign language translation system 28. The memory 20 may also temporarily store data that has been output or is to be output.
In some embodiments, the processor 22 may be a central processing unit (CPU), a controller, a microcontroller, a microprocessor, or another data processing chip. The processor 22 generally controls the overall operation of the mobile terminal 2. In this embodiment, the processor 22 runs the program code stored in the memory 20 and processes data, for example by running the sign language translation system 28.
The voice capture unit 23 (e.g., a microphone) captures the user's voice.
The image capture unit 24 (e.g., a camera) captures the user's gesture images.
The voice playback unit 25 (e.g., a loudspeaker) plays the voice obtained by translation.
The screen 26 displays the sign language actions or text obtained by translation.
Embodiment Seven
As shown in Fig. 12, the seventh embodiment of the present invention provides a sign language translation system 28. In this embodiment, the sign language translation system 28 includes:
A capture module 800 for capturing the user's gesture motions.
Specifically, when a deaf-mute needs to communicate with others through sign language gestures, the capture module 800 captures the user's gesture images through the image capture unit 24 and identifies the gesture motions from them. The process of capturing the user's gesture motions includes:
(1) Shoot the gesture images.
Specifically, when the user makes sign language gestures, the image capture unit 24 of the mobile terminal 2 films the user's gesture images (refer to Fig. 5). The gesture images form a video segment.
(2) Divide the gesture images into action segments.
Specifically, action segment division means splitting the series of gesture motions in a video into multiple action segments; the division may be based on time intervals. To ensure that no gesture motion is missed, a smaller time interval can be set. Moreover, after one action segment is divided off, the start time of the next segment can be moved back slightly so that adjacent segments overlap, which prevents a gesture motion from being cut into two segments and therefore missed.
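The overlapping split described above can be sketched as follows; the segment length and overlap values are illustrative assumptions, since the patent only requires that adjacent segments overlap:

```python
def split_into_segments(video_len_s, segment_s=2.0, overlap_s=0.5):
    """Split [0, video_len_s) into fixed-length windows whose start times
    are pulled back by `overlap_s`, so adjacent segments overlap and a
    gesture falling on a boundary appears whole in at least one segment."""
    segments = []
    start = 0.0
    while start < video_len_s:
        end = min(start + segment_s, video_len_s)
        segments.append((start, end))
        if end >= video_len_s:
            break
        start = end - overlap_s  # move the next start back: overlap region
    return segments
```

For a 5-second clip this yields `(0.0, 2.0)`, `(1.5, 3.5)`, `(3.0, 5.0)`: every boundary is covered by 0.5 s of overlap, at the cost of re-examining that region in two segments.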
(3) Identify the gesture motions from each action segment.
Specifically, after the gesture images have been divided, the effective gesture motions are identified from each action segment: every complete gesture motion is recognized, while the truncated, incomplete gesture motions that may appear at the beginning and end of a segment are discarded.
(4) Apply judgment correction to the identified gesture motions.
Specifically, judgment correction accounts for the fact that the user's gesture motions may deviate from the standard sign language actions stored in the database, for example in movement amplitude. The gesture motions identified from the action segments can therefore be corrected first, so that they match the standard actions in the database better.
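One plausible form of such a correction, assuming each gesture is represented as a sequence of 2-D keypoints, is to normalize the movement amplitude before matching. The keypoint representation and the scaling scheme are assumptions for illustration; the patent does not specify how the correction is computed:

```python
def amplitude_normalize(points):
    """Rescale a gesture trajectory so its largest excursion from the
    centroid is 1.0, compensating for users who sign larger or smaller
    than the standard actions stored in the database."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    radius = max(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in points)
    if radius == 0:
        # Degenerate gesture with no movement: collapse to the origin.
        return [(0.0, 0.0)] * n
    return [((x - cx) / radius, (y - cy) / radius) for x, y in points]
```

After this step, a small and a large rendition of the same motion map to the same normalized trajectory, so the comparison in the matching module depends on the shape of the gesture rather than its size.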
A matching module 802 for matching a standard sign language action from the database according to the gesture motion.
Specifically, the gesture motion is compared with the standard sign language actions stored in the database to find the standard action that matches it. For example, the gesture motion captured in Fig. 5 is compared against each standard sign language action in the database, and the matching standard action meaning "thanks" is found.
A lookup module 804 for looking up, in the database, the voice corresponding to the standard sign language action.
Specifically, the database also stores the voice data corresponding to each standard sign language action. After the standard action matching the gesture motion is found, the corresponding voice is further retrieved from the database, and the gesture motion is thereby translated into that voice. For example, for the standard action meaning "thanks", the voice "thanks" is retrieved from the database.
An output module 806 for outputting the found voice.
Specifically, the output module 806 plays the voice found for the standard sign language action through the voice playback unit 25, so that a user who does not know sign language can understand the meaning of the gesture motion.
Embodiment Eight
As shown in Fig. 12, the eighth embodiment of the present invention provides a sign language translation system 28.
In this embodiment, the output module 806 is further configured to transmit the found voice to another mobile terminal 2 for output.
Specifically, the output module 806 of the mobile terminal 2 (the first mobile terminal) transmits the voice found for the standard sign language action to another mobile terminal 2 (the second mobile terminal) via the network 6, so that the second mobile terminal can play the voice and its user can understand the meaning of the gesture motion.
Embodiment Nine
As shown in Fig. 12, the ninth embodiment of the present invention provides a sign language translation system 28.
In this embodiment, the lookup module 804 is further configured to look up, in the database, the text corresponding to the standard sign language action.
Specifically, the database also stores the text data corresponding to each standard sign language action. After the standard action matching the gesture motion is found, the corresponding text is further retrieved from the database, and the gesture motion is thereby translated into text. For example, for the standard action meaning "thanks", the text "thanks" is retrieved from the database.
The output module 806 is further configured to output the found text.
Specifically, the mobile terminal 2 displays the text found for the standard sign language action on the screen 26, so that a user who does not know sign language can understand the meaning of the gesture motion.
Further, the output module 806 is configured to send the found text to another mobile terminal 2 for output.
Specifically, the output module 806 of the mobile terminal 2 (the first mobile terminal) sends the text found for the standard sign language action to another mobile terminal 2 (the second mobile terminal) via the network 6, so that the second mobile terminal can display the text and its user can understand the meaning of the gesture motion.
Embodiment Ten
As shown in Fig. 12, the tenth embodiment of the present invention provides a sign language translation system 28.
In this embodiment, the capture module 800 is further configured to capture the voice input by the user.
Specifically, when a hearing person who does not know sign language needs to communicate with a deaf-mute, he or she can first input voice through the mobile terminal 2, which then translates it into the corresponding sign language actions. The capture module 800 captures the user's voice through the voice capture unit 23, for example the phrase "thanks" spoken by the user.
The lookup module 804 is further configured to look up, in the database, the sign language action corresponding to the voice.
Specifically, the database stores various sign language actions and their meanings (which may be represented in text form). After the user's voice is captured, it is first converted into the corresponding text by speech recognition, and the database is then searched with the converted text to obtain the corresponding sign language action. For example, when the captured voice is "thanks", it is first converted into the text "thanks" by speech recognition, and the sign language action meaning "thanks" is then retrieved from the database.
The output module 806 is further configured to output the found sign language action.
Specifically, the mobile terminal 2 plays the sign language action found for the voice on the screen 26, so that a deaf-mute can understand the meaning of the voice. For example, the mobile terminal 2 displays the sign language action meaning "thanks" on the screen 26 (refer to Fig. 9), and by watching the action the deaf-mute can understand that the voice means "thanks".
Embodiment Eleven
As shown in Fig. 12, the eleventh embodiment of the present invention provides a sign language translation system 28.
In this embodiment, the output module 806 is further configured to send the found sign language action to another mobile terminal 2 for output.
Specifically, the output module 806 of the mobile terminal 2 (the first mobile terminal) sends the sign language action found for the voice to another mobile terminal 2 (the second mobile terminal) via the network 6, so that the second mobile terminal can play the action and its user can understand the meaning of the voice.
Embodiment Twelve
The present invention further provides another embodiment: a computer-readable storage medium storing a sign language translation program, the sign language translation program being executable by at least one processor so that the at least one processor performs the steps of the sign language translation method described above.
It should be noted that, herein, the terms "comprising", "including", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes it.
The serial numbers of the above embodiments are for description only and do not indicate the relative merits of the embodiments.
Through the above description of the embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, and of course also by hardware, although in many cases the former is the preferred implementation. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied as a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and including several instructions that cause a terminal (which may be a mobile phone, computer, server, air conditioner, network device, or the like) to perform the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to the above specific embodiments, which are merely illustrative rather than restrictive. Inspired by the present invention, those of ordinary skill in the art can devise many other forms without departing from the concept of the invention and the scope of the claims, and all of these fall within the protection of the present invention.

Claims (10)

1. A sign language translation method applied to a mobile terminal, characterized in that the method comprises the steps of:
capturing the gesture motion of a user;
matching a standard sign language action from a database according to the gesture motion;
looking up, in the database, the voice corresponding to the standard sign language action; and
outputting the found voice.
2. The sign language translation method according to claim 1, characterized in that the method further comprises the step of:
transmitting the found voice to another mobile terminal for output.
3. The sign language translation method according to claim 1, characterized in that, after the step of matching a standard sign language action from the database according to the gesture motion, the method further comprises the steps of:
looking up, in the database, the text corresponding to the standard sign language action; and
outputting the found text, or sending the found text to another mobile terminal for output.
4. The sign language translation method according to any one of claims 1-3, characterized in that the step of capturing the gesture motion of the user specifically comprises:
shooting gesture images;
dividing the gesture images into action segments;
identifying gesture motions from each action segment; and
applying judgment correction to the identified gesture motions.
5. The sign language translation method according to claim 4, characterized in that, in the step of dividing the gesture images into action segments, the division may be performed according to a set time interval, and after a previous action segment is divided off, the start time of the next action segment is moved back by a preset period.
6. The sign language translation method according to claim 1, characterized in that the method further comprises the steps of:
capturing the voice input by the user;
looking up, in the database, the sign language action corresponding to the voice; and
outputting the found sign language action.
7. The sign language translation method according to claim 6, characterized in that, after the step of looking up the sign language action corresponding to the voice in the database, the method further comprises the step of:
sending the found sign language action to another mobile terminal for output.
8. The sign language translation method according to claim 6, characterized in that, in the step of looking up the sign language action corresponding to the voice in the database, various sign language actions and the corresponding text are stored in the database; after the voice is captured, it is first converted into the corresponding text by speech recognition, and the database is then searched with the converted text to obtain the corresponding sign language action.
9. A mobile terminal, characterized in that the mobile terminal comprises: a memory, a processor, a voice capture unit, an image capture unit, a voice playback unit, a screen, and a sign language translation program stored in the memory and executable on the processor, wherein the sign language translation program, when executed by the processor, implements the steps of the sign language translation method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that a sign language translation program is stored on the computer-readable storage medium, and the sign language translation program, when executed by a processor, implements the steps of the sign language translation method according to any one of claims 1 to 8.
CN201711459455.XA 2017-12-28 2017-12-28 sign language interpretation method, mobile terminal and computer readable storage medium Pending CN108268835A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711459455.XA CN108268835A (en) 2017-12-28 2017-12-28 sign language interpretation method, mobile terminal and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN108268835A true CN108268835A (en) 2018-07-10

Family

ID=62772601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711459455.XA Pending CN108268835A (en) 2017-12-28 2017-12-28 sign language interpretation method, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN108268835A (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109063624A (en) * 2018-07-26 2018-12-21 深圳市漫牛医疗有限公司 Information processing method, system, electronic equipment and computer readable storage medium
CN109583359A (en) * 2018-11-26 2019-04-05 北京小米移动软件有限公司 Presentation content recognition methods, device, electronic equipment, machine readable storage medium
CN109920309A (en) * 2019-01-16 2019-06-21 深圳壹账通智能科技有限公司 Sign language conversion method, device, storage medium and terminal
CN109960813A (en) * 2019-03-18 2019-07-02 维沃移动通信有限公司 A kind of interpretation method, mobile terminal and computer readable storage medium
CN110032740A (en) * 2019-04-20 2019-07-19 卢劲松 It customizes individual character semanteme and learns application method
CN110840652A (en) * 2019-11-11 2020-02-28 北京海益同展信息科技有限公司 Wearable device, information processing method and device
CN110931042A (en) * 2019-11-14 2020-03-27 北京欧珀通信有限公司 Simultaneous interpretation method and device, electronic equipment and storage medium
CN111931523A (en) * 2020-04-26 2020-11-13 永康龙飘传感科技有限公司 Method and system for translating characters and sign language in news broadcast in real time
CN112597912A (en) * 2020-12-26 2021-04-02 中国农业银行股份有限公司 Conference content recording method, device, equipment and storage medium
CN112686132A (en) * 2020-12-28 2021-04-20 南京工程学院 Gesture recognition method and device
CN113220912A (en) * 2021-04-07 2021-08-06 深圳市宝尔爱迪科技有限公司 Interactive assistance method and device and computer readable storage medium
CN113449836A (en) * 2021-07-21 2021-09-28 温州亿通自动化设备有限公司 Kowtow counting method and device
CN113657173A (en) * 2021-07-20 2021-11-16 北京搜狗科技发展有限公司 Data processing method and device and data processing device
CN113851029A (en) * 2021-07-30 2021-12-28 阿里巴巴达摩院(杭州)科技有限公司 Barrier-free communication method and device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101605399A (en) * 2008-06-13 2009-12-16 英华达(上海)电子有限公司 A kind of portable terminal and method that realizes Sign Language Recognition
CN101605158A (en) * 2008-06-13 2009-12-16 鸿富锦精密工业(深圳)有限公司 Mobile phone dedicated for deaf-mutes
CN102236986A (en) * 2010-05-06 2011-11-09 鸿富锦精密工业(深圳)有限公司 Sign language translation system, device and method
CN106295603A (en) * 2016-08-18 2017-01-04 广东技术师范学院 Chinese sign language bidirectional translation system, method and apparatus
CN107133361A (en) * 2017-05-31 2017-09-05 北京小米移动软件有限公司 Gesture identification method, device and terminal device
US20170277684A1 (en) * 2016-03-28 2017-09-28 Avaya Inc. Sign language communication with communication devices


Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109063624A (en) * 2018-07-26 2018-12-21 深圳市漫牛医疗有限公司 Information processing method, system, electronic equipment and computer readable storage medium
CN109583359A (en) * 2018-11-26 2019-04-05 北京小米移动软件有限公司 Presentation content recognition methods, device, electronic equipment, machine readable storage medium
CN109583359B (en) * 2018-11-26 2023-10-24 北京小米移动软件有限公司 Method, apparatus, electronic device, and machine-readable storage medium for recognizing expression content
CN109920309A (en) * 2019-01-16 2019-06-21 深圳壹账通智能科技有限公司 Sign language conversion method, device, storage medium and terminal
CN109960813A (en) * 2019-03-18 2019-07-02 维沃移动通信有限公司 Interpretation method, mobile terminal and computer readable storage medium
CN110032740A (en) * 2019-04-20 2019-07-19 卢劲松 Customized personalized semantic learning and application method
CN110840652A (en) * 2019-11-11 2020-02-28 北京海益同展信息科技有限公司 Wearable device, information processing method and device
CN110931042B (en) * 2019-11-14 2022-08-16 北京欧珀通信有限公司 Simultaneous interpretation method and device, electronic equipment and storage medium
CN110931042A (en) * 2019-11-14 2020-03-27 北京欧珀通信有限公司 Simultaneous interpretation method and device, electronic equipment and storage medium
CN111931523A (en) * 2020-04-26 2020-11-13 永康龙飘传感科技有限公司 Method and system for translating characters and sign language in news broadcast in real time
CN112597912A (en) * 2020-12-26 2021-04-02 中国农业银行股份有限公司 Conference content recording method, device, equipment and storage medium
CN112686132A (en) * 2020-12-28 2021-04-20 南京工程学院 Gesture recognition method and device
CN113220912A (en) * 2021-04-07 2021-08-06 深圳市宝尔爱迪科技有限公司 Interactive assistance method and device and computer readable storage medium
CN113657173A (en) * 2021-07-20 2021-11-16 北京搜狗科技发展有限公司 Data processing method and device and data processing device
CN113449836A (en) * 2021-07-21 2021-09-28 温州亿通自动化设备有限公司 Kowtow counting method and device
CN113449836B (en) * 2021-07-21 2023-06-06 温州亿通自动化设备有限公司 Kowtow counting method and device
CN113851029A (en) * 2021-07-30 2021-12-28 阿里巴巴达摩院(杭州)科技有限公司 Barrier-free communication method and device
CN113851029B (en) * 2021-07-30 2023-09-05 阿里巴巴达摩院(杭州)科技有限公司 Barrier-free communication method and device

Similar Documents

Publication Publication Date Title
CN108268835A (en) Sign language interpretation method, mobile terminal and computer readable storage medium
CN108289244A (en) Video caption processing method, mobile terminal and computer readable storage medium
CN108093123A (en) Message notification processing method, terminal and computer readable storage medium
CN108536481A (en) Application program launching method, mobile terminal and computer storage medium
CN108551411A (en) Data collection method, mobile terminal and computer readable storage medium
CN108762876A (en) Input method switching method, mobile terminal and computer storage medium
CN107748645A (en) Reading method, mobile terminal and computer readable storage medium
CN110321474A (en) Search-term-based recommendation method, device, terminal device and storage medium
CN107633051A (en) Desktop search method, mobile terminal and computer readable storage medium
CN108307043A (en) Voice message conversion method, mobile terminal and computer readable storage medium
CN108492836A (en) Voice-based search method, mobile terminal and storage medium
CN110180181A (en) Screenshot method and device for highlight video moments, and computer readable storage medium
CN108551520A (en) Voice search response method, device and computer readable storage medium
CN108241752A (en) Photo display method, mobile terminal and computer readable storage medium
CN108418948A (en) Reminder method, mobile terminal and computer storage medium
CN108241467A (en) Application combination operating method, mobile terminal and computer readable storage medium
CN108172161A (en) Flexible-screen-based display method, mobile terminal and computer readable storage medium
CN108762631A (en) Mobile terminal control method, mobile terminal and computer readable storage medium
CN108521500A (en) Voice scene control method, device and computer readable storage medium
CN108197206 (en) Sticker pack generation method, mobile terminal and computer readable storage medium
CN109584897A (en) Video noise reduction method, mobile terminal and computer readable storage medium
CN108897846A (en) Information search method, device and computer readable storage medium
CN108769126A (en) Application recommendation method, mobile terminal and computer readable storage medium
CN108319498A (en) Application scenario pushing method, device and computer readable storage medium
CN109445945A (en) Memory allocation method for application programs, mobile terminal, server and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 2018-07-10