CN114138119A - Gesture recognition system and method for mobile phone interconnection split screen projection - Google Patents


Info

Publication number
CN114138119A
CN114138119A
Authority
CN
China
Prior art keywords
mobile phone
gesture
virtual screen
gesture recognition
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111489494.0A
Other languages
Chinese (zh)
Inventor
梁会
冉龙波
卢超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Carbit Information Co ltd
Original Assignee
Wuhan Carbit Information Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Carbit Information Co ltd filed Critical Wuhan Carbit Information Co ltd
Priority to CN202111489494.0A priority Critical patent/CN114138119A/en
Publication of CN114138119A publication Critical patent/CN114138119A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The invention discloses a gesture recognition system for mobile phone interconnection split screen projection. The system comprises a mobile phone and a computer-class terminal: the mobile phone establishes a split screen projection channel with the terminal, and the terminal displays the mobile phone's virtual screen interface. When a single-finger gesture recognized in the gesture recognition area of the mobile phone camera moves, the mobile phone converts the change in its position into a synchronous change in the position of a mouse-style indicator icon on the mobile phone virtual screen; a confirmation gesture recognized in the same area is converted into a click event at the indicator icon's current position on the virtual screen interface. With this arrangement, the otherwise idle camera of a phone that is projecting its screen can recognize the user's gestures and control the projected screen on the computer terminal, improving both user experience and driving safety.

Description

Gesture recognition system and method for mobile phone interconnection split screen projection
Technical Field
The invention relates to the field of computer technology, and in particular to a gesture recognition system and method for mobile phone interconnection split screen projection.
Background
In mobile phone interconnection split screen projection, a mobile phone and a computer-class terminal are connected over a USB cable or WiFi. The phone runs an APP and the terminal runs a corresponding APP, and the two APPs establish a TCP communication channel over this physical link. During interconnection, the phone-side APP creates an invisible virtual screen in the phone system while the main screen locks or displays a lock-screen mask; the content to be mirrored is rendered on the virtual screen, as shown in fig. 2. The phone-side APP captures the virtual screen, video-encodes it, and sends the interface to the computer terminal as a video stream; the terminal-side APP decodes the stream and displays the received video. If the terminal has a touch screen, control actions such as clicks and drags can be performed directly on it: the terminal-side APP forwards each event to the phone-side APP over the interconnection channel, and after coordinate conversion the event is injected into the virtual screen interface created by the phone APP, achieving reverse control.
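The patent does not specify a wire format for the interconnection channel described above; as an illustration only, the TCP channel can be sketched as a length-prefixed frame protocol (the frame layout and type codes here are assumptions):

```python
import struct

# Hypothetical frame layout: 1-byte type + 4-byte big-endian payload length,
# followed by the payload (an encoded video packet or a control event).
FRAME_VIDEO = 0x01   # phone -> terminal: encoded video data
FRAME_TOUCH = 0x02   # terminal -> phone: touch event to inject

def pack_frame(frame_type: int, payload: bytes) -> bytes:
    """Serialize one frame for the TCP interconnection channel."""
    return struct.pack(">BI", frame_type, len(payload)) + payload

def unpack_frame(data: bytes) -> tuple:
    """Parse one frame; returns (type, payload, remaining bytes)."""
    frame_type, length = struct.unpack(">BI", data[:5])
    return frame_type, data[5:5 + length], data[5 + length:]
```

Framing like this lets the receiver split a continuous TCP byte stream back into discrete video packets and touch events.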
With this control mode, while a vehicle is being driven the display screen is far from the driver, who must lean over to perform touch operations; this is neither safe nor convenient.
In addition, if the computer-class terminal is an instrument-cluster terminal without touch capability, the displayed interface cannot be touch-operated at all and can only be viewed, giving a poor user experience.
Disclosure of Invention
The invention aims to provide a gesture recognition system and a gesture recognition method for mobile phone interconnection split screen projection.
To achieve this aim, the gesture recognition system for mobile phone interconnection split screen projection comprises a mobile phone and a computer-class terminal. The mobile phone establishes a split screen projection channel with the computer terminal, the computer terminal displays the mobile phone virtual screen interface, and the mobile phone starts its camera;
the mobile phone further converts the position change of a single-finger gesture recognized in the camera's gesture recognition area, as the gesture moves, into a synchronous position change of a mouse-style indicator icon on the mobile phone virtual screen;
the mobile phone further converts a confirmation gesture recognized in the camera's gesture recognition area into a click event at the indicator icon's current position on the mobile phone virtual screen interface.
The invention has the beneficial effects that:
the invention adds a gesture recognition function on the main screen of the mobile phone, when a user swings the palm left and right on the camera of the mobile phone, moves with a single index finger and operates the OK gesture, the palm is converted into a corresponding control event, and the event is directly transmitted to the interface of the virtual screen. And synchronously displaying mouse indication icons on the virtual screen, so that a user can conveniently know the position of the current coordinate point and operation actions, such as clicking, dragging, double clicking and the like. Through the mobile phone interconnection screen projection technology, the computer terminal can see the interface and the operation process of the virtual screen. The invention utilizes the mobile phone camera to carry out gesture recognition, is placed beside the hand of a user, realizes safe, convenient and quick operation, and improves the driving safety and smoothness.
Drawings
FIG. 1 is a schematic structural view of the present invention;
FIG. 2 is a schematic view of a split screen projection of the present invention;
FIG. 3 is a schematic view of the present invention in use.
In the figures: 1 - mobile phone, 2 - computer terminal.
Detailed Description
The invention is described in further detail below with reference to the figures and specific embodiments:
the system for recognizing the gesture of the interconnected split screen projection of the mobile phone as shown in fig. 1 comprises a mobile phone 1 and a computer terminal 2 (such as a car machine and the like), wherein the mobile phone 1 is used for establishing a split screen projection channel with the computer terminal 2, the computer terminal 2 is used for displaying a mobile phone virtual screen interface of the mobile phone 1, the mobile phone 1 is used for starting a mobile phone camera and displaying a gesture recognition function interface, the mobile phone 1 captures a user gesture operation by calling the mobile phone camera by using a gesture recognition function, and a mouse indication icon is arranged on the mobile phone virtual screen interface and indicates a coordinate position which can be clicked currently, as shown in fig. 2;
the mobile phone 1 is further configured to convert a change in a position of a single-finger Gesture when the single-finger Gesture recognized in the Gesture Recognition area of the mobile phone camera moves into a synchronous change in a position of a mouse-like indication icon on a virtual screen of the mobile phone through a Gesture Recognition SDK (software development kit) such as a space cloud Gesture Recognition software (GR), and when the single-finger Gesture is not recognized, the mouse-like indication icon stays at a current position;
the mobile phone 1 is further configured to convert a confirmation Gesture operation (OK Gesture) recognized in the Gesture Recognition area of the mobile phone camera into a click event of the current position of the mouse type indication icon on the virtual screen interface of the mobile phone through a Gesture Recognition SDK (software development kit) such as a getter Recognition software (GR).
In the above technical solution, the mobile phone 1 is further configured to convert, through the gesture recognition SDK, a cancel gesture (a fist gesture) recognized in the camera's gesture recognition area into cancellation of the current single-finger movement operation.
In the above technical solution, the gesture recognition area of the mobile phone camera is parallel to the screen of the computer terminal 2, so that when a gesture is recognized, its left-right movement direction matches the direction on the screen. This guarantees the accuracy of gesture recognition and of coordinate conversion onto the mobile phone virtual screen interface.
In the above technical solution, the application program of the mobile phone 1 recognizes the current single-finger gesture movement through the camera, determines the control event for the position change as the gesture moves, and converts the coordinates of the recognized single-finger gesture into the coordinates of the mouse-style indicator icon on the mobile phone virtual screen interface by a coordinate transformation, so that the indicator icon changes position synchronously.
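The patent does not give the coordinate transformation explicitly; a minimal sketch, assuming a simple linear mapping from camera-frame coordinates to virtual-screen coordinates (the horizontal mirroring and the dimensions are assumptions, not from the patent), might be:

```python
def camera_to_screen(x_cam, y_cam, cam_w, cam_h, scr_w, scr_h):
    """Map a fingertip position in the camera frame to virtual-screen coordinates.

    Assumes a linear scaling, with the camera image mirrored horizontally
    so that moving the hand to the right moves the cursor to the right.
    """
    x_scr = (cam_w - x_cam) / cam_w * scr_w  # mirror, then scale
    y_scr = y_cam / cam_h * scr_h
    return round(x_scr), round(y_scr)
```

For example, the center of a 640x480 camera frame maps to the center of a 1080x1920 virtual screen.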
In the above technical solution, the initial position of the mouse-style indicator icon on the mobile phone virtual screen interface defaults to the center point of the interface. During gesture operation the user is looking at the virtual screen interface, so the user must be able to see where the indicator icon currently is in order to control it accurately by gesture. When the user moves a single finger up, down, left, or right, the indicator icon moves synchronously according to the position and speed recognized from the gesture, making it easy to click.
In the above technical solution, the mobile phone 1 multiplies the movement distance d of the single-finger gesture recognized in the camera's gesture recognition area by a preset coefficient a, and uses the product as the movement distance of the indicator icon on the mobile phone virtual screen. The coefficient a defaults to 1 and can be adjusted to the actual usage scenario; for example, if the space available for gesture movement is small, the coefficient can be raised to 2.
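The gain coefficient described above can be sketched as follows (the function name, the clamping to screen bounds, and the screen dimensions are illustrative assumptions; only the multiplication by a is from the patent):

```python
def move_cursor(cursor, delta, a=1.0, scr_w=1080, scr_h=1920):
    """Apply a recognized gesture displacement to the indicator icon.

    cursor: current (x, y) of the indicator icon
    delta:  (dx, dy) fingertip movement between frames
    a:      the patent's gain coefficient (default 1)
    The result is clamped to the virtual-screen bounds.
    """
    x = min(max(cursor[0] + delta[0] * a, 0), scr_w - 1)
    y = min(max(cursor[1] + delta[1] * a, 0), scr_h - 1)
    return (x, y)
```

With a=2, a 10-pixel fingertip movement moves the icon 20 pixels, which compensates for a cramped gesture space.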
In the above technical solution, the specific method of converting a confirmation gesture into a click event at the indicator icon's current position on the virtual screen interface of the mobile phone 1 is as follows: the phone application records the position coordinates of the indicator icon on the mobile phone virtual screen interface as produced by the single-finger gesture operation;
when a confirmation gesture is recognized in the camera's gesture recognition area, the phone application takes the recorded indicator-icon coordinates as the click position and injects a click event at that position on the mobile phone virtual screen interface. The indicator icon then shows a corresponding click prompt, such as a click animation.
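The record-then-click behaviour described above can be sketched as a small controller (class and method names are illustrative assumptions, not from the patent):

```python
class GestureCursor:
    """Sketch of the described behaviour: single-finger movement updates a
    recorded cursor position, and a confirmation gesture emits a click
    event at that recorded position."""

    def __init__(self, start=(540, 960)):   # assumed default: screen centre
        self.pos = start
        self.events = []                    # events injected into the virtual screen

    def on_finger_move(self, dx, dy, a=1.0):
        """Single-finger movement: shift the cursor by the scaled delta."""
        self.pos = (self.pos[0] + dx * a, self.pos[1] + dy * a)

    def on_confirm(self):
        """OK gesture: inject a click at the recorded cursor position."""
        self.events.append(("click", self.pos))
```

Keeping the cursor state on the phone side is what lets a gesture made in mid-air resolve to a precise coordinate on the virtual screen.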
In this technical solution, the mobile phone 1 defines a palm waved left and right, recognized in the camera's gesture recognition area, as a drag event and injects it directly into the mobile phone virtual screen interface;
in this embodiment, the application layer is controlled to apply an own interface by using an api, a discrete touch event provided by the Android, so as to implement the above injection click event and the drag event.
In the above technical solution, the mobile phone 1 either displays the gesture recognition interface in place of the lock screen or lock-screen mask, or displays it overlaid on top of them. When the mobile phone 1 is interconnected with the computer terminal 2, the phone runs the application and displays an interconnection-success mask interface as shown in fig. 2. When the user wants to switch to gesture operation, the user taps a button on the mask interface to enable the gesture recognition function, which starts the camera's video capture and uses the gesture recognition module to recognize the user's current gesture, as shown in fig. 3.
In this technical solution, a driving-assistant or driving-companion application is installed on the mobile phone 1 and the corresponding phone-interconnection application is installed on the computer terminal 2; the split screen projection channel between the mobile phone 1 and the computer terminal 2 is realized with the mobile phone interconnection technology.
A gesture recognition method for mobile phone interconnection split screen projection comprises the following steps:
Step 1: a split screen projection channel is established between the mobile phone 1 and the computer terminal 2;
Step 2: the computer terminal 2 displays the virtual screen interface of the mobile phone 1, and the mobile phone 1 starts its camera and displays a gesture recognition interface;
Step 3: the mobile phone 1 converts the position change of a single-finger gesture recognized in the gesture recognition area of the mobile phone camera, as the gesture moves, into a synchronous position change of the mouse-style indicator icon on the mobile phone virtual screen; when no single-finger gesture is recognized, the indicator icon stays at its current position;
the mobile phone 1 converts a confirmation gesture recognized in the gesture recognition area of the mobile phone camera into a click event at the current position of the mouse-style indicator icon on the mobile phone virtual screen interface;
the mobile phone 1 defines a palm waved left and right, recognized in the gesture recognition area of the mobile phone camera, as a drag event, and injects it directly into the mobile phone virtual screen interface to control dragging of the picture on that interface.
Details not described in this specification are well known to those skilled in the art.

Claims (10)

1. A gesture recognition system for mobile phone interconnection split screen projection, comprising a mobile phone (1) and a computer-class terminal (2), the mobile phone (1) being used for establishing a split screen projection channel with the computer-class terminal (2), characterized in that: the computer-class terminal (2) is used for displaying a mobile phone virtual screen interface of the mobile phone (1), and the mobile phone (1) is used for starting a mobile phone camera;
the mobile phone (1) is further used for converting the position change of a single-finger gesture recognized in the gesture recognition area of the mobile phone camera, as the gesture moves, into a synchronous position change of a mouse-style indicator icon on the mobile phone virtual screen;
the mobile phone (1) is further used for converting a confirmation gesture recognized in the gesture recognition area of the mobile phone camera into a click event at the current position of the mouse-style indicator icon on the mobile phone virtual screen interface.
2. The gesture recognition system for mobile phone interconnection split screen projection according to claim 1, characterized in that: the mobile phone (1) is further used for converting a cancel gesture recognized in the gesture recognition area of the mobile phone camera into cancellation of the current single-finger movement operation.
3. The gesture recognition system for mobile phone interconnection split screen projection according to claim 1, characterized in that: the gesture recognition area of the mobile phone camera is parallel to the screen of the computer-class terminal (2).
4. The gesture recognition system for mobile phone interconnection split screen projection according to claim 1, characterized in that: the application program of the mobile phone (1) recognizes the current single-finger gesture movement through the camera, determines the control event for the position change as the gesture moves, and converts the coordinates of the single-finger gesture recognized by the mobile phone (1) into the coordinates of the mouse-style indicator icon on the mobile phone virtual screen interface by a coordinate transformation.
5. The gesture recognition system for mobile phone interconnection split screen projection according to claim 1, characterized in that: the initial position of the mouse-style indicator icon on the mobile phone virtual screen interface defaults to the center point of the mobile phone virtual screen interface.
6. The gesture recognition system for mobile phone interconnection split screen projection according to claim 1, characterized in that: the mobile phone (1) multiplies the movement distance d of the single-finger gesture recognized in the gesture recognition area of the mobile phone camera by a preset coefficient a, and uses the product as the movement distance of the mouse-style indicator icon on the mobile phone virtual screen.
7. The gesture recognition system for mobile phone interconnection split screen projection according to claim 1, characterized in that: the specific method of converting the confirmation gesture into a click event at the current position of the mouse-style indicator icon on the virtual screen interface of the mobile phone (1) is as follows: the mobile phone application records the position coordinates of the mouse-style indicator icon on the mobile phone virtual screen interface as produced by the single-finger gesture operation;
when the confirmation gesture is recognized in the gesture recognition area of the mobile phone camera, the mobile phone application converts the coordinates corresponding to the confirmation gesture into the recorded position coordinates of the indicator icon, and a click event is injected at the position corresponding to those coordinates on the mobile phone virtual screen interface.
8. The gesture recognition system for mobile phone interconnection split screen projection according to claim 1, characterized in that: the mobile phone (1) defines a palm waved left and right, recognized in the gesture recognition area of the mobile phone camera, as a drag event and injects it directly into the mobile phone virtual screen interface.
9. The gesture recognition system for mobile phone interconnection split screen projection according to claim 1, characterized in that: the mobile phone (1) displays the gesture recognition interface either in place of the lock screen or lock-screen mask, or overlaid on top of the lock screen or lock-screen mask.
10. A gesture recognition method for mobile phone interconnection split screen projection, characterized by comprising the following steps:
step 1: establishing a split screen projection channel between the mobile phone (1) and the computer-class terminal (2);
step 2: the computer-class terminal (2) displays the virtual screen interface of the mobile phone (1), and the mobile phone (1) starts the mobile phone camera and displays a gesture recognition interface;
step 3: the mobile phone (1) converts the position change of a single-finger gesture recognized in the gesture recognition area of the mobile phone camera, as the gesture moves, into a synchronous position change of a mouse-style indicator icon on the mobile phone virtual screen;
the mobile phone (1) converts a confirmation gesture recognized in the gesture recognition area of the mobile phone camera into a click event at the current position of the mouse-style indicator icon on the mobile phone virtual screen interface;
the mobile phone (1) defines a palm waved left and right, recognized in the gesture recognition area of the mobile phone camera, as a drag event and injects it directly into the mobile phone virtual screen interface to control dragging of the picture on that interface.
CN202111489494.0A 2021-12-08 2021-12-08 Gesture recognition system and method for mobile phone interconnection split screen projection Pending CN114138119A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111489494.0A CN114138119A (en) 2021-12-08 2021-12-08 Gesture recognition system and method for mobile phone interconnection split screen projection


Publications (1)

Publication Number Publication Date
CN114138119A (en) 2022-03-04

Family

ID=80384837

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111489494.0A Pending CN114138119A (en) 2021-12-08 2021-12-08 Gesture recognition system and method for mobile phone interconnection split screen projection

Country Status (1)

Country Link
CN (1) CN114138119A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102193631A (en) * 2011-05-05 2011-09-21 上海大学 Wearable three-dimensional gesture interaction system and using method thereof
CN104516649A (en) * 2013-09-28 2015-04-15 南京专创知识产权服务有限公司 Intelligent cell phone operating technology based on motion-sensing technology
CN105260028A (en) * 2015-11-11 2016-01-20 武汉卡比特信息有限公司 Method for controlling onboard computer by motion sensing through mobile phone camera
CN108446073A (en) * 2018-03-12 2018-08-24 阿里巴巴集团控股有限公司 A kind of method, apparatus and terminal for simulating mouse action using gesture
CN109696958A (en) * 2018-11-28 2019-04-30 南京华捷艾米软件科技有限公司 A kind of gestural control method and system based on depth transducer gesture identification



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination