CN107918481B - Man-machine interaction method and system based on gesture recognition - Google Patents

Man-machine interaction method and system based on gesture recognition

Info

Publication number
CN107918481B
CN107918481B (application CN201610878812.5A)
Authority
CN
China
Prior art keywords
control
user
area
slider
cursor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610878812.5A
Other languages
Chinese (zh)
Other versions
CN107918481A (en)
Inventor
党建勋
刘津甦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Qiaoniu Technology Co ltd
Original Assignee
Shenzhen Qiaoniu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Qiaoniu Technology Co ltd filed Critical Shenzhen Qiaoniu Technology Co ltd
Priority to CN201610878812.5A priority Critical patent/CN107918481B/en
Publication of CN107918481A publication Critical patent/CN107918481A/en
Application granted granted Critical
Publication of CN107918481B publication Critical patent/CN107918481B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A human-machine interaction method and system based on gesture recognition are disclosed. In the method, a control recognizes a user's command based on the user's gesture information. The control comprises an entry area and a rail area; the entry area divides the user interface into a first part and a second part, the first part not including the rail area and the second part including the rail area. The method comprises the following steps: in response to a cursor moving from the first part of the user interface, through the entry area, into the second part of the user interface, the control enters an activated state and displays a slider in the rail area of the control; and the control generates a first event in response to the slider moving out of the rail area from a first end of the rail area.

Description

Man-machine interaction method and system based on gesture recognition
Technical Field
The invention relates to the field of human-computer interaction, and in particular to a method and a system for human-computer interaction using a gesture-recognition-based control.
Background
In human-computer interaction technology, a control is a reusable software component used to build a graphical user interface. Generally, one control corresponds to one function. For example, FIG. 1 illustrates a 'confirm' control in a two-dimensional graphical user interface. The 'confirm' control comprises a prompt window containing a 'confirm' button and a 'cancel' button. When the 'confirm' control is invoked, the prompt window shown in FIG. 1 pops up, and the user's click on the 'confirm' button or the 'cancel' button is recognized to obtain the user's operational intent, thereby realizing human-machine interaction. A slide-to-unlock technique in the related art informs an information processing apparatus of the user's input intent through the user sliding a hand on a touch screen.
Novel human-computer interaction technologies are also being developed continuously, and human-computer interaction based on gesture recognition is one of the current hotspots. Recognition of hand movements can be achieved in a number of ways. US20100199228A1 from Microsoft Corporation (published August 5, 2010) provides a solution in which a user's body postures are captured with a depth camera, analyzed, and interpreted as computer commands. US20080291160A1 from Nintendo Corporation (published November 27, 2008) provides a solution for capturing the position of a user's hand using an infrared sensor and an acceleration sensor. CN1276572A from Panasonic Electric Products Corporation proposes photographing a hand with a camera, normalizing the image, spatially projecting the normalized image, and comparing the resulting projection coordinates with the projection coordinates of images stored in advance. FIG. 2 shows a system and method for gesture recognition and spatial position perception provided in patent application CN201110100532.9 from Tianjinfeng Interactive Technology Ltd. As shown in FIG. 2, the gesture recognition system includes: a host computer 101, a control circuit 102 of a multi-camera system, a plurality of cameras 103, the user's hand 104, an application 105 running on the host computer 101, an operated object 106 within the application 105, and a virtual hand cursor 107. The gesture recognition system also includes an infrared illumination source (not shown in FIG. 2) for illuminating the user's hand 104, and an infrared filter placed in front of each camera. The cameras 103 capture images of the user's hand 104, and the control circuit 102 processes the hand images captured by the cameras 103 and identifies the pose and/or position of the hand. In addition, there are also prior-art approaches that use data gloves to assist in recognizing hand gestures.
Disclosure of Invention
In gesture-recognition-based human-computer interaction, controls need to be designed to support the development of applications. A control takes gestures as input and produces events or messages as output. The event or message may indicate the user's operational intent to 'confirm' or 'cancel', or indicate user intents with a variety of other meanings. Moreover, existing human-computer interaction technology has difficulty interpreting the intent behind gesture input, because human physiology means that the trajectory of the user's hand in the interaction space cannot be perfectly straight or standardized.
According to a first aspect of the present invention, a first gesture-recognition-based human-computer interaction method is provided, wherein a control recognizes a command of a user based on gesture information of the user; the control includes an entry area and a rail area, the entry area divides a user interface into a first part and a second part, the first part does not include the rail area, and the second part includes the rail area. The method comprises the following steps: in response to a cursor moving from the first part of the user interface through the entry area to the second part of the user interface, the control enters an activated state and displays a slider in the rail area of the control; and the control generates a first event in response to the slider moving out of the rail area from a first end of the rail area.
According to the first gesture-recognition-based human-computer interaction method of the first aspect of the invention, there is provided a second gesture-recognition-based human-computer interaction method of the first aspect of the invention, comprising: the control generates a second event in response to the slider moving out of the rail area from the second end of the rail area.
According to the aforementioned human-computer interaction methods of the first aspect of the invention, there is provided a third gesture-recognition-based human-computer interaction method of the first aspect of the invention, comprising: the control enters an inactive state in response to the slider moving out of the rail area from the first end or the second end of the rail area.
According to the aforementioned human-computer interaction methods of the first aspect of the invention, there is provided a fourth human-computer interaction method of the first aspect of the invention, comprising: initializing the control, whereupon the control enters the inactive state.
According to the aforementioned human-computer interaction methods of the first aspect of the invention, there is provided a fifth human-computer interaction method of the first aspect of the invention, wherein the first end is a portion of the rail area near the entry area.
According to the second gesture-recognition-based human-computer interaction method of the first aspect of the invention, there is provided a sixth gesture-recognition-based human-computer interaction method of the first aspect of the invention, wherein the second end is a portion of the rail area away from the entry area.
According to the aforementioned human-computer interaction methods of the first aspect of the invention, there is provided a seventh human-computer interaction method of the first aspect of the invention, comprising: displaying a cursor on the user interface according to gesture information, the gesture information indicating a position and/or posture of the user's hand extracted from an image of the user's hand captured by an image capture device.
According to the aforementioned human-computer interaction methods of the first aspect of the invention, there is provided an eighth human-computer interaction method of the first aspect of the invention, comprising: drawing the slider in the rail area according to the projection of the cursor position onto the midline of the rail area.
According to the aforementioned human-computer interaction methods of the first aspect of the invention, there is provided a ninth human-computer interaction method of the first aspect of the invention, comprising: in response to the control entering the activated state, hiding the cursor on the user interface and changing the appearance of the control to prompt the user that the control has entered the activated state.
According to the aforementioned human-computer interaction methods of the first aspect of the invention, there is provided a tenth human-computer interaction method of the first aspect of the invention, comprising: in response to the control entering the inactive state, displaying a cursor on the user interface according to the position indicated by the gesture information.
According to the aforementioned human-computer interaction methods of the first aspect of the invention, there is provided an eleventh human-computer interaction method of the first aspect of the invention, comprising: hiding the slider in response to the slider moving out of the rail area.
According to the aforementioned human-computer interaction methods of the first aspect of the invention, there is provided a twelfth human-computer interaction method of the first aspect of the invention, comprising: in the activated state of the control, in response to a 'grab' action indicated by the gesture information, fixing the cursor to the slider and drawing the slider according to the gesture information.
According to the first gesture-recognition-based human-computer interaction method of the first aspect of the invention, there is provided a thirteenth human-computer interaction method of the first aspect of the invention, wherein the rail area has a plurality of first ends.
According to the second gesture-recognition-based human-computer interaction method of the first aspect of the invention, there is provided a fourteenth human-computer interaction method of the first aspect of the invention, wherein the rail area has a plurality of second ends.
According to a second aspect of the present invention, there is provided a first gesture-recognition-based human-computer interaction device, wherein a control recognizes a command of a user based on gesture information of the user; the control includes an entry area and a rail area, the entry area divides the user interface into a first part and a second part, the first part does not include the rail area, and the second part includes the rail area. The device comprises: an activation module for, in response to a cursor moving from the first part of the user interface through the entry area to the second part of the user interface, causing the control to enter an activated state and displaying a slider in the rail area of the control; and an event generation module that generates a first event in response to the slider moving out of the rail area from the first end of the rail area.
According to the first gesture-recognition-based human-computer interaction device of the second aspect of the invention, there is provided a second human-computer interaction device of the second aspect of the invention, comprising: a second event generation module to generate a second event in response to the slider moving out of the rail area from the second end of the rail area.
According to the aforementioned human-computer interaction devices of the second aspect of the invention, there is provided a third human-computer interaction device of the second aspect of the invention, wherein the control enters an inactive state in response to the slider moving out of the rail area from the first end or the second end of the rail area.
According to the aforementioned human-computer interaction devices of the second aspect of the invention, there is provided a fourth human-computer interaction device of the second aspect of the invention, comprising: an initialization module for initializing the control, whereupon the control enters the inactive state.
According to the aforementioned human-computer interaction devices of the second aspect of the invention, there is provided a fifth human-computer interaction device of the second aspect of the invention, wherein the second end is a portion of the rail area near the entry area.
According to the second human-computer interaction device of the second aspect of the invention, there is provided a sixth human-computer interaction device of the second aspect of the invention, wherein the first end is a portion of the rail area away from the entry area.
According to the aforementioned human-computer interaction devices of the second aspect of the invention, there is provided a seventh human-computer interaction device of the second aspect of the invention, further comprising: a module for displaying a cursor on the user interface according to gesture information, the gesture information indicating a position and/or posture of the user's hand extracted from an image of the user's hand captured by an image capture device.
According to the aforementioned human-computer interaction devices of the second aspect of the invention, there is provided an eighth human-computer interaction device of the second aspect of the invention, comprising: a slider drawing module for drawing the slider in the rail area according to the projection of the cursor position onto the midline of the rail area.
According to the aforementioned human-computer interaction devices of the second aspect of the invention, there is provided a ninth human-computer interaction device of the second aspect of the invention, comprising: an appearance changing module for, in response to the control entering the activated state, hiding the cursor on the user interface, playing a specified sound, displaying specified text, and/or providing mechanical feedback to prompt the user that the control has entered the activated state.
According to the aforementioned human-computer interaction devices of the second aspect of the invention, there is provided a tenth human-computer interaction device of the second aspect of the invention, comprising: a cursor display module for, in response to the control entering the inactive state, displaying a cursor on the user interface according to the position indicated by the gesture information, playing a specified sound, displaying specified text, and/or providing mechanical feedback.
According to the aforementioned human-computer interaction devices of the second aspect of the invention, there is provided an eleventh human-computer interaction device of the second aspect of the invention, comprising: a slider concealment module to conceal the slider in response to the slider moving out of the rail area.
According to the aforementioned human-computer interaction devices of the second aspect of the invention, there is provided a twelfth human-computer interaction device of the second aspect of the invention, comprising: a cursor fixing module for, in the activated state of the control, in response to a 'grab' action indicated by the gesture information, fixing the cursor to the slider and drawing the slider according to the gesture information.
According to the first human-computer interaction device of the second aspect of the invention, there is provided a thirteenth human-computer interaction device of the second aspect of the invention, wherein the rail area has a plurality of first ends.
According to the second human-computer interaction device of the second aspect of the invention, there is provided a fourteenth human-computer interaction device of the second aspect of the invention, wherein the rail area has a plurality of second ends.
According to a third aspect of the present invention, there is provided an information processing apparatus comprising a processor, a memory, and a display device, the information processing apparatus further being coupled to a gesture recognition apparatus and receiving gesture information provided by the gesture recognition apparatus; the memory stores a program, and the processor executes the program to cause the information processing apparatus to perform one of the aforementioned human-computer interaction methods according to the first aspect of the present invention.
According to a fourth aspect of the present invention, there is provided a computer program which, when executed by a processor of an information processing apparatus, causes the information processing apparatus to perform one of the aforementioned plurality of human-computer interaction methods according to the first aspect of the present invention.
Drawings
The invention, as well as a preferred mode of use and further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying drawings, wherein:
FIG. 1 illustrates a prior art "confirm" control for a two-dimensional graphical user interface;
FIG. 2 is a schematic diagram of a gesture recognition system in the prior art;
FIG. 3 is a block diagram of a human-computer interaction system based on gesture recognition according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a gesture recognition based control in a two-dimensional user interface in accordance with an embodiment of the present invention;
FIGS. 5A-5D are schematic diagrams of multiple states of a gesture recognition based control in a two-dimensional user interface, according to embodiments of the invention;
FIG. 6 is a flow chart of a method for human-computer interaction based on gesture recognition in a two-dimensional user interface according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of a gesture recognition based control in a three-dimensional user interface in accordance with an embodiment of the present invention;
FIGS. 8A-8D are schematic diagrams of multiple states of a gesture recognition based control in a three-dimensional user interface, according to embodiments of the present invention; and
FIG. 9 is a block diagram of an information processing apparatus implementing an embodiment of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention. On the contrary, the embodiments of the invention include all changes, modifications and equivalents coming within the spirit and terms of the claims appended hereto.
In the description of the present invention, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In the description of the present invention, it should also be noted that, unless otherwise explicitly specified or limited, the terms "connected" and "coupled" are to be interpreted broadly: a connection may be fixed, detachable, or integral; it may be mechanical or electrical; and it may be direct or indirect through an intermediary. The specific meanings of the above terms in the present invention can be understood by those of ordinary skill in the art in the specific circumstances. In addition, in the description of the present invention, "a plurality" means two or more unless otherwise specified.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
FIG. 3 is a block diagram of a human-computer interaction system based on gesture recognition according to an embodiment of the present invention. The human-computer interaction system according to an embodiment of the present invention includes a gesture input device 310, an information processing device 320, and a display device 330 coupled to each other. In one example, the gesture input device 310 is configured to capture an image of the user's hand and send the captured image to the information processing device for processing. The information processing device 320 receives the hand images sent by the gesture input device and recognizes the gesture information of the user's hand in the images. The information processing device 320 also presents graphics and/or images to the user through the display device 330, such as drawing a virtual image of the user's hand on the display device 330. The information processing device may be, for example, a computer, a mobile phone, or a dedicated gesture recognition device. The display device 330 may be, for example, a flat panel display, a projector, or a head-mounted display.
In another example, the gesture input device 310 senses the position and/or posture of the user's hand, recognizes gesture information of the user's hand, and transmits the user's hand information to the information processing device 320. The information processing device 320 recognizes the user hand information provided by the gesture input device 310 as an input provided by the user and provides an output to the user through the display device 330 to enable human-computer interaction. Obviously, the information processing device 320 may also interact with the user through sound, mechanical action, and the like.
As still another example, the gesture input device 310 may also be, for example, a depth sensor, a distance sensor, a VR controller (such as an Oculus Rift Touch), a gamepad, a data glove (such as CyberGlove), a motion capture system (such as OptiTracker), a gyroscope, etc., for sensing the position and/or pose of a user's hand.
Gesture information (i) based on a virtual coordinate system is extracted from gestures and/or actions performed by the user in the real world. The gesture information (i) may be a vector, formally expressed as i = {C, palm, thumb, index, mid, ring, little}, where C denotes the hand type of the whole hand (for example, a fist, an open hand with five fingers extended, or a victory gesture), palm denotes position information of the palm, and thumb, index, mid, ring, and little denote position information and/or orientation information of the thumb, index finger, middle finger, ring finger, and little finger, respectively. The virtual coordinate system expresses position information in the virtual world constructed by the information processing apparatus 320, while position information of objects or space in the real world is expressed in a real coordinate system. The virtual world constructed by the information processing apparatus 320 may be, for example, the two-dimensional space of a two-dimensional graphical user interface, a three-dimensional space, or a virtual reality scene into which the user is fused. The real coordinate system and the virtual coordinate system may each be a two-dimensional or three-dimensional coordinate system. The gesture information (i) may be updated at a certain frequency or time interval, or whenever the position and/or posture of the user's hand changes.
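As an illustrative sketch only, the gesture information vector i described above could be held in a simple record type; the Python field and type names below are assumptions made for illustration and are not part of the patent text.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

Vec = Tuple[float, float, float]  # a position in the virtual (or real) coordinate system


@dataclass
class Finger:
    position: Vec                      # fingertip position
    orientation: Optional[Vec] = None  # optional pointing direction


@dataclass
class GestureInfo:
    """Mirrors i = {C, palm, thumb, index, mid, ring, little}."""
    c: str          # hand type, e.g. "fist", "open_palm", "victory"
    palm: Vec       # palm position
    thumb: Finger
    index: Finger
    mid: Finger
    ring: Finger
    little: Finger
```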
On the user interface, a cursor may be displayed to provide a visual response to the user in accordance with the gesture information (i). The position of the cursor on the graphical interface may be represented as a function of the gesture information (i), e.g., func_a(i). Those skilled in the art will appreciate that the function func_a may vary according to different application scenarios or settings.
For example, in a two-dimensional user interface, the position at which the cursor is to be drawn is calculated by equation (1):
Func_a(i) = C*0 + palm*0 + index.position*0.5 + mid*0 + little*0    (1)
In equation (1), index.position refers to the position of the user's index finger. Thus, as equation (1) shows, the position of the cursor on the user interface depends only on the position of the user's index finger, and the distance the cursor moves on the user interface is half the distance the user's index finger moves.
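A minimal sketch of equation (1), assuming the GestureInfo record sketched earlier and a two-dimensional interface; only the index-finger term is non-zero, and the 0.5 scale factor is the one stated in the text.

```python
def func_a(i):
    """Equation (1): the cursor position depends only on the index fingertip,
    scaled by 0.5, so the cursor moves half the distance the finger moves."""
    x, y = i.index.position[0], i.index.position[1]  # take the 2D components
    return (0.5 * x, 0.5 * y)
```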
The cursor may have a single style, such as a hand shape. The cursor may also have multiple styles corresponding to different hand styles.
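If multiple cursor styles are used, the choice can be keyed on the hand-type field C of the gesture information; the style names below are illustrative assumptions rather than names used by the patent.

```python
# Illustrative mapping from the hand type C to a cursor style.
CURSOR_STYLES = {
    "fist": "closed_hand_cursor",
    "open_palm": "open_hand_cursor",
    "victory": "v_sign_cursor",
}


def cursor_style(i, default="hand_cursor"):
    """Pick a cursor style for the current hand type, falling back to a single default."""
    return CURSOR_STYLES.get(i.c, default)
```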
With reference to FIGS. 4-6, the following describes how a control can be operated by gestures in a two-dimensional user interface.
FIG. 4 is a schematic diagram of a gesture recognition based control in a two-dimensional user interface according to an embodiment of the present invention. Referring to FIG. 4, a gesture recognition based control in a two-dimensional user interface according to an embodiment of the present invention includes an entry area and a rail area. The entry area in FIG. 4 is a line segment; in another example, the entry area may be a curve. The entry area divides the two-dimensional plane of the user interface into two parts: the side containing the rail area is called the rail side, and the other side is called the free side. In FIG. 4, the rail area is rectangular; in other examples, the rail area may have other shapes, such as a line segment, a triangle, or an ellipse. The entry area and the rail area may be rendered on the two-dimensional user interface to show the user where the control is located. In another example, the entry area and/or the rail area may be hidden so as not to interfere with what is presented on the user interface. The rail area is adjacent or contiguous to the entry area. The portion of the rail area near the entry area is referred to as the entry end, and the portion of the rail area away from the entry area is referred to as the exit end. In another example, to make the hand operation easier for the user to recognize, the entry area and the rail area of the control are drawn as a notch or bell-mouth shape, which helps guide the user to move the cursor into the rail area with a gesture.
In the example of fig. 4, the rail area also includes a slider. The slider is movable along the rail area.
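One possible way to realise the geometry just described (a line-segment entry area dividing the 2D plane, plus a rectangular rail area) is a cross-product side test and a point-in-rectangle test; the function and parameter names are illustrative assumptions, not part of the disclosure.

```python
def side_of_entry(p, a, b):
    """Which side of the line through entry-segment endpoints a and b the point p lies on.
    Returns +1 or -1 for the two half-planes (which one is the rail side is a layout
    convention) and 0 if p is exactly on the line."""
    cross = (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])
    return (cross > 0) - (cross < 0)


def in_rail_area(p, rect):
    """Point-in-rectangle test for an axis-aligned rail area (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = rect
    return xmin <= p[0] <= xmax and ymin <= p[1] <= ymax
```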
FIGS. 5A-5D are schematic diagrams of multiple states of a gesture recognition based control in a two-dimensional user interface, according to embodiments of the invention.
The gesture recognition based control has an active state and an inactive state. The inactive state is the initial state of the control. FIG. 5A illustrates the control in the inactive state, together with a cursor associated with gesture information (i). Note that in FIG. 5A the slider is not drawn in the rail area (it is hidden). The absence of a slider in the rail area may serve as a cue informing the user that the control is in the inactive state.
When the user's gesture causes the cursor to enter the rail side from the free side through the entry area, the control transitions from the inactive state to the active state. The control receives an event carrying gesture information (i) and recognizes the change in cursor position caused by the gesture information (i). When the gesture information (i) causes the cursor position to enter the rail side from the free side of the control via the entry area, the control state changes to the activated state, and the control is drawn in its activated form.
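Building on the side_of_entry sketch above, the activation condition (crossing the entry segment from the free side rather than merely ending up on the rail side) could be checked on each cursor update as follows; degenerate cases such as a cursor path collinear with the entry segment are ignored, and all names are assumptions.

```python
def _ccw(a, b, c):
    """Twice the signed area of triangle abc; its sign gives the turn direction."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])


def crossed_entry(prev_p, p, entry_a, entry_b):
    """True if the cursor path prev_p -> p passes through the entry segment itself.
    Crossing the line's extension outside the segment does not count, which matches
    the behaviour that bypassing the entry area must not activate the control."""
    d1, d2 = _ccw(entry_a, entry_b, prev_p), _ccw(entry_a, entry_b, p)
    d3, d4 = _ccw(prev_p, p, entry_a), _ccw(prev_p, p, entry_b)
    return (d1 > 0) != (d2 > 0) and (d3 > 0) != (d4 > 0)


def update_state(state, prev_cursor, cursor, entry_a, entry_b, free_sign=-1):
    """Inactive -> active only when the path starts on the free side (free_sign is the
    sign convention chosen for that side) and actually crosses the entry segment."""
    if (state == "inactive"
            and _ccw(entry_a, entry_b, prev_cursor) * free_sign > 0
            and crossed_entry(prev_cursor, cursor, entry_a, entry_b)):
        return "active"
    return state
```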
Optionally, when the user's gesture causes the cursor to enter the rail side from the free side through the entry area, a slider is drawn in the rail area and the cursor is still displayed; the user moves the cursor over the slider by hand, and with a 'grab' action the cursor is fixed to the slider, after which the slider follows the cursor's movement. The 'grab' action is not required: in one embodiment, the slider follows the cursor while the control is in the active state, or the control changes the slider position directly based on the gesture information (i). Still optionally, a specified sound is played, the visual presentation is changed, and/or mechanical feedback is provided to the user as the slider moves. For example, the sound played becomes louder and/or higher in frequency as the slider moves toward the exit end, and becomes softer and/or lower in frequency as the slider moves toward the entry end.
FIG. 5B illustrates the control in the activated state. In the activated state, the rail area contains the slider. The slider is associated with gesture information (i) to give the user a cue about the hand position, and the cursor is hidden. By way of example, the drawn position of the slider is determined according to the same rules that determine the drawn position of the cursor, but the slider's position is confined to the rail area, so that the slider appears to move along the rail. In the example of FIG. 5B, the rail area includes a midline. The projection of the (undrawn) cursor position onto the midline is the position at which the slider is drawn in the rail area. Optionally, the appearance of the control is changed to indicate that the control has been activated and has entered the activated state, for example by drawing a shadow along the edge of the control, changing the color of the control area, and/or displaying specified text. Still optionally, the user is prompted by mechanical feedback and/or a specified sound that the control has entered the activated state.
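The slider placement just described (projecting the hidden cursor onto the midline and keeping the result inside the rail) could be sketched as follows; because it relies only on dot products, the same code also works for the three-dimensional rail line of FIG. 8B. The names and the clamping detail are assumptions consistent with the text.

```python
def slider_position(cursor, end_a, end_b):
    """Project the (hidden) cursor onto the midline from end_a to end_b and clamp the
    result to the segment, so that the slider appears to travel along the rail."""
    d = [b - a for a, b in zip(end_a, end_b)]
    length_sq = sum(di * di for di in d) or 1.0  # guard against a degenerate rail
    t = sum((c - a) * di for c, a, di in zip(cursor, end_a, d)) / length_sq
    t = max(0.0, min(1.0, t))                    # keep the slider inside the rail
    return tuple(a + t * di for a, di in zip(end_a, d))
```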
If the user's gesture causes the cursor to enter the rail area from the rail side, or causes the cursor to bypass the entry area and enter the rail side, a control in the inactive state will not change into the active state; the slider is not displayed, and the cursor remains displayed.
In an embodiment according to the invention, if the user moves the slider out of the rail from the exit end of the rail area by a gesture, this indicates a 'confirm' command to the control on behalf of the user; if the user moves the slider out of the rail from the entry end of the rail area by a gesture, this indicates a 'cancel' command to the control on behalf of the user.
FIG. 5C illustrates a control receiving a 'confirm' command. With the control activated, as the user moves the index finger to the right, for example, the position of the slider (shown as a dashed pattern in FIG. 5C) correspondingly moves to the right along the rail area, providing timely visual feedback so the user knows whether the control has correctly recognized the intent. When the control detects that the slider has moved out of the rail area from the exit end as the user's index finger moves, the control generates an event representing a 'confirm' command. By processing the event representing the 'confirm' command, the operation associated with the control is confirmed. As the slider moves out of the rail area, the slider is hidden; for example, the slider is drawn only within the rail area, while the portion of the slider outside the rail area is hidden. Optionally, the appearance of the control is changed to indicate that the control has recognized the user's intent and generated a 'confirm' event, for example by flashing the control area, changing the color of the control area, and/or displaying specified text. Still optionally, mechanical feedback is provided to the user and/or a specified sound is played. Further, as the slider moves out of the rail area, the control transitions to the inactive state and a cursor is drawn to track the user's gesture.
FIG. 5D illustrates a control receiving a 'cancel' command. With the control activated, as the user moves the index finger to the left, for example, the position of the slider (shown as a dashed pattern in FIG. 5D) correspondingly moves to the left along the rail area, providing visual feedback so the user knows whether the control has correctly recognized the intent. When the control detects that the slider has moved out of the rail area from the entry end as the user's index finger moves, the control generates an event representing a 'cancel' command, or generates no event, indicating that the user has not given a 'confirm' indication, or that the user has abandoned or cancelled the original attempt. By processing the event representing the 'cancel' command, the operation associated with the control is cancelled or ignored. The slider is hidden as it moves out of the rail area. Optionally, the appearance of the control is changed and/or mechanical feedback is provided to indicate that the control has recognized the user's intent and generated a 'cancel' event. Further, as the slider moves out of the rail area, the control transitions to the inactive state and a cursor is drawn to track the user's gesture.
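Reduced to the projection parameter along the rail axis, the confirm/cancel decision of FIGS. 5C and 5D could look like the sketch below; a fuller implementation would also check that the slider left through an end rather than sideways, and the event names follow this particular embodiment only.

```python
def rail_exit_event(cursor, entry_end, exit_end):
    """Map the cursor's unclamped position along the rail axis to an event.
    t < 0 means the slider was dragged past the entry end, t > 1 past the exit end."""
    d = [b - a for a, b in zip(entry_end, exit_end)]
    denom = sum(di * di for di in d) or 1.0  # guard against a degenerate rail
    t = sum((c - a) * di for c, a, di in zip(cursor, entry_end, d)) / denom
    if t > 1.0:
        return "confirm"  # slider left the rail from the exit end
    if t < 0.0:
        return "cancel"   # slider left the rail from the entry end
    return None           # slider is still on the rail; no event
```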
In another embodiment according to the invention, the rail area has a cross shape. In the active state, the control generates a 'confirm' event when the slider moves out of the rail from the right or from above, and a 'cancel' event when the slider moves out of the rail from the left or from below.
Those skilled in the art will appreciate that the rail area may have multiple exits. When the slider is moved out through some of the exits, the control generates a 'confirm' event; when the slider is moved out through other exits, the control generates a 'cancel' event. To convey the different meanings of the exits, a different indication may be provided to the user in each exit direction through visual, audio, and/or mechanical feedback.
In yet another embodiment according to the invention, the rail area has a cross shape and a plurality of exits, one for each branch of the cross, and each exit indicates a different meaning (or several exits indicate several meanings). For example, when the slider is moved out of the upper exit, the control generates a 'cancel' event indicating that the user has abandoned an attempt to play music; when the slider is moved out of the right exit, the control generates a 'mute' event indicating that the user wishes to drop the volume of an application's audio output to 0 immediately; when the slider is moved out of the left exit, the control generates a 'set to high sample rate' event; and when the slider is moved out of the lower exit, the control generates a 'set to low sample rate' event. In response to receiving the events of different meanings or different commands output by the control, the application performs the corresponding processing.
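For a rail area with several exits, the exit-to-event association can simply be a lookup table; the exit identifiers, event strings and dispatch callback below are illustrative assumptions, not names used by the patent.

```python
# Illustrative mapping from rail exits to the events named in this embodiment.
EXIT_EVENTS = {
    "exit_1": "cancel",                  # e.g. abandon the attempt to play music
    "exit_2": "mute",                    # drop the application's audio volume to 0
    "exit_3": "set_to_high_sample_rate",
    "exit_4": "set_to_low_sample_rate",
}


def on_slider_exit(exit_id, dispatch):
    """Emit the event associated with the exit the slider left through, if any."""
    event = EXIT_EVENTS.get(exit_id)
    if event is not None:
        dispatch(event)
```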
FIG. 6 is a flowchart of a method for human-computer interaction based on gesture recognition in a two-dimensional user interface according to an embodiment of the present invention. To use a control according to an embodiment of the present invention, the control is initialized (610). The control initialization process includes drawing the control on a user interface, for example as shown in FIG. 5A, and enabling the control to receive gesture information (i). Optionally, a cursor is also drawn on the user interface, the drawn position of the cursor being associated with the gesture information (i). In another example, the cursor is drawn by the application using the control or by another program. The control receives the gesture information (i) and obtains the cursor position from it. In response to the cursor moving from the free side through the entry area to the rail side, the control enters the active state (620). FIG. 5B illustrates the control in the active state. Optionally, when the control enters the active state, the control also changes its appearance, produces a specified sound, and/or provides mechanical feedback to prompt the user that the control has entered the active state. The control also draws a slider in the rail area. The drawn position of the slider is confined to the rail area, so that the slider appears to move along the rail, and the slider moves to follow the user's gesture. By way of example, the position of the slider is determined according to the same rules that determine the position of the cursor; further, the projection of the cursor position onto the midline of the rail area is used as the position at which the slider is drawn.
In the activated state, the control obtains the slider position from the gesture information (i). The control detects whether the slider has moved out of the rail area from one of its ends (640). Referring to FIG. 5C, when the control detects that the slider has moved out of the rail area from the exit end, the control generates a first event (650); as examples, the first event may be a 'confirm' event, a 'mute' event, and so on. Referring to FIG. 5D, when the control detects that the slider has moved out of the rail area from the entry end, the control generates a second event (660); the second event may be a 'cancel' event, a 'set to high sample rate' event, and so on.
In step 650, as the first event is generated, the control enters the inactive state. Optionally, the appearance of the control is changed, a specified sound is produced, and/or mechanical feedback is provided to indicate that the control has recognized the user's intent and generated the first event. Optionally, a cursor is drawn to track the user's gesture.
In step 660, as the second event is generated, the control enters the inactive state. Optionally, the appearance of the control is changed, a specified sound is produced, and/or mechanical feedback is provided to indicate that the control has recognized the user's intent and generated the second event. Optionally, a cursor is drawn to track the user's gesture.
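Tying the earlier sketches together, one per-update step of the control could follow the flow of FIG. 6 roughly as below; func_a, update_state, rail_exit_event and slider_position are the helpers sketched earlier, the emit callback and attribute names are assumptions, and the step numbers in the comments refer to the flowchart.

```python
class SlideControl:
    """A rough sketch of the control's flow in FIG. 6; not the patent's own code."""

    def __init__(self, entry_a, entry_b, rail_entry_end, rail_exit_end, emit):
        self.entry_a, self.entry_b = entry_a, entry_b
        self.rail_entry_end, self.rail_exit_end = rail_entry_end, rail_exit_end
        self.emit = emit                  # callback receiving "confirm", "cancel", ...
        self.state = "inactive"           # 610: the control starts in the inactive state
        self.prev_cursor = None
        self.slider = None

    def step(self, i):
        cursor = func_a(i)                # cursor position derived from gesture info (i)
        if self.state == "inactive":
            if self.prev_cursor is not None:
                # 620: activate when the cursor crosses the entry area from the free side
                self.state = update_state(self.state, self.prev_cursor, cursor,
                                          self.entry_a, self.entry_b)
        else:
            event = rail_exit_event(cursor, self.rail_entry_end, self.rail_exit_end)
            if event is not None:         # 640: the slider has left the rail area
                self.emit(event)          # 650 / 660: first or second event
                self.state = "inactive"   # the control returns to the inactive state
                self.slider = None        # and the slider is hidden
            else:
                self.slider = slider_position(cursor, self.rail_entry_end,
                                              self.rail_exit_end)
        self.prev_cursor = cursor
```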
In a further embodiment, the user creates a space in the virtual world according to an embodiment of the invention, and/or sets or changes the location of the control. The user may place the control in a location that is convenient to operate, for example where the cursor lands when the user's arm is fully extended to the side. In this way the user can conveniently indicate commands such as 'confirm'/'cancel' without affecting the operation of other objects in the virtual world.
Embodiments of gesture recognition based controls in a three-dimensional user interface according to the present invention are described in detail below in conjunction with FIGS. 7-8. FIG. 7 is a schematic diagram of a gesture recognition based control in a three-dimensional user interface according to an embodiment of the present invention. Referring to FIG. 7, such a control includes an entry area and a rail area. The entry area in FIG. 7 is a finite rectangular plane; in another example, the entry area may be a region enclosed by a closed curve on a curved or planar surface. In FIG. 7, the plane of the entry area divides the three-dimensional space of the user interface into two parts: the side containing the rail area is referred to as the rail side, and the other side as the free side. In FIG. 7, the rail area is a rectangular parallelepiped; in other examples the rail area may have other shapes, such as a cylinder, a sphere, or an ellipsoid. The entry area and the rail area may be rendered in the three-dimensional user interface to show the user where the control is located. The rail area may also be fused with an object of the three-dimensional user interface, such as a vase or a mailbox in the user interface. In another example, the entry area and/or the rail area may be hidden so as not to interfere with what is presented on the user interface. The rail area is adjacent or contiguous to the entry area. The portion of the rail area near the entry area is referred to as the entry end, while the portion of the rail area away from the entry area is referred to as the exit end.
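In the three-dimensional case the entry area is a bounded plane, so the rail-side/free-side decision becomes the sign of a signed distance to that plane. The sketch below assumes the plane is given by a point and a normal; a full implementation would additionally test that the crossing point falls inside the finite rectangle.

```python
def side_of_entry_plane(p, plane_point, plane_normal):
    """Sign of the signed distance from point p to the plane of the 3D entry area:
    +1 and -1 distinguish the rail side from the free side (a layout convention),
    and 0 means p lies exactly on the plane."""
    d = sum((pi - qi) * ni for pi, qi, ni in zip(p, plane_point, plane_normal))
    return (d > 0) - (d < 0)
```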
In the example of FIG. 7, the rail area also includes a slider, which is movable along the rail area. Optionally, the control further includes a rail line; in FIG. 7 the rail line is the center line along the long axis of the rectangular parallelepiped of the rail area. One end of the rail line lies on the entry area, and the slider moves along the rail line.
FIGS. 8A-8D are diagrams of various states of a gesture recognition based control in a three-dimensional user interface, according to embodiments of the invention.
FIG. 8A illustrates the control in the inactive state, together with a cursor associated with gesture information (i). On a control in the inactive state, the slider is not drawn in the rail area (it is hidden). The absence of a slider in the rail area is a cue informing the user that the control is in the inactive state.
When the user's gesture causes the cursor to enter the rail side from the free side through the entry area, the control transitions from the inactive state to the activated state. The control receives an event carrying gesture information (i) and recognizes the change in cursor position caused by the gesture information (i). When the gesture information (i) causes the cursor position to enter the rail side from the free side of the control via the entry area, the control state changes to the activated state, and the control is drawn in its activated form.
Optionally, when the user's gesture causes the cursor to enter the rail side from the free side through the entry area, a slider is drawn in the rail area and the cursor is still displayed; the user moves the cursor onto the slider by gesture and fixes the cursor to the slider with a 'grab' action, after which the slider follows the cursor's movement.
FIG. 8B illustrates the control in the activated state. In the activated state, the rail area contains the slider. The slider is associated with gesture information (i) to give the user a cue about the hand position, and the cursor is hidden. By way of example, the drawn position of the slider is determined according to the same rules that determine the drawn position of the cursor, but the slider's position is confined to the rail area, so that the slider appears to move along the rail. In the example of FIG. 8B, the rail area includes a rail line. The projection of the (undrawn) cursor position onto the rail line is the position at which the slider is drawn in the rail area. Optionally, the appearance of the control is changed to indicate that the control has been activated and has entered the activated state, for example by drawing a shadow along the edge of the control, changing the color of the control area, playing a specified sound, and/or displaying specified text.
If the user's gesture causes the cursor to enter the rail area from the rail side, or causes the cursor to bypass the entry area and enter the rail side, a control in the inactive state will not change into the active state; the slider is not displayed, and the cursor remains displayed.
In an embodiment according to the invention, if the user moves the slider out of the rail from the exit end of the rail area by a gesture, this indicates a 'confirm' command to the control on behalf of the user; if the user moves the slider out of the rail from the entry end of the rail area by a gesture, this indicates a 'cancel' command to the control on behalf of the user. Optionally, if the user's gesture attempts to move the slider out of the rail area anywhere other than the entry end or the exit end, the slider is confined within the rail area and the control does not change state.
FIG. 8C illustrates a control receiving a 'confirm' command. With the control activated, as the user moves the index finger to the right, for example, the position of the slider (shown as a dashed pattern in FIG. 8C) correspondingly moves to the right along the rail area, providing timely visual feedback so the user knows whether the control has correctly recognized the intent. When the control detects that the slider has moved out of the rail area from the exit end as the user's index finger moves, the control generates an event representing a 'confirm' command. By processing the event representing the 'confirm' command, the operation associated with the control is confirmed. The slider is hidden as it moves out of the rail area. Further, as the slider moves out of the rail area, the control transitions to the inactive state and a cursor is drawn to track the user's gesture.
FIG. 8D illustrates a control receiving a 'cancel' command. With the control activated, as the user moves the index finger to the left, for example, the position of the slider (shown as a dashed pattern in FIG. 8D) correspondingly moves to the left along the rail area, providing visual feedback so the user knows whether the control has correctly recognized the intent. When the control detects that the slider has moved out of the rail area from the entry end as the user's index finger moves, the control generates an event representing a 'cancel' command, or generates no event, indicating that the user has not given a 'confirm' indication, or that the user has abandoned or cancelled the original attempt. By processing the event representing the 'cancel' command, the operation associated with the control is cancelled or ignored. The slider is hidden as it moves out of the rail area. Further, as the slider moves out of the rail area, the control transitions to the inactive state and a cursor is drawn to track the user's gesture.
Those skilled in the art will appreciate that the rail area may have multiple exits. The control generates a 'confirm' event when the slider is moved out through some of the exits, and a 'cancel' event when the slider is moved out through others. To convey the different meanings of the exits, a different indication may be provided to the user in each exit direction through visual, audio, and/or mechanical feedback.
In yet another embodiment according to the invention, the rail area has a plurality of exits, each indicating a different meaning (or several exits indicating several meanings). For example, when the slider is moved out of the first exit, the control generates a 'cancel' event; when the slider is moved out of the second exit, the control generates a 'mute' event; when the slider is moved out of the third exit, the control generates a 'set to high sample rate' event; and when the slider is moved out of the fourth exit, the control generates a 'set to low sample rate' event. In response to receiving the events of different meanings or different commands output by the control, the application performs the corresponding processing.
In another embodiment of the present invention, the control is displayed in a three-dimensional user interface as follows. Initially, when the control is in the inactive state, the cursor is displayed, the entry area is hidden, and the rail area is displayed. When the control transitions from the inactive state to the active state, the gesture cursor is faded, the slider is displayed in the rail area, and the rail area and/or rail line are displayed. At the moment of entering the active state, the control triggers a visual effect that includes gradually fading the cursor and displaying the rail area and/or rail line. When the user confirms or cancels the operation associated with the control, another effect is triggered: the slider is highlighted and then gradually fades and disappears, the display of the cursor is gradually restored, and the rail area and/or rail line gradually return to their appearance in the control's inactive state.
In another embodiment according to the present invention, the following shows how gestures are used in an application to operate a control provided according to an embodiment of the present invention. (1) A control in the inactive state and a cursor associated with gesture information (i) are displayed on the user interface of a display device. (2) The user changes the gesture or moves the hand, observing the changes of the control and the cursor on the display device; by changing the gesture or moving the hand, the user steers the cursor from the free side through the entry area into the rail area, and the control is activated. Upon activation, the user interface prompts the user that the control is activated. (3) With the control in the activated state, the user controls the slider by changing the gesture or moving the hand, so that the slider moves along the rail area. To perform a confirmation operation, the user moves the slider out of the exit end of the rail area; to perform a cancellation operation, the user moves the slider out of the entry end of the rail area. Optionally, when the confirmation or cancellation operation is performed, the user interface prompts the user about the operation performed.
FIG. 9 is a block diagram of an information processing apparatus implementing an embodiment of the present invention. In an embodiment according to the present invention, the information processing device 900 generates a control on the user interface, recognizes the user's gesture information (i) or receives gesture information (i) provided by a gesture input/gesture recognition device, recognizes the user's indication, and provides feedback to the user so as to interact with the user. The information processing apparatus 900 shown in FIG. 9 is a computer. The computer is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Nor should the information processing device illustrated in FIG. 9 be interpreted as having any dependency or requirement relating to any one component or combination of components illustrated.
The information processing apparatus 900 includes a memory 912, one or more processors 914, one or more presentation components 916, I/O components 920, and a power supply 922, coupled directly or indirectly to a bus 910. Bus 910 represents what may be one or more busses (such as an address bus, a data bus, or a combination thereof). In practice, the boundaries between the various components are not necessarily as distinct as depicted in FIG. 9; by way of example, a presentation component such as a display device can equally be considered an I/O component 920, and a processor may have its own memory. The inventors recognize that such is the nature of the art, and reiterate that the diagram of FIG. 9 is merely illustrative of an exemplary computer system that can be used in connection with one or more embodiments of the present invention.
The information processing apparatus 900 typically includes a variety of memories 912. By way of example, and not limitation, the memory 912 may include: Random Access Memory (RAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), flash memory, Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage, or other magnetic storage devices. The computer storage media may be non-volatile.
Information processing apparatus 900 includes one or more processors 914 that read data from various entities such as bus 910, memory 912, or I/O components 920. One or more presentation components 916 present data indications to a user or other device. Exemplary presentation components 916 include a display device, speakers, a printing component, a vibrating component, a flat panel display, a projector, a head mounted display, and the like. The presentation component 916 can also be an I/O port for coupling to a display device, speakers, printing components, vibrating components, flat panel display, projector, head mounted display, and the like. Illustrative I/O components 920 include a camera, microphone, joystick, game pad, satellite dish, scanner, printer, wireless device, and the like.
Gesture recognition based controls according to the present invention may also be implemented in a gesture recognition device or a gesture input device. The gesture recognition device or gesture input device may be integrated into an input device such as a keyboard, mouse, or remote control.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (10)

1. A man-machine interaction method based on gesture recognition is characterized in that a control recognizes a command of a user based on gesture information of the user, the control comprises an entry area and a rail area, the entry area divides a user interface into a first portion and a second portion, the first portion does not comprise the rail area, and the second portion comprises the rail area; the method comprises the following steps:
in response to a cursor moving from the first portion of the user interface through the entry area to the second portion of the user interface, the control enters an activated state and displays a slider in the rail area of the control;
the control generates a first event in response to the slider moving out of the rail area from the first end of the rail area.
2. The method of claim 1, further comprising:
the control generates a second event in response to the slider moving out of the rail area from the second end of the rail area.
3. The method according to one of claims 1-2, further comprising:
the control enters an inactive state in response to the slider moving out of the rail area from the first end or the second end of the rail area.
4. Method according to one of claims 1 to 3, wherein
a cursor is displayed on the user interface in accordance with the gesture information, and the gesture information indicates a position and/or a posture of the user's hand extracted from an image of the user's hand captured by an image capture device.
5. The method of one of claims 1 to 4, further comprising:
in response to the control entering the activated state, hiding the cursor and changing an appearance of the control on the user interface, playing a specified sound, displaying specified text, and/or providing mechanical feedback, so as to prompt the user that the control has entered the activated state.
6. The method of one of claims 1 to 5, further comprising:
in response to the control entering the inactive state, displaying a cursor on the user interface according to the position indicated by the gesture information, playing a specified sound, displaying specified text, and/or providing mechanical feedback.
7. The method of one of claims 1-6, further comprising:
in the activated state of the control, in response to a 'grab' action indicated by the gesture information, fixing the cursor on the slider, and dragging the slider according to the gesture information.
8. The method of any of claims 1-7, wherein the rail area has a plurality of first ends.
9. A man-machine interaction device based on gesture recognition is characterized in that a control recognizes a command of a user based on gesture information of the user, the control comprises an entry area and a rail area, the entry area divides a user interface into a first portion and a second portion, the first portion does not comprise the rail area, and the second portion comprises the rail area; the device comprises:
an activation module, configured to, in response to the cursor moving from the first portion of the user interface through the entry area to the second portion of the user interface, cause the control to enter an activated state and display a slider in the rail area of the control;
an event generation module, configured to generate a first event in response to the slider moving out of the rail area from the first end of the rail area.
10. An information processing apparatus comprising a processor, a memory, and a display device, the information processing apparatus further coupled to a gesture recognition apparatus and receiving gesture information provided by the gesture recognition apparatus;
the memory stores a program that, when executed by the processor, causes the information processing apparatus to perform the method of any one of claims 1-8.
CN201610878812.5A 2016-10-08 2016-10-08 Man-machine interaction method and system based on gesture recognition Active CN107918481B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610878812.5A CN107918481B (en) 2016-10-08 2016-10-08 Man-machine interaction method and system based on gesture recognition

Publications (2)

Publication Number Publication Date
CN107918481A CN107918481A (en) 2018-04-17
CN107918481B (en) 2022-11-11

Family

ID=61891617

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610878812.5A Active CN107918481B (en) 2016-10-08 2016-10-08 Man-machine interaction method and system based on gesture recognition

Country Status (1)

Country Link
CN (1) CN107918481B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108564901A (en) * 2018-06-22 2018-09-21 南京达斯琪数字科技有限公司 A kind of real time human-machine interaction holographic display system
CN111176500B (en) * 2018-11-13 2022-06-17 青岛海尔洗衣机有限公司 Display control method of slider in touch screen
CN112394811B (en) * 2019-08-19 2023-12-08 华为技术有限公司 Interaction method of air-separation gestures and electronic equipment
KR102693785B1 (en) 2020-09-11 2024-08-13 텐센트 테크놀로지(센젠) 컴퍼니 리미티드 Method and apparatus for positioning control in applications, devices and storage media
US11921931B2 (en) * 2020-12-17 2024-03-05 Huawei Technologies Co., Ltd. Methods and systems for multi-precision discrete control of a user interface control element of a gesture-controlled device
CN113760137B (en) * 2021-06-16 2022-08-05 荣耀终端有限公司 Cursor display method and electronic equipment

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8196042B2 (en) * 2008-01-21 2012-06-05 Microsoft Corporation Self-revelation aids for interfaces
US20110310010A1 (en) * 2010-06-17 2011-12-22 Primesense Ltd. Gesture based user interface
US9141262B2 (en) * 2012-01-06 2015-09-22 Microsoft Technology Licensing, Llc Edge-based hooking gestures for invoking user interfaces

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009108894A1 (en) * 2008-02-27 2009-09-03 Gesturetek, Inc. Enhanced input using recognized gestures
KR101154137B1 (en) * 2010-12-17 2012-06-12 곽희수 User interface for controlling media using one finger gesture on touch pad
EP2474950A1 (en) * 2011-01-05 2012-07-11 Softkinetic Software Natural gesture based user interface methods and systems
WO2012159254A1 (en) * 2011-05-23 2012-11-29 Microsoft Corporation Invisible control
EP2976690A1 (en) * 2013-03-21 2016-01-27 Sony Corporation Head-mounted device for user interactions in an amplified reality environment
KR20160046725A (en) * 2014-10-21 2016-04-29 삼성전자주식회사 Display device and method for controlling display device

Similar Documents

Publication Publication Date Title
CN107918481B (en) Man-machine interaction method and system based on gesture recognition
US11048333B2 (en) System and method for close-range movement tracking
US9910498B2 (en) System and method for close-range movement tracking
CN110476142B (en) Computing device, method and head mounted display device for displaying virtual content
CN105518575B (en) With the two handed input of natural user interface
CN107665042B (en) Enhanced virtual touchpad and touchscreen
US9685005B2 (en) Virtual lasers for interacting with augmented reality environments
AU2010366331B2 (en) User interface, apparatus and method for gesture recognition
US9734393B2 (en) Gesture-based control system
JP2013037675A5 (en)
CN107918482B (en) Method and system for avoiding overstimulation in immersive VR system
CN108536273A (en) Man-machine menu mutual method and system based on gesture
US20140068526A1 (en) Method and apparatus for user interaction
JP4513830B2 (en) Drawing apparatus and drawing method
EP2718900A2 (en) System for finger recognition and tracking
AU2012268589A1 (en) System for finger recognition and tracking
CN108459702B (en) Man-machine interaction method and system based on gesture recognition and visual feedback
CN115496850A (en) Household equipment control method, intelligent wearable equipment and readable storage medium
CN109144235B (en) Man-machine interaction method and system based on head-hand cooperative action
CN109144598A (en) Electronics mask man-machine interaction method and system based on gesture
KR101525011B1 (en) tangible virtual reality display control device based on NUI, and method thereof
CN109725722A (en) There are the gestural control method and device of screen equipment
JP4566123B2 (en) GAME SYSTEM, GAME PROGRAM, AND RECORDING MEDIUM CONTAINING THE PROGRAM
CN118331416A (en) Interaction method, device, equipment, medium and program product of virtual environment
JP2023042181A (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20191122

Address after: 300450 room 203b-3, building 3, No.4, Haitai development road 2, Huayuan Industrial Zone (outside the ring), Binhai high tech Zone, Binhai New Area, Tianjin

Applicant after: TIANJIN SHARPNOW TECHNOLOGY Co.,Ltd.

Address before: 518000 A2, Shenzhen City, Guangdong Province, the 12 building of Kang Jia R & D building, south of science and technology south twelve

Applicant before: TIANJIN FENGSHI HUDONG TECHNOLOGY Co.,Ltd. SHENZHEN BRANCH

SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210118

Address after: 518000 B1018, 99 Dahe Road, Runcheng community, Guanhu street, Longhua District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen laimile Intelligent Technology Co.,Ltd.

Address before: Room 203b-3, building 3, No.4, Haitai development road 2, Huayuan Industrial Zone, Binhai high tech Zone, Binhai New Area, Tianjin 300450

Applicant before: Tianjin Sharpnow Technology Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20210917

Address after: 518000 509, xintengda building, building M8, Maqueling Industrial Zone, Maling community, Yuehai street, Nanshan District, Shenzhen City, Guangdong Province

Applicant after: Shenzhen qiaoniu Technology Co.,Ltd.

Address before: 518000 B1018, 99 Dahe Road, Runcheng community, Guanhu street, Longhua District, Shenzhen City, Guangdong Province

Applicant before: Shenzhen laimile Intelligent Technology Co.,Ltd.

GR01 Patent grant
GR01 Patent grant