CN113918010A - Display apparatus and control method of display apparatus

Info

Publication number
CN113918010A
Authority
CN
China
Prior art keywords
gesture
information
gesture recognition
recognition result
control
Prior art date
Legal status
Pending
Application number
CN202111070745.1A
Other languages
Chinese (zh)
Inventor
刘兆磊
孙娜
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hisense Visual Technology Co Ltd filed Critical Hisense Visual Technology Co Ltd
Priority to CN202111070745.1A
Publication of CN113918010A


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 - Gesture based interaction, e.g. based on a set of recognized hand gestures

Abstract

The application relates to a display device and a control method for a display device. The method includes: acquiring a target image captured by a camera; in response to detecting gesture information and limb information in the target image, performing gesture recognition according to the gesture information and judging, according to the limb information, whether the gesture recognition result contains a preset control gesture; and, in response to the gesture recognition result containing the control gesture, executing the control operation corresponding to the gesture recognition result. The gesture recognition accuracy of the display device can thus be effectively improved, alleviating the problem in the related art that the display device responds incorrectly due to gesture misrecognition, harming the user experience.

Description

Display apparatus and control method of display apparatus
Technical Field
The present application relates to the field of display technologies, and in particular, to a display device and a control method of the display device.
Background
With the development of artificial intelligence, the ways in which users interact with display devices such as televisions have gradually diversified: besides traditional remote-controller interaction and voice interaction, some display devices such as televisions have begun to support gesture interaction.
Such a display device is provided with a camera; images are captured by the camera, gesture recognition is performed on the images, and a response is made based on the recognition result. However, in the related art, the gesture recognition accuracy of display devices is not high, gestures are often misrecognized and wrong responses made, and the user experience suffers.
Disclosure of Invention
The application provides a display device and a control method for a display device, which can solve, or at least partially solve, the problems in the related art that the gesture recognition accuracy of a display device is low and that wrong responses caused by gesture misrecognition harm the user experience.
In a first aspect, an embodiment of the present application provides a display device, including:
a display;
a camera;
and a controller configured to:
acquiring a target image captured by the camera;
in response to detecting gesture information and limb information in the target image, performing gesture recognition according to the gesture information, and judging, according to the limb information, whether the gesture recognition result contains a preset control gesture;
and in response to the gesture recognition result containing the control gesture, executing the control operation corresponding to the gesture recognition result.
In a second aspect, an embodiment of the present application provides a method for controlling a display device, including:
acquiring a target image captured by a camera;
in response to detecting gesture information and limb information in the target image, performing gesture recognition according to the gesture information, and judging, according to the limb information, whether the gesture recognition result contains a preset control gesture;
and in response to the gesture recognition result containing the control gesture, executing the control operation corresponding to the gesture recognition result.
In a third aspect, an embodiment of the present application further provides a computer-readable storage medium, where the storage medium stores a computer program, and the computer program is configured to execute the method for controlling a display device according to the embodiment of the present application.
According to the above technical solution, the display device and the control method of the display device can acquire a target image captured by the camera; in response to detecting gesture information and limb information in the target image, perform gesture recognition according to the gesture information and judge, according to the limb information, whether the gesture recognition result contains a preset control gesture; and, in response to the gesture recognition result containing the control gesture, execute the control operation corresponding to the gesture recognition result. By judging from the gesture information and the limb information together whether the gesture recognition result contains a control gesture, the probability of misrecognition produced by gesture recognition in the related art can be reduced, the gesture recognition accuracy of the display device effectively improved, and the problem alleviated that the display device responds incorrectly due to gesture misrecognition, harming the user experience.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present application, nor do they limit the scope of the present application. Other features of the present application will become apparent from the following description.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below; it will be apparent that those skilled in the art can obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic illustration of an operational scenario between a display device and a control apparatus shown herein, in accordance with some exemplary embodiments;
fig. 2 is a block diagram illustrating a hardware configuration of the control apparatus 100 according to some exemplary embodiments of the present application;
fig. 3 is a block diagram illustrating a hardware configuration of a display device 200 according to some exemplary embodiments of the present application;
FIG. 4 is a diagram illustrating a software configuration in a display device 200 according to some exemplary embodiments of the present application;
FIG. 5 is a display diagram of an icon control interface of an application in the display device 200 shown in accordance with some demonstrative embodiments of the present application;
FIG. 6 is a flow chart illustrating a method of controlling a display device according to some exemplary embodiments of the present application;
FIG. 7 is a schematic illustration of a control gesture shown herein in accordance with an exemplary embodiment;
FIG. 8 is a schematic representation of identification of human key points shown in the present application according to an exemplary embodiment;
FIG. 9 is a schematic diagram of a coordinate system shown in the present application according to an exemplary embodiment;
FIG. 10 is a flow chart illustrating a method of controlling a display device according to an exemplary embodiment of the present application;
FIG. 11 is an interface schematic diagram illustrating a gesture recognition prompt according to an exemplary embodiment of the present application;
FIG. 12 is a schematic illustration of a gesture recognition completed interface according to an exemplary embodiment of the present application;
FIG. 13 is an interface schematic diagram illustrating a gesture recognition prompt according to an exemplary embodiment of the present application;
FIG. 14 is a schematic illustration of a gesture recognition completed interface according to an exemplary embodiment of the present application;
FIG. 15 is a flow chart illustrating a method of controlling a display device according to an exemplary embodiment of the present application;
fig. 16 is a flowchart illustrating a control method of a display apparatus according to an exemplary embodiment of the present application.
Detailed Description
To make the purpose and embodiments of the present application clearer, the exemplary embodiments of the present application will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described exemplary embodiments are only some, not all, of the embodiments of the present application.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first," "second," "third," and the like in the description and claims of this application and in the above-described drawings are used for distinguishing between similar or analogous objects or entities and not necessarily for describing a particular sequential or chronological order, unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances.
The terms "comprises" and "comprising," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or apparatus that comprises a list of elements is not necessarily limited to all elements expressly listed, but may include other elements not expressly listed or inherent to such product or apparatus.
The display device provided by the embodiment of the present application may have various implementation forms, and for example, the display device may be a television, a smart television, a laser projection device, a display (monitor), an electronic whiteboard (electronic whiteboard), an electronic desktop (electronic table), and the like. Fig. 1 and 2 are specific embodiments of a display device of the present application.
Fig. 1 is a schematic diagram of an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through the smart device 300 or the control device 100.
In some embodiments, the control apparatus 100 may be a remote controller that communicates with the display device through infrared protocol communication, Bluetooth protocol communication, or other short-distance communication methods, and that controls the display device 200 wirelessly or by other means. The user may input user instructions through keys on the remote controller, voice input, control panel input, etc., to control the display apparatus 200.
In some embodiments, the smart device 300 (e.g., mobile terminal, tablet, computer, laptop, etc.) may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device.
In some embodiments, the display device may not receive instructions using the smart device or control device described above, but rather receive user control through touch or gestures, or the like.
In some embodiments, the display device 200 may also be controlled in ways other than through the control apparatus 100 and the smart device 300; for example, the user's voice commands may be received directly through a module for obtaining voice commands configured inside the display device 200, or through a voice control device provided outside the display device 200.
In some embodiments, the display device 200 is also in data communication with a server 400. The display device 200 may be communicatively connected through a local area network (LAN), a wireless local area network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. The server 400 may be one cluster or multiple clusters, and may include one or more types of servers.
Fig. 2 exemplarily shows a block diagram of a configuration of the control apparatus 100 according to an exemplary embodiment. As shown in fig. 2, the control device 100 includes a controller 110, a communication interface 130, a user input/output interface 140, a memory, and a power supply. The control apparatus 100 may receive an input operation instruction from a user and convert the operation instruction into an instruction recognizable and responsive by the display device 200, serving as an interaction intermediary between the user and the display device 200.
As shown in fig. 3, the display apparatus 200 includes at least one of a tuner demodulator 210, a communicator 220, a detector 230, an external device interface 240, a controller 250, a display 260, an audio output interface 270, a memory, a power supply, and a user interface.
In some embodiments the controller comprises a processor, a video processor, an audio processor, a graphics processor, a RAM, a ROM, a first interface to an nth interface for input/output.
The display 260 includes a display screen component for presenting pictures and a driving component that drives image display; it receives image signals output from the controller and displays video content, image content, menu manipulation interfaces, and user manipulation UI interfaces.
The display 260 may be a liquid crystal display, an OLED display, or a projection display, and may also be a projection device with a projection screen.
The communicator 220 is a component for communicating with an external device or a server according to various communication protocol types. For example: the communicator may include at least one of a Wifi module, a bluetooth module, a wired ethernet module, and other network communication protocol chips or near field communication protocol chips, and an infrared receiver. The display apparatus 200 may establish transmission and reception of control signals and data signals with the external control apparatus 100 or the server 400 through the communicator 220.
A user interface may be used for receiving control signals from the control apparatus 100 (e.g., an infrared remote controller, etc.).
The detector 230 is used to collect signals of the external environment or interaction with the outside. For example, detector 230 includes a light receiver, a sensor for collecting ambient light intensity; alternatively, the detector 230 includes an image collector, such as a camera, which may be used to collect external environment scenes, attributes of the user, or user interaction gestures, or the detector 230 includes a sound collector, such as a microphone, which is used to receive external sounds.
The external device interface 240 may include, but is not limited to, the following: high Definition Multimedia Interface (HDMI), analog or data high definition component input interface (component), composite video input interface (CVBS), USB input interface (USB), RGB port, and the like. The interface may be a composite input/output interface formed by the plurality of interfaces.
The tuner demodulator 210 receives broadcast television signals in a wired or wireless manner and demodulates audio/video signals, as well as EPG data signals, from a plurality of wireless or wired broadcast television signals.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in different separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device housing the controller 250, such as an external set-top box.
The controller 250 controls the operation of the display device and responds to the user's operation through various software control programs stored in the memory. The controller 250 controls the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object to be displayed on the display 260, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments the controller comprises at least one of a Central Processing Unit (CPU), a video processor, an audio processor, a Graphics Processing Unit (GPU), a RAM (Random Access Memory), a ROM (Read-Only Memory), first to nth interfaces for input/output, a communication Bus (Bus), and the like.
A user may input a user command on a Graphical User Interface (GUI) displayed on the display 260, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
A "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables the conversion of the internal form of information to a form acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
Referring to fig. 4, in some embodiments, the system is divided into four layers, which are an Application (Applications) layer (abbreviated as "Application layer"), an Application Framework (Application Framework) layer (abbreviated as "Framework layer"), an Android runtime (Android runtime) and system library layer (abbreviated as "system runtime library layer"), and a kernel layer from top to bottom.
In some embodiments, at least one application program runs in the application program layer, and the application programs may be windows (windows) programs carried by an operating system, system setting programs, clock programs or the like; or an application developed by a third party developer. In particular implementations, the application packages in the application layer are not limited to the above examples.
In some embodiments, the framework layer provides an Application Programming Interface (API) and a programming framework for applications. The application framework layer includes a number of predefined functions. The application framework layer acts as a processing center that decides to let the applications in the application layer act. The application program can access the resources in the system and obtain the services of the system in execution through the API interface.
As shown in fig. 4, in the embodiment of the present application, the application framework layer includes a manager (Managers), a Content Provider (Content Provider), and the like, where the manager includes at least one of the following modules: an Activity Manager (Activity Manager) is used for interacting with all activities running in the system; the Location Manager (Location Manager) is used for providing the system service or application with the access of the system Location service; a Package Manager (Package Manager) for retrieving various information related to an application Package currently installed on the device; a Notification Manager (Notification Manager) for controlling display and clearing of Notification messages; a Window Manager (Window Manager) is used to manage the icons, windows, toolbars, wallpapers, and desktop components on a user interface.
In some embodiments, the activity manager is used to manage the lifecycle of the various applications as well as general navigational fallback functions, such as controlling exit, opening, fallback, etc. of the applications. The window manager is used for managing all window programs, such as obtaining the size of a display screen, judging whether a status bar exists, locking the screen, intercepting the screen, controlling the change of the display window (for example, reducing the display window, displaying a shake, displaying a distortion deformation, and the like), and the like.
In some embodiments, the system runtime library layer provides support for the upper layer, i.e., the framework layer; when the framework layer is used, the Android operating system runs the C/C++ libraries included in the system runtime library layer to implement the functions the framework layer needs to implement.
In some embodiments, the kernel layer is a layer between hardware and software. As shown in fig. 4, the kernel layer includes at least one of the following drivers: audio driver, display driver, Bluetooth driver, camera driver, Wi-Fi driver, USB driver, HDMI driver, sensor drivers (such as fingerprint sensor, temperature sensor, pressure sensor, etc.), and power driver.
In some embodiments, the display device may directly enter the interface of a preset video-on-demand program after being started. As shown in fig. 5, the interface of the video-on-demand program may include at least a navigation bar 510 and a content display area located below the navigation bar 510, where the content displayed in the content display area changes with the control selected in the navigation bar. Programs in the application layer can be integrated into the video-on-demand program and displayed through a control of the navigation bar, or displayed after the application control in the navigation bar is selected.
In some embodiments, the display device may directly enter a display interface of a signal source selected last time after being started, or a signal source selection interface, where the signal source may be a preset video-on-demand program, or may be at least one of an HDMI interface, a live tv interface, and the like, and after a user selects different signal sources, the display may display contents obtained from different signal sources.
In some embodiments, when the display device is used to implement a smart tv function or a video playing function, different tv programs or different video files, audio files, etc. may be played in the display device. During use of the display device, a user may control the display device, for example, play video, pause video, mute, change display content, and so forth.
At present, the mainstream way for a user to control a display device is the remote controller. With the development of artificial intelligence, some display devices have begun to support gesture control: the display device, with an external or built-in camera, captures images through the camera, recognizes gestures, and responds based on the recognition result. Display devices in the related art mainly apply a gesture recognition technology to the captured image alone, which makes misrecognition very likely. Suppose, for example, the gesture corresponding to a certain control instruction (a pause-playback instruction, for illustration) is preset as "five fingers together": the expectation is that the user raises one hand with the five fingers together, confirming that the user issued the pause instruction, whereupon playback is paused. An existing display device, however, pauses the current program as soon as it detects a five-fingers-together gesture in the image, treating it as the user initiating the pause instruction.
Yet the user may hold the five fingers together unintentionally, for example when covering the chest with one hand or letting a hand hang down naturally with the fingers together; the user has no intention of pausing the program, but the display device misrecognizes the gesture and responds wrongly, leading to a poor experience. It is to be understood that the above is merely illustrative for ease of understanding and should not be taken as limiting.
In order to solve the above problems, embodiments of the present application provide a display device and a control method for the display device, which can effectively improve gesture recognition accuracy of the display device, reduce false recognition probability, and effectively alleviate the problem that user experience is affected due to false response caused by gesture false recognition.
In addition, the term "gesture" used in the embodiments of the present application refers to a user behavior that expresses an intended idea, action, purpose, and/or result through a change of hand shape or a hand movement.
Fig. 6 shows a flow chart of a method of controlling a display device according to some embodiments. The method can be executed by the controller of the display device; the display device includes a display, a camera (internal or external), and the controller, which is configured mainly to perform the following steps S602 to S608:
and step S602, acquiring a target image acquired by the camera.
In some embodiments, after the gesture control function is turned on, target images are acquired from the video frames captured by the camera at a first specified frequency, for example once every N frames. The value of N can be set flexibly as required; an exemplary value is N = 10, i.e., one frame out of every 10 frames captured by the camera is taken as a target image.
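As an illustration, this sampling policy can be sketched as follows; the frame source, the helper name, and the default value of N are assumptions for illustration rather than part of the disclosed embodiment.

```python
# Sketch: take one frame out of every N captured frames as the target
# image (the "first specified frequency"). N = 10 is only an example.

def sample_target_images(camera_frames, n=10):
    """Yield every n-th frame of an iterable of decoded video frames."""
    for index, frame in enumerate(camera_frames):
        if index % n == 0:
            yield frame
```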
Step S604: in response to detecting gesture information and limb information in the target image, perform gesture recognition according to the gesture information, and judge, according to the limb information, whether the gesture recognition result contains a preset control gesture.
In some embodiments, gesture information in the target image may be detected using a gesture detection algorithm and limb information in the target image may be detected using a limb detection algorithm. In some specific implementation examples, the gesture information includes location information of a plurality of gesture key points, and the limb information includes location information of a plurality of limb key points.
In some embodiments, after detecting that the target image includes the gesture information and the limb information, a gesture recognition algorithm may be used to perform gesture recognition according to the gesture information to obtain a gesture recognition result, for example, the gesture recognition result includes a specific recognized gesture type (or an identifier corresponding to the gesture type), and whether the recognized gesture type is a control gesture may be further determined by combining the limb information.
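The two kinds of detection results can be sketched as simple containers; the dataclass fields, detector interfaces, and function names below are illustrative assumptions, not data structures prescribed by the embodiment.

```python
# Sketch of the detection results that step S604 combines.
from dataclasses import dataclass
from typing import List, Optional, Tuple

Point = Tuple[float, float]  # (x, y) in the image coordinate system

@dataclass
class GestureInfo:
    hand_keypoints: List[Point]  # position information of hand key points

@dataclass
class LimbInfo:
    arm_keypoints: List[Point]   # position information of limb key points

def recognize_if_both_present(image, detect_gesture, detect_limb, recognize):
    """Run gesture recognition only when gesture information and limb
    information are both detected in the target image."""
    gesture_info: Optional[GestureInfo] = detect_gesture(image)
    limb_info: Optional[LimbInfo] = detect_limb(image)
    if gesture_info is None or limb_info is None:
        return None
    gesture_type = recognize(gesture_info)  # e.g. a gesture-type identifier
    return gesture_type, limb_info
```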
Step S606: in response to the gesture recognition result containing the control gesture, execute the control operation corresponding to the gesture recognition result.
In some embodiments, the control gesture may be preset, and the control gesture may be set by the manufacturer of the display device or may be set by the user. For ease of understanding, reference may be made to the schematic illustration of the control gesture shown in fig. 7, which simply illustrates several preset control gesture types, which are merely illustrative and should not be considered as limiting. In practical applications, various control gestures may be flexibly set, and a corresponding relationship between the control gestures and the control operations may be set, for example, a gesture index may be pre-established, where the gesture index includes the control gestures and the control operations, and the control operations may be regarded as control operations corresponding to gesture recognition results, and may also be understood as response modes for the control gestures, such as but not limited to mute control, program switching control, photographing control, and the like, which is not limited herein.
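Such a gesture index might be sketched as a plain mapping from control gestures to operations; the gesture names and the methods called on `device` are assumptions for illustration only.

```python
# Sketch of a pre-established gesture index (control gesture -> operation).

def mute(device): device.set_volume(0)            # hypothetical device API
def switch_program(device): device.next_channel() # hypothetical device API
def take_photo(device): device.capture_photo()    # hypothetical device API

GESTURE_INDEX = {
    "five_fingers_together_up": mute,
    "hand_wave": switch_program,
    "v_sign_up": take_photo,
}

def execute_control_operation(control_gesture, device):
    """Execute the control operation corresponding to the recognition result."""
    operation = GESTURE_INDEX.get(control_gesture)
    if operation is not None:
        operation(device)
```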
In the manner provided by the embodiment of the application, whether the gesture recognition result contains a control gesture is judged from the gesture information and the limb information together. This reduces the probability of misrecognition produced by gesture recognition in the related art, effectively improves the gesture recognition accuracy of the display device, and alleviates the problem in the related art that the display device responds incorrectly due to gesture misrecognition, harming the user experience.
The embodiment of the present application further provides corresponding handling for the case where the gesture recognition result does not contain a control gesture. As shown in fig. 6, the controller is further configured to execute the following step S608: in response to the gesture recognition result not containing the control gesture, perform one or more of the following operations: maintain the current running state, issue a prompt that gesture recognition failed, or acquire a new target image captured by the camera.
In some embodiments, the user has preset gestures to replace only some (or all) of the remote-controller functions, such as an "OK" gesture replacing the confirmation key of the remote controller, or a "hand wave" gesture replacing the program-switching key. The user sets the display device to keep the gesture recognition function running: a response operation is executed directly when a control gesture is recognized, and when no control gesture is recognized the device keeps running with its operating parameters unchanged. In this case, if the gesture recognition result does not contain a control gesture, the gesture in the recognition result is some other gesture the user made inadvertently and the user does not currently intend to control the display device by gesture; the display device therefore does not need to respond and can simply maintain the current running state (e.g., continue playing the current program) without changing any current operating parameter.
In some embodiments, the user invokes a gesture-controlled application, such as a photographing application or another application that requires a gesture for a subsequent operation. If the gesture recognition result does not contain a control gesture, the user may have made a non-standard gesture that the display device could not recognize as a control gesture, so a prompt indicating that gesture recognition failed may be issued. The current running state may also be maintained alongside the prompt, e.g., the current program continues playing.
In some embodiments, when the current target image does not contain the control gesture, the controller acquires a new target image (i.e., returns to step S602) and continues to perform steps S604 to S608 on the new target image.
In some embodiments, in the step of judging whether the gesture recognition result contains a preset control gesture according to the limb information, the controller is further configured to perform the following steps A and B:
and step A, responding to the gesture recognition result containing the designated gesture, and acquiring the arm direction corresponding to the designated gesture based on the limb information.
The designated gesture may be set by the manufacturer of the display device or by the user. In a specific implementation example, the control gesture is a designated gesture with stricter requirements. For example, suppose the designated gesture type includes a "five fingers together" gesture and does not constrain the gesture direction: as long as a gesture with the five fingers together is recognized in the target image, whatever direction the fingers point, the gesture recognition result is considered to contain the designated gesture. The control gesture, however, constrains not only the gesture type but also the gesture direction; only a "five fingers together" gesture with the five fingers pointing upward is considered a control gesture. The inventors found in research that in the related art, as soon as a designated gesture is detected the corresponding control operation is executed immediately, ignoring the gesture direction and the limb direction; the probability of misrecognition is therefore high, and gestures the user did not intend are mistaken for gesture control instructions and wrongly responded to. In contrast, here the designated gesture is detected first, the arm direction corresponding to the designated gesture is then acquired on that basis, and combining the arm direction makes it possible to effectively distinguish whether the designated gesture is the control gesture, effectively reducing the probability of misrecognition.
In some specific implementation examples, the limb information comprises position information of a plurality of arm key points, and the gesture information comprises position information of a plurality of hand key points. For ease of understanding, reference may be made to the schematic diagram of human key-point identification shown in fig. 8. Note that fig. 8 only marks some of the human key points, not all of them; since the hand is small, hand key points are likewise not marked in fig. 8, to keep the labels legible.
Based on this, step A can be implemented with reference to steps a1 to a3 as follows:
in step a1, position information of a specified hand key point corresponding to a specified gesture is extracted from the gesture information. The designated hand key point is one or more of a plurality of hand key points included in the gesture information. Illustratively, the designated hand keypoints may be 4 and 7 points in fig. 8, or points close to 4/7 points.
Step a2: extract, from the limb information, the position information of the arm key points corresponding to the designated hand key points, according to the position information of the designated hand key points.
It is understood that more than one hand or limb may be detected in the target image; the embodiment of the present application therefore mainly acquires the limb corresponding to the designated gesture in the target image. For example, if the designated gesture is made with the left hand, the corresponding limb is the left arm. By extracting the position information of the designated hand key points corresponding to the designated gesture, the position information of the arm key points corresponding to the designated gesture can be further extracted on that basis.
In a specific implementation example, an arm key point within a preset distance range of the designated hand key points may be taken as the arm key point corresponding to them. Illustratively, assume the designated hand key points are the leftmost and rightmost key points of the hand shape, with abscissas handInfo.left and handInfo.right respectively, and that the candidate arm key point is the key point closest to the hand, such as point 4 in fig. 8, with abscissa Skeleton4_X. Illustratively, if:
Skeleton4_X > (handInfo.left - OFFSET), and
Skeleton4_X < (handInfo.right + OFFSET),
then the arm containing point 4 is the arm corresponding to the designated gesture, and the position information recorded in the limb information for the key points on that arm is the position information of the arm key points corresponding to the designated hand key points.
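Under the reconstruction above, the association test can be sketched compactly; OFFSET and the argument shapes are illustrative assumptions.

```python
OFFSET = 20.0  # preset tolerance, e.g. in pixels (assumed value)

def arm_matches_hand(skeleton4_x, hand_left_x, hand_right_x, offset=OFFSET):
    """True if arm key point 4 lies within the hand span widened by offset,
    i.e. the arm belongs to the hand making the designated gesture."""
    return (hand_left_x - offset) < skeleton4_x < (hand_right_x + offset)
```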
Step a3: determine, based on the extracted position information of the arm key points, the arm direction corresponding to the designated gesture.
The arm key points corresponding to the designated hand key points are typically multiple, such as the right-arm key points 2-3-4 in fig. 8, or the left-arm key points 5-6-7. The position information may be expressed as coordinates in a pre-established coordinate system. For example, referring to the schematic diagram of a coordinate system shown in fig. 9, the origin at the upper-left corner has coordinates (0, 0), the X axis points right, and the Y axis points down; the coordinates of each key point can be determined in this coordinate system, thereby fixing the positions of the hand and the limbs. For instance, if wrist point 4 has coordinates (100, 100), the wrist is at position (100, 100); if elbow point 3 has coordinates (120, 150), the elbow is at position (120, 150). A key point reported as (-1, -1), or whose coordinate values are otherwise abnormal, is treated as not detected. Of course, the above is merely an example; in practical applications the coordinate system can be set flexibly as required, for example with the Y axis pointing upward.
After the position coordinates of the arm key points are extracted, the arm direction corresponding to the designated gesture can be determined from them, for example from the relative positions of several key points on the arm: if right-arm key point 3 is located above right-arm key point 4 in fig. 8, the right arm is hanging down; if right-arm key point 3 is located below right-arm key point 4, the right arm is raised. In a specific implementation, the comparison is made on the key-point coordinates in the chosen coordinate system, so the comparison depends on how the coordinate system is oriented. Taking the coordinate system of fig. 9 as an example, record the coordinates of right-arm key point 3 as (Skeleton3_X, Skeleton3_Y), of right-arm key point 4 as (Skeleton4_X, Skeleton4_Y), of left-arm key point 6 as (Skeleton6_X, Skeleton6_Y), and of left-arm key point 7 as (Skeleton7_X, Skeleton7_Y). If Skeleton3_Y > Skeleton4_Y, the right arm is raised; if Skeleton6_Y > Skeleton7_Y, the left arm is raised. This is only an example based on the coordinate system of fig. 9; in practical applications the comparison is determined by the orientation of the established coordinate system. For example, if the Y axis pointed upward, Skeleton3_Y > Skeleton4_Y would instead indicate that the right arm hangs down.
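A minimal sketch of this wrist-above-elbow test, assuming the coordinate system of fig. 9 (origin at the top-left, Y axis pointing down) and the keypoint numbering of fig. 8; everything else is an illustrative assumption.

```python
UNDETECTED = (-1.0, -1.0)  # value reported for a key point not detected

def arm_is_raised(elbow, wrist):
    """elbow, wrist: (x, y) points, e.g. points 3 and 4 for the right arm.
    With Y pointing down, a smaller Y means higher in the image, so the
    arm is raised when the wrist sits above the elbow."""
    if elbow == UNDETECTED or wrist == UNDETECTED:
        return None  # direction cannot be judged
    return elbow[1] > wrist[1]
```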
Further, the specific inclination angle of the arm can be further determined according to the specific position coordinates of the key points of the arm, which is not described herein again.
Step B: in response to the arm direction and the designated gesture belonging to a preset combination, determine that the gesture recognition result contains the preset control gesture. Equivalently, the designated gesture is determined to be a preset control gesture.
In practical applications, combinations of arm direction and designated gesture may be preset, for example by setting the arm direction corresponding to each designated gesture; only when the detected arm direction is consistent with the preset arm direction corresponding to the designated gesture is the gesture detected in the target image determined to be the control gesture.
In some embodiments, a plurality of control gestures may be preset, with a control operation and a corresponding arm direction set for each. If detection shows that the designated gesture contained in the target image and its corresponding arm direction belong to a preset combination, the gesture recognition result contains a preset control gesture.
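A check against such preset combinations might be sketched as a set lookup; the combination entries and names are assumptions for illustration.

```python
# Sketch: (designated gesture, arm direction) pairs accepted as control
# gestures. The entries here are assumed examples.
PRESET_COMBINATIONS = {
    ("five_fingers_together", "up"),
    ("v_sign", "up"),
}

def is_control_gesture(designated_gesture, arm_direction):
    """The recognition result contains a control gesture only when the
    gesture type and detected arm direction form a preset combination."""
    return (designated_gesture, arm_direction) in PRESET_COMBINATIONS
```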
By jointly analyzing the limb direction and the gesture type in this way, the recognition accuracy of control gestures can be substantially improved, and the wrong responses caused in the related art by recognizing the gesture type alone can be largely avoided.
On the basis of the foregoing, in order to further improve the gesture recognition accuracy and reduce the probability of misrecognition, fig. 10 shows a flowchart of a control method of a display device according to some embodiments. The method can be executed by the controller of the display device; the display device includes a display, a camera (internal or external), and the controller, which is configured mainly to perform the following steps S1002 to S1010:
and step S1002, after the gesture control function is started, acquiring a target image acquired by the camera.
Step S1004, in response to the detection of the gesture information and the limb information in the target image, performing gesture recognition according to the gesture information, and determining whether the gesture recognition result includes a preset control gesture according to the limb information.
Step S1006, in response to the gesture recognition result containing the control gesture, obtaining a confidence of the gesture recognition result.
In some embodiments, when gesture recognition is performed with a gesture recognition algorithm, the recognized gesture is framed and given a confidence reference value, and this confidence reference value can be used directly as the confidence of the gesture recognition result.
In other embodiments, a hold duration may be determined for the control gesture and the confidence of the gesture recognition result determined from it, the hold duration being proportional to the confidence. The longer the hold duration, the higher the confidence and the more likely the user is deliberately making the control gesture; the shorter the hold duration, the lower the confidence and the more likely the user made the gesture inadvertently. On this basis, illustratively, multiple frames of images can be captured continuously by the camera, and the hold duration of the control gesture judged by detecting the gesture in the subsequent frames.
Step S1008: in response to the confidence of the gesture recognition result being greater than or equal to a preset confidence threshold, execute the control operation corresponding to the gesture recognition result.
Step S1010: in response to the gesture recognition result not containing the control gesture, or the confidence of the gesture recognition result being less than the confidence threshold, perform one or more of the following operations: maintain the current running state, issue a prompt that gesture recognition failed, or acquire a new target image captured by the camera.
In the manner provided by the embodiment of the application, whether the gesture recognition result contains a control gesture is judged from the gesture information and the limb information together, and whether to execute the corresponding control operation is further decided according to the confidence of the gesture recognition result. This further reduces the probability of misrecognition produced by gesture recognition in the related art, effectively improves the gesture recognition accuracy of the display device, and alleviates the problem in the related art that wrong responses caused by gesture misrecognition harm the user experience.
In some embodiments, the controller of the display device is configured to acquire the target image from the video frames captured by the camera at a first specified frequency after the gesture control function is turned on. On the basis, in the step of obtaining the confidence of the gesture recognition result, the controller is further configured to execute the following steps a to d:
Step a: in response to the target image being the first frame image for which the gesture recognition result is confirmed to contain the control gesture, acquire multiple frames of images to be detected from the video frames captured by the camera at a second specified frequency, within a specified time period after the target image is acquired; the second specified frequency is greater than the first specified frequency. The specified time period can be set flexibly (it is usually short), for example 500 ms.
The controller of the display device is configured to acquire target images from the captured video frames at the first specified frequency after the gesture control function is turned on, for example once every 10 frames; that is, the controller takes one frame from the captured video as a target image and performs gesture detection/recognition and limb detection/recognition on it. Understandably, no gesture is detected in some target images, and in others the detected gesture is not a control gesture. When the controller detects the control gesture in a target image for the first time, that target image is the first frame image whose gesture recognition result belongs to the control gesture, and the controller switches to acquiring images to be detected from the captured video frames at the second specified frequency, for example once every frame.
Step b: obtain the number of valid image frames, among the multiple frames to be detected, that contain the control gesture, and obtain the total number of image frames to be detected.
Step c: calculate the ratio of the number of valid image frames to the total number of image frames, and obtain the confidence of the gesture recognition result based on this ratio. In some specific implementation examples, the ratio may be used directly as the confidence of the gesture recognition result. The greater the ratio, the longer the control gesture lasted and the more likely it is that the user is genuinely making the control gesture.
Step d: after the confidence of the gesture recognition result is obtained, compare it with the preset confidence threshold and respond accordingly.
In one specific implementation example, in response to the confidence being greater than or equal to the confidence threshold, the control operation corresponding to the gesture recognition result is executed. In another specific implementation example, in response to the confidence being less than the confidence threshold, new target images are acquired from the video frames captured by the camera at the first specified frequency. That is, once it is determined that no control gesture is present under the current conditions, the gesture detection frame rate is adjusted back and detection proceeds at the lower frequency, reducing CPU load.
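Steps a to d can be summarized in a short sketch; the default confidence threshold is an assumed value and the detector interface is illustrative.

```python
def confirm_control_gesture(window_frames, contains_control_gesture,
                            confidence_threshold=0.8):
    """window_frames: images to be detected, sampled at the second
    specified frequency during the specified period (e.g. 500 ms)."""
    total = len(window_frames)                       # total image frame count
    if total == 0:
        return False
    valid = sum(1 for frame in window_frames
                if contains_control_gesture(frame))  # valid image frame count
    confidence = valid / total    # ratio used directly as the confidence
    return confidence >= confidence_threshold
```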
By this method, both the accuracy of gesture recognition and the timeliness of gesture triggering can be effectively guaranteed. In addition, the gesture detection frame rate is adjusted automatically: while no control gesture has been detected, the controller examines images at a lower frequency than after a control gesture is detected. Compared with performing gesture detection only at a fixed, high detection frame rate as in the related art, this effectively reduces CPU load. Specifically, raising the detection frame rate once a gesture appears guarantees detection accuracy, while lowering the detection frame rate when no gesture is detected reduces CPU utilization.
In some embodiments, the controller is further configured to display identification information on the display while gesture recognition is performed according to the gesture information and while it is judged, according to the limb information, whether the gesture recognition result contains a preset control gesture; the identification information indicates that gesture recognition is in progress. For ease of understanding, two implementation examples in different scenarios are given below.
In some specific implementation examples, the user explicitly starts an application requiring gesture control, such as a photographing application like "magic mirror". In this case, refer to the interface schematic diagram of a gesture recognition prompt shown in fig. 11: the display device can deliberately indicate through a full prompt interface that gesture recognition is in progress, the interface then containing no content other than the prompt. After recognition is completed, see the schematic diagram of the completed-recognition interface shown in fig. 12.
In some specific implementation examples, the user sets the display device to keep the gesture recognition function running: a response operation is executed directly when a control gesture is recognized, and when none is recognized the device keeps running with its operating parameters unchanged. The display device can therefore stay in the gesture detection state throughout normal operation (such as while a program is playing normally). In this case, refer to the interface diagram of a gesture recognition prompt shown in fig. 13: a gesture-recognition indicator is displayed only in the upper-left corner of the interface, to avoid disturbing the user. After recognition succeeds, see the completed-recognition interface shown in fig. 14: assuming the control operation corresponding to the detected control gesture is switching the program type, fig. 14 shows the program type successfully switched from the "education" category to the "movie" category, with an operation-success prompt displayed.
It should be noted that the foregoing is merely illustrative and should not be considered as limiting. In practical application, the prompt mode can be flexibly set according to requirements.
Further, based on the foregoing control method for a display device, for convenience of understanding, taking control gestures in two different scenes as an example, the embodiment of the present application provides two application examples.
Application example one:
in some embodiments, the display device is provided with a global gesture function, and a user can start the function in the setting, wherein the function supports the user to execute a corresponding control operation through a preset control gesture. In a specific implementation example, refer to a flowchart of a control method of a display device shown in fig. 15, where the display device includes a display, a camera (which may be internal or external), and a controller, and the controller is configured to mainly perform the following steps S1502 to S1516:
in step S1502, a global gesture function is activated. Illustratively, a five-finger gesture that lifts upward may be registered as a basis for initiating a global gesture function.
Step S1504, a gesture detection and recognition algorithm is started.
In step S1506, it is determined whether the gesture is a five-finger gesture. If so, go to step S1508; if not, return to execute step S1504.
In step S1508, a limb detection and identification algorithm is initiated.
Step S1510, determine whether the corresponding limb of the five-finger gesture faces upward. If yes, go to step S1512; if not, return to execute step S1504.
In step S1512, the confidence level of the five-finger gesture is calculated.
Step S1514, determine whether the confidence level is greater than a preset confidence threshold. If yes, go to step S1516, if no, return to step S1504.
Step S1516: issue prompt information through the display interface. Illustratively, once it is confirmed that the global gesture was successfully recognized, a 500 ms buffer interface may be displayed to prompt the user that recognition is in progress; after recognition is completed, the corresponding remote-controller function key (such as the OK key) can be simulated to control the display device, and the user can also be informed through the display interface that recognition is complete. See the aforementioned fig. 11 and fig. 12.
In the global gesture application scenario, operations such as media or slideshow playback, among others, may be executed; only the correspondence between each control gesture and the display device's response needs to be set in advance, which is not limited here. A minimal code sketch of this flow is given below.
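For ease of understanding only, the following minimal Python sketch mirrors steps S1502 to S1516. It is an illustrative assumption rather than the disclosed implementation: the GestureResult and LimbResult types, the injected detect_gesture, detect_limb, and on_recognized callables, and the 0.8 threshold are all hypothetical stand-ins, since the embodiment does not prescribe a particular API or threshold value.

```python
from dataclasses import dataclass
from typing import Callable, Optional

# Hypothetical result types; the embodiment does not prescribe an API.
@dataclass
class GestureResult:
    name: str          # e.g. "five_finger"
    confidence: float  # recognition confidence in [0, 1]

@dataclass
class LimbResult:
    arm_direction: str  # e.g. "up", "down", "left", "right"

CONFIDENCE_THRESHOLD = 0.8  # preset confidence threshold; 0.8 is an assumed value

def global_gesture_step(
    frame: object,
    detect_gesture: Callable[[object], Optional[GestureResult]],
    detect_limb: Callable[[object], Optional[LimbResult]],
    on_recognized: Callable[[], None],
) -> bool:
    """Process one camera frame through steps S1506-S1516.

    Returns True when the registered control gesture was confirmed and acted on.
    """
    gesture = detect_gesture(frame)                  # S1504/S1506: gesture recognition
    if gesture is None or gesture.name != "five_finger":
        return False                                 # not the registered gesture
    limb = detect_limb(frame)                        # S1508: limb detection and recognition
    if limb is None or limb.arm_direction != "up":
        return False                                 # S1510: the arm must point upward
    if gesture.confidence <= CONFIDENCE_THRESHOLD:
        return False                                 # S1514: confidence too low
    on_recognized()                                  # S1516: prompt, then simulate a remote key
    return True
```

In a deployment, on_recognized would show the 500 ms buffer interface on the display and then emulate the remote-controller function key (such as the OK key), as described in step S1516.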
Application example two:
in some embodiments, the display device is provided with a gesture-controlled photographing function, which can be implemented by a photographing application (for example, one named 'magic mirror'); gesture photographing serves as a way to trigger photographing on a display device equipped with a camera. The user can enable the function in the settings; it allows the user to start a photographing operation through a preset control gesture. In a specific implementation example, referring to the flowchart of a control method of a display device shown in fig. 16, the display device includes a display, a camera (built-in or external), and a controller, and the controller is configured to mainly perform the following steps S1602 to S1616:
Step S1602: start the gesture photographing function and register the upward-raised V-shaped gesture as the photographing trigger gesture.
Step S1604: acquire images from the video frames captured by the camera at a rate of one image every 10 frames, and start the gesture detection and recognition algorithm to obtain a gesture recognition result for each image.
Step S1606: determine whether the gesture recognition result contains a V-shaped gesture. If yes, go to step S1608; if not, return to step S1604.
Step S1608: adjust the acquisition rate to one image per video frame captured by the camera, and start a timer. Illustratively, the timer is set to 500 ms.
Step S1610: record the number of valid frames containing the V-shaped gesture detected during the timer period and the total number of frames acquired from the camera. Because the gesture detection frame rate has been increased, the gesture confidence can be judged quickly and the result fed back promptly.
Step S1612: determine the confidence of the gesture recognition result as the ratio of the number of valid frames to the total number of frames.
Step S1614: determine whether the confidence is greater than a preset confidence threshold. If yes, go to step S1616; if not, return to step S1604.
Step S1616: start the photographing operation and return to step S1604, that is, restore acquisition to one image every 10 frames of the video captured by the camera for subsequent gesture detection and recognition. A minimal code sketch of this flow follows the steps above.
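Again for illustration only, the following Python sketch mirrors steps S1602 to S1616 under stated assumptions: grab_frame, detect_gesture, take_photo, and the "v_sign" label are hypothetical stand-ins, and the 0.8 threshold is an assumed value; the embodiment itself specifies only the two sampling rates, the 500 ms timer, and the valid-to-total frame ratio used as the confidence.

```python
import time
from typing import Callable, Optional

V_GESTURE = "v_sign"        # hypothetical label for the V-shaped gesture
SLOW_EVERY_N = 10           # S1604: one image per 10 camera frames while idle
FAST_EVERY_N = 1            # S1608: one image per frame during confirmation
TIMER_SECONDS = 0.5         # S1608: 500 ms confirmation window
CONFIDENCE_THRESHOLD = 0.8  # preset confidence threshold; 0.8 is an assumed value

def gesture_photo_loop(
    grab_frame: Callable[[int], object],             # grab_frame(every_nth) -> image
    detect_gesture: Callable[[object], Optional[str]],
    take_photo: Callable[[], None],
) -> None:
    """Run steps S1602-S1616 until interrupted."""
    while True:
        frame = grab_frame(SLOW_EVERY_N)             # S1604: low-rate sampling
        if detect_gesture(frame) != V_GESTURE:       # S1606: keep scanning
            continue
        # S1608-S1610: raise the detection frame rate and confirm within the window
        valid = total = 0
        deadline = time.monotonic() + TIMER_SECONDS
        while time.monotonic() < deadline:
            frame = grab_frame(FAST_EVERY_N)
            total += 1
            if detect_gesture(frame) == V_GESTURE:
                valid += 1                           # count valid frames
        confidence = valid / total if total else 0.0 # S1612: ratio as confidence
        if confidence > CONFIDENCE_THRESHOLD:        # S1614: confidence gate
            take_photo()                             # S1616: trigger the shot
        # in either branch the loop resumes low-rate sampling (back to S1604)
```

Raising the sampling rate only inside the confirmation window is what lets the confidence be judged quickly while keeping the idle CPU cost low, matching the frame-rate adjustment in steps S1604 and S1608.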
It should be noted that the above two examples are merely schematic illustrations of simple scenarios built on the control method of the display device shown in fig. 6 and should not be considered limiting. In practical applications, more or fewer steps may be included, and individual steps may differ; no limitation is imposed here.
In summary, the display device and the control method of the display device provided by the embodiments of the present application determine whether the gesture recognition result contains a control gesture by combining gesture information with limb information. This reduces the probability of misrecognition in gesture recognition, effectively improves the gesture recognition accuracy of the display device, and addresses the problem in the related art that gesture misrecognition causes the display device to respond incorrectly and degrades the user experience.
Furthermore, by introducing a confidence for the gesture recognition result and executing the corresponding control operation only when the confidence exceeds a preset confidence threshold, gesture recognition accuracy is further improved and the probability of misrecognition and false responses is reduced.
Furthermore, by adjusting the gesture detection frame rate, detection can run at a higher frame rate once a gesture has been detected, ensuring detection accuracy, while the frame rate is lowered when no gesture is present, reducing CPU utilization and thus effectively reducing CPU load.
In addition to the above methods and apparatuses, an embodiment of the present application may also take the form of a computer program product including computer program instructions that, when executed by a controller, cause the controller to perform the control method of the display apparatus provided by the embodiments of the present application.
The computer program product may be written with program code for performing the operations of the embodiments of the present application in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, an embodiment of the present application may also be a computer-readable storage medium having stored thereon computer program instructions, which, when executed by a controller, cause the controller to execute the control method of the display device provided by the embodiment of the present application.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
The foregoing description, for purposes of explanation, has been presented in conjunction with specific embodiments. However, the illustrative discussions above are not intended to be exhaustive or to limit the embodiments to the precise forms disclosed above. Many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles and the practical application, to thereby enable others skilled in the art to best utilize the embodiments and various embodiments with various modifications as are suited to the particular use contemplated.

Claims (10)

1. A display device, comprising:
a display;
a camera;
and a controller configured to:
acquiring a target image acquired by the camera;
in response to the detection of the gesture information and the limb information in the target image, performing gesture recognition according to the gesture information, and judging whether a gesture recognition result contains a preset control gesture or not according to the limb information;
and in response to the gesture recognition result containing the control gesture, executing a control operation corresponding to the gesture recognition result.
2. The display device according to claim 1, wherein in the step of determining whether the gesture recognition result includes a preset control gesture according to the limb information, the controller is further configured to:
responding to a gesture recognition result that a designated gesture is contained, and acquiring an arm direction corresponding to the designated gesture based on the limb information;
and determining that the gesture recognition result contains a preset control gesture in response to the fact that the arm direction and the designated gesture belong to a preset combination.
3. The display device according to claim 2, wherein the limb information includes position information of a plurality of arm key points; the gesture information comprises position information of a plurality of hand key points;
in the step of acquiring the arm direction corresponding to the designated gesture based on the limb information, the controller is further configured to:
extracting position information of a designated hand key point corresponding to the designated gesture from the gesture information;
extracting the position information of the arm key points corresponding to the designated hand key points from the limb information according to the position information of the designated hand key points;
and determining the arm direction corresponding to the designated gesture based on the extracted position information of the arm key point.
4. The display device according to claim 1, wherein in the step of acquiring the target image captured by the camera, the controller is further configured to:
and after the gesture control function is started, acquiring a target image from the video frame acquired by the camera at a first specified frequency.
5. The display device according to claim 4, wherein in the step of performing the control operation corresponding to the gesture recognition result, the controller is further configured to:
obtaining the confidence of the gesture recognition result;
and executing control operation corresponding to the gesture recognition result in response to the fact that the confidence degree of the gesture recognition result is larger than or equal to a preset confidence threshold value.
6. The display device according to claim 5, wherein in the step of obtaining the confidence level of the gesture recognition result, the controller is further configured to:
in response to the fact that the target image is the first frame image for confirming that the gesture recognition result belongs to the control gesture, acquiring multiple frames of images to be detected from the video frames collected by the camera at a second specified frequency within a specified time length after the target image is acquired; wherein the second specified frequency is greater than the first specified frequency;
acquiring the number of effective image frames containing the control gesture in the plurality of frames of images to be detected; acquiring the total image frame number of the plurality of frames of images to be detected;
and calculating the ratio of the effective image frame number to the total image frame number, and obtaining the confidence coefficient of the gesture recognition result based on the ratio.
7. The display device of claim 5, wherein the controller is further configured to:
in response to the confidence level being less than the confidence threshold, acquiring a new target image from video frames captured by the camera at the first specified frequency.
8. The display device of claim 1, wherein the controller is further configured to:
displaying identification information on the display in the process of performing gesture recognition according to the gesture information and judging whether a gesture recognition result contains a preset control gesture according to the limb information; the identification information is used to indicate that gesture recognition is in progress.
9. The display device of claim 1, wherein the controller is further configured to:
in response to the gesture recognition result not including the control gesture, performing one or more of the following: and maintaining the current running state, initiating prompt information of gesture recognition failure or acquiring a new target image acquired by the camera.
10. A control method of a display device, characterized by comprising:
acquiring a target image acquired by a camera;
in response to the detection of the gesture information and the limb information in the target image, performing gesture recognition according to the gesture information, and judging whether a gesture recognition result contains a preset control gesture or not according to the limb information;
and in response to the gesture recognition result containing the control gesture, executing a control operation corresponding to the gesture recognition result.
CN202111070745.1A 2021-09-13 2021-09-13 Display apparatus and control method of display apparatus Pending CN113918010A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111070745.1A CN113918010A (en) 2021-09-13 2021-09-13 Display apparatus and control method of display apparatus

Publications (1)

Publication Number Publication Date
CN113918010A true CN113918010A (en) 2022-01-11

Family

ID=79234569

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111070745.1A Pending CN113918010A (en) 2021-09-13 2021-09-13 Display apparatus and control method of display apparatus

Country Status (1)

Country Link
CN (1) CN113918010A (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855461A (en) * 2011-07-01 2013-01-02 株式会社理光 Method and equipment for detecting fingers in images
US20160078289A1 (en) * 2014-09-16 2016-03-17 Foundation for Research and Technology - Hellas (FORTH) (acting through its Institute of Computer Gesture Recognition Apparatuses, Methods and Systems for Human-Machine Interaction
CN105589550A (en) * 2014-10-21 2016-05-18 中兴通讯股份有限公司 Information publishing method, information receiving method, information publishing device, information receiving device and information sharing system
US20180307319A1 (en) * 2017-04-20 2018-10-25 Microsoft Technology Licensing, Llc Gesture recognition
CN112860212A (en) * 2021-02-08 2021-05-28 海信视像科技股份有限公司 Volume adjusting method and display device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114546114A (en) * 2022-02-15 2022-05-27 美的集团(上海)有限公司 Control method and control device for mobile robot and mobile robot
CN114518801A (en) * 2022-02-18 2022-05-20 美的集团(上海)有限公司 Device control method, computer program product, control device, and storage medium
CN114518801B (en) * 2022-02-18 2023-10-27 美的集团(上海)有限公司 Device control method, control device, and storage medium
CN115616928A (en) * 2022-10-21 2023-01-17 广州视声智能股份有限公司 Control panel control method and device based on artificial intelligence

Similar Documents

Publication Publication Date Title
CN113918010A (en) Display apparatus and control method of display apparatus
CN111277884B (en) Video playing method and device
CN111741372A (en) Screen projection method for video call, display device and terminal device
CN112087671B (en) Display method and display equipment for control prompt information of input method control
CN113784200A (en) Communication terminal, display device and screen projection connection method
CN111970549A (en) Menu display method and display device
CN112565862A (en) Display equipment and equipment parameter memorizing method and restoring method thereof
CN112203154A (en) Display device
CN113453057B (en) Display device and playing progress control method
CN111866498B (en) Camera abnormity processing method and display device
CN112040535B (en) Wifi processing method and display device
CN112817557A (en) Volume adjusting method based on multi-person gesture recognition and display device
CN113630653B (en) Display device and sound mode setting method
CN111901649B (en) Video playing method and display equipment
CN112118476B (en) Method for rapidly displaying program reservation icon and display equipment
CN113810747B (en) Display equipment and signal source setting interface interaction method
CN111918056B (en) Camera state detection method and display device
CN113794919A (en) Display equipment and setting method of sound production equipment
CN114302197A (en) Voice separation control method and display device
CN114079827A (en) Menu display method and display device
CN113782021B (en) Display equipment and prompt tone playing method
CN113194355B (en) Video playing method and display equipment
CN114040341B (en) Bluetooth broadcast packet reporting processing method and display device
CN114071056B (en) Video data display method and display device
CN117806455A (en) Display equipment and gesture control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination