WO2021031809A1 - Interface display method and display device
Interface display method and display device
- Publication number: WO2021031809A1 (PCT application PCT/CN2020/105277)
- Authority: WIPO (PCT)
- Prior art keywords: video, interface, display, demonstration, local image
- Prior art date
Classifications
- All classifications fall under H—ELECTRICITY › H04N—Pictorial communication, e.g. television › H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD] › H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]:
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42204—User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4781—Games
- H04N21/4788—Supplemental services communicating with other users, e.g. chatting
Definitions
- the embodiments of the present application relate to display technology, and more specifically to an interface display method and a display device.
- because the display device can present audio, video, pictures, and other playback content, it has received widespread attention from users. Users can accomplish various tasks with the help of display devices; for example, some users play a coach's instructional video through a display device and follow the instructional video to complete the corresponding actions.
- users' demands on the functions of display devices are increasing day by day. For example, while watching the coach's teaching video frames, the user also wants to see video frames of his own actions, so that by comparing the two he can find the shortcomings of his own actions and correct them in time.
- this application provides an interface display method and display device.
- an embodiment of the present application provides a display device, including:
- a display screen for displaying a first display interface and a second display interface, where the first display interface includes a playback control for controlling playback of a demonstration video, and the second display interface includes a first play window for playing the demonstration video and a second play window for playing the local image collected by the camera;
- a camera for collecting local images;
- a controller configured to:
- display the second interface, play the demonstration video in the first play window, and play the local image in the second play window.
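As a rough illustration of the controller behavior just described, the following Python sketch shows one way operating the playback control could switch interfaces and route the two video sources; all class and method names here (InterfaceController, show_second_interface, etc.) are hypothetical, not taken from the patent.

```python
class InterfaceController:
    """Hypothetical sketch of the claimed controller behavior."""

    def __init__(self, display, camera):
        self.display = display  # display screen rendering the two interfaces
        self.camera = camera    # camera collecting the local image

    def on_play_control_operated(self, demo_video):
        # Switch from the first interface to the two-window second interface.
        second = self.display.show_second_interface()
        # First play window: the demonstration video.
        second.first_window.play(demo_video)
        # Second play window: the local image collected by the camera.
        self.camera.start()
        second.second_window.play(self.camera.stream())
```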
- another embodiment of the present application provides a display device, including:
- a display screen for displaying a first display interface and a second display interface, where the first display interface includes a playback control for controlling playback of a demonstration video, and the second display interface includes a first play window for playing the demonstration video and a second play window for playing the local image collected by the camera;
- a camera for collecting local video;
- a controller configured to:
- buffer the current video frame or a neighboring video frame, together with the corresponding demonstration video frame, whenever the annotated degree of difference of the human body movement is lower than a difference threshold, for display on the exercise evaluation interface.
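A minimal sketch of the buffering rule above, assuming an `action_difference` callable that scores the human-body movement difference between a local frame and a demonstration frame (the scale and threshold values are invented for illustration):

```python
DIFFERENCE_THRESHOLD = 0.3  # assumed scale: 0 = identical pose, 1 = maximal difference

evaluation_buffer = []  # frame pairs cached for the exercise evaluation interface

def on_frame(local_frame, demo_frame, action_difference):
    """`action_difference` is an assumed pose-comparison helper returning a float."""
    diff = action_difference(local_frame, demo_frame)
    if diff < DIFFERENCE_THRESHOLD:
        # Keep the well-matched, annotated pair for later display.
        evaluation_buffer.append((local_frame, demo_frame, diff))
```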
- another embodiment of the present application provides a display device, including:
- a display screen for displaying a first display interface and a second display interface, where the first display interface includes a playback control for controlling playback of a demonstration video, and the second display interface includes a first play window for playing the demonstration video and a second play window for playing the local image collected by the camera;
- a camera for collecting local images;
- a controller configured to:
- display the second interface, play the demonstration video in the first play window, and play the local image with joint points marked in the second play window, where a first joint point is marked in a first color, a second joint point is marked in a second color, and the degree of movement difference between the body part where the first joint point is located and the corresponding body part in the demonstration video is greater than the degree of movement difference between the body part where the second joint point is located and the corresponding body part in the demonstration video.
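The two-color joint marking could look like the following sketch. The patent only requires that worse-matching joints get the first color; the fixed split threshold and the `draw_marker` primitive below are assumptions for illustration.

```python
FIRST_COLOR = (255, 0, 0)   # e.g. red for the larger movement difference
SECOND_COLOR = (0, 255, 0)  # e.g. green for the smaller movement difference

def mark_joints(local_frame, joint_differences, split_threshold=0.5):
    """joint_differences maps joint name -> movement difference vs. the demo video."""
    for joint, diff in joint_differences.items():
        color = FIRST_COLOR if diff > split_threshold else SECOND_COLOR
        local_frame.draw_marker(joint, color)  # assumed drawing primitive
    return local_frame
```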
- an embodiment of the present application further provides an interface display method, including:
- displaying the second interface, playing the demonstration video in the first play window in the second interface, and playing the local image in the second play window in the second interface.
- another embodiment of the present application provides an interface display method, including:
- displaying an exercise evaluation interface, where the exercise evaluation interface displays the annotated current video frame or neighboring video frame having a lower degree of human body action difference, together with the demonstration video frame.
- another embodiment of the present application provides an interface display method, including:
- displaying the second interface, playing the demonstration video in the first play window in the second interface, and playing, in the second play window in the second interface, the local image with joint points marked, where a first joint point is marked in a first color, a second joint point is marked in a second color, and the motion difference between the body part where the first joint point is located and the corresponding body part in the demonstration video is greater than the motion difference between the body part where the second joint point is located and the corresponding body part in the demonstration video.
- Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment
- FIG. 2 exemplarily shows a block diagram of the hardware configuration of the control device 100 according to the embodiment
- FIG. 3 exemplarily shows a block diagram of the hardware configuration of the display device 200 according to the embodiment
- FIG. 4 exemplarily shows a block diagram of the hardware architecture of the display device 200 according to FIG. 3;
- FIG. 5 exemplarily shows a schematic diagram of the functional configuration of the display device 200 according to the embodiment
- Fig. 6a exemplarily shows a schematic diagram of software configuration in the display device 200 according to the embodiment
- FIG. 6b exemplarily shows a configuration diagram of an application program in the display device 200 according to the embodiment
- FIG. 7 exemplarily shows a schematic diagram of the user interface in the display device 200 according to the embodiment.
- Fig. 8-1 is a schematic diagram showing a first interface according to some embodiments.
- Fig. 8-2 is a schematic diagram showing a first interface according to some embodiments.
- Figure 9-1 is a schematic diagram showing a prompt interface according to some embodiments.
- Figure 9-2 is a schematic diagram showing a prompt interface according to some embodiments.
- FIG. 10 is a schematic diagram showing a second display interface according to some embodiments.
- FIG. 11 is a local image with 13 joint positions annotated according to some embodiments.
- Figure 12 shows a local image with joint annotations according to some embodiments.
- FIG. 13 is a color-labeled local image according to some embodiments.
- Figure 14-1 is an exercise evaluation interface according to some embodiments.
- Figure 14-2 is a schematic diagram showing a second display interface according to some embodiments.
- various external device interfaces are usually provided on the display device to facilitate the connection of different peripheral devices or cables to realize corresponding functions.
- when a high-definition camera is connected to an interface of the display device, if the hardware system of the display device lacks a hardware interface capable of receiving the high-pixel camera's source data, the data collected by the camera cannot be presented on the screen of the display device.
- the hardware system of a traditional display device supports only one hard decoding resource, and usually supports at most 4K-resolution video decoding. Therefore, to enable video chat while watching Internet TV without reducing the definition of the network video picture, the hard decoding resource (usually the GPU in the hardware system) is used to decode the network video, while the video chat picture is soft-decoded by a general-purpose processor (such as the CPU).
- this application discloses a dual hardware system architecture to support multiple channels of video chat data (including at least one channel of local video).
- the term "module" used in some embodiments of this application can refer to any known or later-developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code capable of executing the function related to the component.
- the term "remote control" used in some embodiments of this application refers to a component of an electronic device (such as the display device disclosed in this application) that can generally control the electronic device wirelessly within a short distance.
- This component can generally use infrared and/or radio frequency (RF) signals and/or Bluetooth to connect to electronic devices, and can also include functional modules such as WiFi, wireless USB, Bluetooth, and motion sensors.
- a handheld touch remote control uses a user interface in a touch screen to replace most of the physical built-in hard keys in general remote control devices.
- gesture used in some embodiments of the present application refers to a user's behavior that expresses expected ideas, actions, goals, and/or results through an action such as a change of hand shape or hand movement.
- the term "hardware system" used in some embodiments of this application may refer to a physical component, composed of mechanical, optical, electrical, and magnetic devices such as integrated circuits (IC) and printed circuit boards (PCB), that has computing, control, storage, input, and output functions.
- the hardware system is also usually referred to as a motherboard or a chip.
- the term "operating system" used in some embodiments of the present application may refer to the computer system presented to the user after the processor reads the code in the memory, such as "Android OS", "Mac OS", "Windows OS", and so on.
- Fig. 1 exemplarily shows a schematic diagram of an operation scenario between a display device and a control device according to an embodiment. As shown in FIG. 1, the user can operate the display device 200 by controlling the device 100.
- the control device 100 may be a remote controller 100A, which can communicate with the display device 200 through infrared protocol communication, Bluetooth protocol communication, ZigBee protocol communication, or other short-distance communication methods to control the display device 200 wirelessly; alternatively, the display device 200 may be controlled in a wired manner.
- the user can control the display device 200 by inputting user instructions through keys on the remote control, voice input, control panel input, etc.
- the user can control the functions of the display device 200 by inputting corresponding control commands through the volume up/down keys, channel control keys, up/down/left/right movement keys, voice input keys, menu keys, and power button on the remote control.
- the control device 100 may also be a smart device, such as a mobile terminal 100B, a tablet computer, a computer, or a notebook computer, which may communicate with the display device 200 through a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), or other networks, and control the display device 200 through an application program corresponding to the display device 200.
- both the mobile terminal 100B and the display device 200 can be installed with software applications, so that the connection and communication between the two can be realized through a network communication protocol, thereby realizing one-to-one control operation and data communication.
- the mobile terminal 100B can establish a control command protocol with the display device 200, synchronize the remote control keyboard to the mobile terminal 100B, and control the display device 200 through the user interface of the mobile terminal 100B; alternatively, the audio and video content displayed on the mobile terminal 100B can be transmitted to the display device 200 to realize a synchronous display function.
- the display device 200 can also communicate with the server 300 through multiple communication methods.
- the display device 200 may be allowed to communicate with the server 300 via a local area network, a wireless local area network, or other networks.
- the server 300 may provide various contents and interactions to the display device 200.
- the display device 200 transmits and receives information, interacts with an Electronic Program Guide (EPG), receives software program updates, or accesses a remotely stored digital media library.
- the server 300 may be a group or multiple groups, and may be one or more types of servers.
- the server 300 provides other network service content such as video on demand and advertising services.
- the display device 200 may be a liquid crystal display, an OLED (Organic Light Emitting Diode) display, or a projection display device; on the other hand, the display device may be a smart TV or a display system composed of a display and a set-top box.
- the display device 200 may make some changes in performance and configuration as required.
- the display device 200 may additionally provide a smart network TV function that provides a computer support function. Examples include Internet TV, Smart TV, Internet Protocol TV (IPTV) and so on. In some embodiments, the display device may not have a broadcast receiving function.
- the display device may be connected or provided with a camera, which is used to present the picture captured by the camera on the display interface of the display device or other display devices to realize interactive chats between users.
- the picture captured by the camera can be displayed on the display device in full screen, half screen, or in any optional area.
- the camera is connected to the monitor rear shell through a connecting plate, and is fixedly installed on the upper middle of the monitor rear shell.
- as an installable method, the camera can be fixedly installed at any position of the monitor rear shell through a connecting plate; it is sufficient that its image capture area is not blocked by the rear shell, for example, the image capture area may have the same orientation as the display device.
- the camera can be connected to the display rear shell through a connecting plate or other conceivable connectors.
- the connector may be equipped with a lifting motor; when the user wants to use the camera or an application needs to use the camera, the camera can be raised above the display, and when the camera is not needed, it can be retracted behind the rear shell to protect it from damage.
- the camera used in this application may have 16 million pixels, to achieve ultra-high-definition display; in actual use, a camera with more or fewer than 16 million pixels can also be used.
- the content displayed in different application scenarios of the display device can be merged in many different ways, so as to achieve functions that cannot be achieved by traditional display devices.
- the user can video chat with at least one other user while watching a video program.
- the presentation of the video program can be used as the background picture, and the video chat window is displayed on the background picture; figuratively, this function can be called "watch and chat".
- At least one video chat is performed across terminals.
- the user can video chat with at least one other user while using an education application for learning; for example, students can interact remotely with teachers while learning content in the educational application. Figuratively, this function can be called "learning and chatting".
- a video chat can also be conducted with players entering a game; for example, when a player enters a game application to participate in a game, he can interact remotely with other players. Figuratively, this function can be called "watch and play".
- the game scene and the video image are merged, and the portrait in the video image is cut out and displayed on the game image, thereby improving user experience.
- in somatosensory games (such as ball games, boxing games, running games, dancing games, etc.), human body postures and movements are acquired through the camera by limb detection and tracking and by detecting key point data of the human skeleton, and are then fused with animations in the game to realize scenes such as sports and dance.
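A hedged sketch of that camera-based motion pipeline: grab a frame, run a pose estimator to get skeleton key points, and hand them to the game logic. `pose_estimator` stands in for whatever limb-detection model the device actually uses.

```python
def game_motion_loop(camera, pose_estimator, game):
    # Repeatedly feed detected skeleton key points into the game.
    while game.running:
        frame = camera.read()                     # local image from the camera
        keypoints = pose_estimator.detect(frame)  # skeleton key-point data
        game.update_player_pose(keypoints)        # drive the in-game animation
```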
- the user can interact in video and voice with at least one other user in a karaoke application; figuratively, this function can be called "watch and sing". Preferably, when at least one other user enters the application in the chat scene, multiple users can jointly complete the recording of a song.
- the user can turn on the camera locally to take pictures and record video; figuratively, this function can be called "mirror".
- more functions may be added or the above functions may be reduced. This application does not specifically limit the function of the display device.
- Fig. 2 exemplarily shows a configuration block diagram of the control device 100 according to an exemplary embodiment.
- the control device 100 includes a controller 110, a communicator 130, a user input/output interface 140, a memory 190, and a power supply 180.
- the control device 100 is configured to control the display device 200: it can receive the user's input operation instructions and convert them into instructions that the display device 200 can recognize and respond to, serving as an interactive intermediary between the user and the display device 200.
- for example, the user operates the channel up/down keys on the control device 100, and the display device 200 responds to the channel up/down operations.
- control device 100 may be a smart device.
- control device 100 can install various applications for controlling the display device 200 according to user requirements.
- the mobile terminal 100B or other smart electronic devices can perform similar functions to the control device 100 after installing an application for controlling the display device 200.
- by installing applications, the user can use various function keys or virtual buttons of a graphical user interface provided on the mobile terminal 100B or other smart electronic devices to realize the functions of the physical keys of the control device 100.
- the controller 110 includes a processor 112, RAM 113 and ROM 114, a communication interface, and a communication bus.
- the controller 110 is used to control the operation and operation of the control device 100, as well as the communication and cooperation between internal components, and external and internal data processing functions.
- the communicator 130 realizes communication of control signals and data signals with the display device 200 under the control of the controller 110. For example, the received user input signal is sent to the display device 200.
- the communicator 130 may include at least one of communication modules such as a WIFI module 131, a Bluetooth module 132, and an NFC module 133.
- in the user input/output interface 140, the input interface includes at least one of a microphone 141, a touch panel 142, a sensor 143, a button 144, and other input interfaces.
- the user can implement the user instruction input function through voice, touch, gesture, pressing and other actions.
- the input interface converts the received analog signal into a digital signal and the digital signal into a corresponding instruction signal, which is sent to the display device 200.
- the output interface includes an interface for sending the received user instruction to the display device 200.
- it may be an infrared interface or a radio frequency interface.
- in the case of an infrared interface, the user input instruction needs to be converted into an infrared control signal according to the infrared control protocol and sent to the display device 200 via the infrared sending module.
- in the case of a radio frequency signal interface, the user input instruction needs to be converted into a digital signal, modulated according to the radio frequency control signal modulation protocol, and then sent to the display device 200 by the radio frequency transmitting terminal.
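The two signal paths can be summarized in a small sketch: the same user key press is either encoded as an infrared control code per the IR protocol, or digitized and modulated onto a radio-frequency carrier. The encoder and modulator calls are assumptions, not a real remote-control API.

```python
def send_user_instruction(key_code, channel):
    if channel.kind == "infrared":
        ir_signal = channel.encode_ir(key_code)  # per the infrared control protocol
        channel.transmit(ir_signal)
    elif channel.kind == "rf":
        digital = channel.to_digital(key_code)   # convert the instruction to digital form
        rf_signal = channel.modulate(digital)    # per the RF control signal modulation protocol
        channel.transmit(rf_signal)
```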
- control device 100 includes at least one of a communicator 130 and an output interface.
- the control device 100 is configured with a communicator 130, such as: WIFI, Bluetooth, NFC and other modules, which can encode user input instructions through the WIFI protocol, or Bluetooth protocol, or NFC protocol, and send to the display device 200.
- the memory 190 is used to store various operating programs, data and applications for driving and controlling the control device 100 under the control of the controller 110.
- the memory 190 can store various control signal instructions input by the user.
- the power supply 180 is used to provide operating power support for each element of the control device 100 under the control of the controller 110, and may include a battery and a related control circuit.
- FIG. 3 exemplarily shows a hardware configuration block diagram of a hardware system in the display device 200 according to an exemplary embodiment.
- the relationship between the hardware systems can be as shown in Figure 3.
- one hardware system in the dual hardware system architecture is referred to as the first hardware system, A system, or A chip, and the other hardware system is referred to as the second hardware system, N system, or N chip.
- the A chip includes the controller and various interfaces of the A chip
- the N chip includes the controller and various interfaces of the N chip.
- a relatively independent operating system can be installed in the A chip and the N chip.
- the operating system of the A chip and the operating system of the N chip can communicate with each other through a communication protocol; for example, the framework layer of the A chip's operating system and the framework layer of the N chip's operating system can communicate to transmit commands and data, so that there are two independent but interrelated subsystems in the display device 200.
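A minimal sketch, assuming a serialized message channel, of how one framework layer might forward a command to the other over a shared physical interface (UART, USB, etc.); the JSON message format is invented for illustration.

```python
import json

def send_framework_command(link, command, payload):
    # `link` wraps one of the physical A<->N interfaces (e.g. UART).
    message = json.dumps({"cmd": command, "data": payload})
    link.write(message.encode("utf-8"))

def receive_framework_command(link):
    # Decode a command forwarded by the other chip's framework layer.
    message = json.loads(link.read().decode("utf-8"))
    return message["cmd"], message["data"]
```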
- the A chip and the N chip can realize connection, communication and power supply through multiple different types of interfaces.
- the interface type of the interface between the A chip and the N chip may include at least one of general-purpose input/output (GPIO), USB interface, HDMI interface, UART interface, and the like.
- One or more of these interfaces can be used between the A chip and the N chip for communication or power transmission.
- the N chip can be powered by an external power source
- the A chip can be powered by the N chip instead of the external power source.
- the A chip may also include interfaces for connecting other devices or components, such as the MIPI interface for connecting to a camera (Camera) shown in FIG. 3, a Bluetooth interface, etc.
- the N chip can also include a VBY interface for connecting to the display screen TCON (Timer Control Register), an I2S interface for connecting a power amplifier (AMP) and a speaker, and at least one of an IR/Key interface, a USB interface, a WiFi interface, a Bluetooth interface, an HDMI interface, a Tuner interface, and the like.
- FIG. 4 is only an exemplary description of the dual hardware system architecture of the present application, and does not represent a limitation to the present application. In practical applications, both hardware systems can contain more or less hardware or interfaces as required.
- FIG. 4 exemplarily shows a hardware architecture block diagram of the display device 200 according to FIG. 3.
- the hardware system of the display device 200 may include an A chip and an N chip, and modules connected to the A chip or the N chip through various interfaces.
- the N chip may include at least one of: a tuner and demodulator 220, a communicator 230, an external device interface 250, a controller 210, a memory 290, a user input interface, a video processor 260-1, an audio processor 260-2, a display 280, an audio output interface 270, and a power supply. In other embodiments, the N chip may also include more or fewer modules.
- the tuner and demodulator 220 is used to perform modulation and demodulation processing such as amplifying, mixing, and resonating broadcast television signals received in a wired or wireless manner, thereby demodulating, from among multiple wireless or cable broadcast television signals, the audio and video signals carried in the frequency of the TV channel selected by the user, as well as additional information (such as EPG data signals).
- the signal path of the tuner and demodulator 220 can be of many kinds, such as terrestrial broadcasting, cable broadcasting, satellite broadcasting, or Internet broadcasting; according to the modulation type, the signal modulation method can be digital modulation or analog modulation; and according to the type of received television signal, the tuner and demodulator 220 may demodulate analog signals and/or digital signals.
- the tuner and demodulator 220 is also used to respond to the TV channel frequency selected by the user, and to the TV signal carried by that frequency, according to the user's selection and under the control of the controller 210.
- the tuner demodulator 220 may also be in an external device, such as an external set-top box.
- the set-top box outputs TV audio and video signals through modulation and demodulation, and inputs them to the display device 200 through the external device interface 250.
- the communicator 230 is a component for communicating with external devices or external servers according to various communication protocol types.
- the communicator 230 may include a WIFI module 231, a Bluetooth module 232, a wired Ethernet module 233, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules.
- the display device 200 may establish a control signal and a data signal connection with an external control device or content providing device through the communicator 230.
- the communicator may receive the control signal of the remote controller 100 according to the control of the controller.
- the external device interface 250 is a component that provides data transmission between the N chip controller 210 and the A chip and other external devices.
- the external device interface can be connected to external devices such as set-top boxes, game devices, and notebook computers in a wired/wireless manner, and can receive data from the external devices, such as video signals (e.g. moving images), audio signals (e.g. music), and additional information (e.g. EPG data).
- the external device interface 250 may include any one or more of: a high-definition multimedia interface (HDMI) terminal 251, a composite video blanking synchronization (CVBS) terminal 252, an analog or digital component terminal 253, a universal serial bus (USB) terminal 254, and a red, green, and blue (RGB) terminal (not shown in the figure).
- the controller 210 controls the work of the display device 200 and responds to user operations by running various software control programs (such as an operating system and/or various application programs) stored on the memory 290.
- the controller 210 includes a random access memory RAM 213, a read-only memory ROM 214, a graphics processor 216, a CPU processor 212, a communication interface 218, and a communication bus.
- the RAM 213, the ROM 214, the graphics processor 216, the CPU processor 212, and the communication interface 218 are connected by a bus.
- the graphics processor 216 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive commands input by the user and displays various objects according to their display attributes, and a renderer, which renders the various objects produced by the arithmetic unit and displays the rendering result on the display 280.
- the CPU processor 212 is configured to execute the operating system and application program instructions stored in the memory 290, and, according to the various interactive instructions received from external input, to execute various applications and process data and content, so as to finally display and play various audio and video content.
- the CPU processor 212 may include multiple processors.
- the multiple processors may include one main processor and multiple or one sub-processors.
- the main processor is used to perform some operations of the display device 200 in the pre-power-on mode, and/or to display images in the normal mode.
- the communication interface may include the first interface 218-1 to the nth interface 218-n. These interfaces may be network interfaces connected to external devices via a network.
- the controller 210 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 210 may perform an operation related to the object selected by the user command.
- the object may be any one of the selectable objects, such as a hyperlink or an icon.
- operations related to the selected object include, for example, displaying the operation of connecting to a hyperlinked page, document, or image, or executing the operation corresponding to the icon.
- the user command for selecting the UI object may be a command input through various input devices (for example, a mouse, a keyboard, a touch pad, etc.) connected to the display device 200 or a voice command corresponding to the voice spoken by the user.
- the memory 290 stores various software modules for driving and controlling the display device 200.
- various software modules stored in the memory 290 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
- the basic module is the underlying software module used for signal communication between various hardware in the display device 200 and sending processing and control signals to the upper module.
- the detection module is a management module used to collect various information from various sensors or user input interfaces, and perform digital-to-analog conversion and analysis management.
- the voice recognition module includes a voice analysis module and a voice command database module.
- the display control module is a module for controlling the display 280 to display image content, and can be used to play information such as multimedia image content and UI interfaces.
- the communication module is a module used for control and data communication with external devices.
- the browser module is a module used to perform data communication with browsing servers.
- the service module is a module used to provide various services and various applications.
- the memory 290 is also used to store and receive external data and user data, images of various items in various user interfaces, and visual effect diagrams of focus objects.
- the user input interface is used to send the user's input signal to the controller 210, or to transmit the signal output from the controller to the user.
- the control device (such as a mobile terminal or a remote control) may send input signals input by the user, such as a power switch signal, a channel selection signal, and a volume adjustment signal, to the user input interface, and then the user input interface forwards the input signal to the controller;
- the control device may receive output signals such as audio, video, or data output from the user input interface processed by the controller, and display the received output signal or output the received output signal as audio or vibration.
- the user may input a user command through a graphical user interface (GUI) displayed on the display 280, and the user input interface receives the user input command through the graphical user interface (GUI).
- the user can input a user command by inputting a specific sound or gesture, and the user input interface recognizes the sound or gesture through the sensor to receive the user input command.
- the video processor 260-1 is used to receive video signals, and perform video data processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, and image synthesis according to the standard codec protocol of the input signal.
- a video signal that can be displayed or played directly on the display 280 is thereby obtained.
- the video processor 260-1 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
- the demultiplexing module is used to demultiplex the input audio and video data stream. For example, if MPEG-2 is input, the demultiplexing module will demultiplex into a video signal and an audio signal.
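Conceptually, demultiplexing just routes each packet of the combined stream to its elementary stream, as in this sketch (the packet layout is invented; real MPEG-2 transport streams use PID-based binary packets):

```python
def demultiplex(packets):
    # Split a combined audio/video stream into its elementary streams.
    video_stream, audio_stream = [], []
    for pkt in packets:
        if pkt["type"] == "video":
            video_stream.append(pkt["payload"])
        elif pkt["type"] == "audio":
            audio_stream.append(pkt["payload"])
    return video_stream, audio_stream
```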
- the video decoding module is used to process the demultiplexed video signal, including decoding and scaling.
- the image synthesis module, such as an image synthesizer, is used to superimpose and mix the GUI signal generated by the graphics generator (according to user input or by the system itself) with the scaled video image, to generate an image signal for display.
- the frame rate conversion module is used to convert the frame rate of the input video, for example converting input video at 24 Hz, 25 Hz, 30 Hz, or 60 Hz to a frame rate of 60 Hz, 120 Hz, or 240 Hz, where the input frame rate can be related to the source video stream and the output frame rate can be related to the refresh rate of the display. The conversion is usually implemented in a common format, such as frame insertion.
- the display formatting module is used to convert the signal output by the frame rate conversion module into a signal conforming to the display format of a device such as the display, for example performing format conversion on the signal output by the frame rate conversion module to output an RGB data signal.
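As a worked example of frame insertion, converting 24 Hz input to 60 Hz output can repeat source frames in a 3:2 cadence; real converters may interpolate new frames instead, so this sketch only illustrates the rate arithmetic.

```python
def repeat_frames_24_to_60(frames_24fps):
    out = []
    for i, frame in enumerate(frames_24fps):
        # Alternate 3 and 2 repetitions: 24 * (3 + 2) / 2 = 60 frames out per second.
        out.extend([frame] * (3 if i % 2 == 0 else 2))
    return out
```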
- the video processor and the graphics processor can be integrated in the chip in a unified manner.
- the functional modules of the graphics processor and the video processor can be configured as required.
- the display 280 is used to receive the image signal input from the video processor 260-1, display video content and images, and a menu control interface.
- the display 280 includes a display component for presenting a picture and a driving component for driving image display.
- the displayed video content can be from the video in the broadcast signal received by the tuner and demodulator 220, or from the video content input by the communicator or the external device interface.
- the display 280 simultaneously displays a user manipulation interface (UI) generated in the display device 200 and used to control the display device 200.
- depending on the type of the display 280, a driving component for driving the display is also included.
- if the display 280 is a projection display, it may also include a projection device and a projection screen.
- the audio processor 260-2 is used to receive audio signals and, according to the standard codec protocol of the input signal, perform decompression and decoding, as well as audio data processing such as noise reduction, digital-to-analog conversion, and amplification, to obtain an audio signal that can be played in the speaker 272.
- the audio output interface 270 is used to receive the audio signal output by the audio processor 260-2 under the control of the controller 210.
- the audio output interface may include a speaker 272, or an external audio output terminal 274 that outputs to a sound-generating device of an external device, such as an external audio terminal or a headphone output terminal.
- the video processor 260-1 may include one or more chips.
- the audio processor 260-2 may also include one or more chips.
- the video processor 260-1 and the audio processor 260-2 may be separate chips, or they may be integrated with the controller 210 in one or more chips.
- the power supply is used to provide power supply support for the display device 200 with power input from an external power supply under the control of the controller 210.
- the power supply may include a built-in power supply circuit installed inside the display device 200, or may be a power supply installed outside the display device 200, such as a power interface that provides an external power supply in the display device 200.
- the A chip may include a controller 310, a communicator 330, a detector 340, and a memory 390. In some embodiments, it may also include a user input interface, a video processor, an audio processor, a display, and an audio output interface. In some embodiments, there may also be a power supply that independently powers the A chip.
- the communicator 330 is a component for communicating with external devices or external servers according to various communication protocol types.
- the communicator 330 may include a WIFI module 331, a Bluetooth communication protocol module 332, a wired Ethernet communication protocol module 333, and an infrared communication protocol module and other network communication protocol modules or near field communication protocol modules.
- the communicator 330 of the A chip and the communicator 230 of the N chip also interact with each other.
- the WiFi module 231 in the N chip hardware system is used to connect to an external network and generate network communication with an external server and the like.
- the WiFi module 331 in the A chip is used to connect to the WiFi module 231 in the N chip hardware system, without connecting directly to an external network or the like; the A chip connects to the external network through the N chip. Therefore, for the user, a display device as in the above embodiment externally presents a single WiFi account.
- the detector 340 is a component used by the A chip of the display device to collect signals from the external environment or interact with the outside.
- the detector 340 may include a light receiver 342, a sensor used to collect the intensity of ambient light, so that display parameters can be changed adaptively according to the collected ambient light; it may also include an image collector 341, such as a camera or webcam, which can be used to collect external environment scenes and to collect user attributes or gestures for interacting with the user, so as to adaptively change display parameters and to recognize user gestures for interaction with the user.
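For the light receiver, adaptively changing display parameters can be as simple as mapping measured ambient illuminance to a backlight level, as in this sketch (units and thresholds are invented for illustration):

```python
def brightness_for_ambient(lux):
    # Map ambient light intensity (lux) to a backlight brightness percentage.
    if lux < 10:    # dark room
        return 30
    if lux < 200:   # normal indoor lighting
        return 60
    return 100      # bright environment
```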
- the external device interface provides components for data transmission between the controller 310 and the N chip or other external devices.
- the external device interface can be connected to external devices such as set-top boxes, game devices, notebook computers, etc., in a wired/wireless manner.
- the controller 310 controls the work of the display device 200 and responds to user operations by running various software control programs (such as installed third-party applications, etc.) stored on the memory 390 and interacting with the N chip.
- the controller 310 includes a read-only memory ROM 313, a random access memory RAM 314, a graphics processor 316, a CPU processor 312, a communication interface 318, and a communication bus.
- the ROM 313 and the RAM 314, the graphics processor 316, the CPU processor 312, and the communication interface 318 are connected by a bus.
- the CPU processor 312 runs the system startup instruction in the ROM, and copies the temporary data of the operating system stored in the memory 390 to the RAM 314 to run or start the operating system. After the operating system is started, the CPU processor 312 copies the temporary data of the various application programs in the memory 390 to the RAM 314, and then runs or starts the various application programs.
- the CPU processor 312 is used to execute the operating system and application instructions stored in the memory 390, to communicate with the N chip and exchange signals, data, and instructions with it, and to execute various applications and process data and content according to the various interactive instructions received from external input, so as to finally display and play various audio and video content.
- the communication interface may include the first interface 318-1 to the nth interface 318-n. These interfaces may be network interfaces connected to external devices via a network, or network interfaces connected to the N chip via a network.
- the controller 310 may control the overall operation of the display device 200. For example, in response to receiving a user command for selecting a UI object to be displayed on the display 280, the controller 310 may perform an operation related to the object selected by the user command.
- the graphics processor 316 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It includes an arithmetic unit, which performs operations on the various interactive commands input by the user and displays various objects according to their display attributes, and a renderer, which renders the various objects produced by the arithmetic unit and displays the rendering result on the display 280.
- both the graphics processor 316 of the A chip and the graphics processor 216 of the N chip can generate various graphics objects. The difference is that, if application 1 is installed on the A chip and application 2 is installed on the N chip, when the user is in the interface of application 1 and inputs instructions within application 1, the graphics processor 316 of the A chip generates the graphics object; when the user is in the interface of application 2 and inputs instructions within application 2, the graphics processor 216 of the N chip generates the graphics object.
- Fig. 5 exemplarily shows a schematic diagram of a functional configuration of a display device according to an exemplary embodiment.
- the memory 390 of the A chip and the memory 290 of the N chip are respectively used to store the operating system, application programs, contents, user data, and the like, and, under the control of the controller 310 of the A chip and the controller 210 of the N chip, to perform the system operations that drive the display device 200 and respond to the various operations of the user.
- the memory 390 of the A chip and the memory 290 of the N chip may include volatile and/or nonvolatile memory.
- the memory 290 is specifically used to store the operating program that drives the controller 210 in the display device 200, the various application programs built into the display device 200, the various application programs downloaded by the user from external devices, and data related to the application programs.
- the memory 290 is used to store system software such as an operating system (OS) kernel, middleware, and applications, and to store input video data and audio data, and other user data.
- the memory 290 is specifically used to store driver programs and related data for the video processor 260-1, the audio processor 260-2, the display 280, the communication interface 230, the tuner and demodulator 220, the input/output interfaces, and the like.
- the memory 290 may store software and/or programs.
- the software programs used to represent an operating system (OS) include, for example, a kernel, middleware, application programming interface (API), and/or application programs.
- the kernel may control or manage system resources and the functions implemented by other programs (such as the middleware, APIs, or application programs); the kernel may also provide interfaces to allow the middleware, APIs, or applications to access the controller, so as to control or manage system resources.
- the memory 290 includes a broadcast receiving module 2901, a channel control module 2902, a volume control module 2903, an image control module 2904, a display control module 2905, an audio control module 2906, an external command recognition module 2907, a communication control module 2908, a light receiving module 2909, a power control module 2910, an operating system 2911, other application programs 2912, a browser module, and so on.
- the controller 210 executes various software programs in the memory 290 to implement functions such as: broadcast and television signal reception and demodulation, TV channel selection control, volume selection control, image control, display control, audio control, external command recognition, communication control, optical signal receiving, and power control, a software control platform supporting these functions, and browser functions.
- the memory 390 stores various software modules for driving and controlling the display device 200.
- various software modules stored in the memory 390 include: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules. Since the functions of the memory 390 and the memory 290 are relatively similar, please refer to the memory 290 for related parts, and will not be repeated here.
- the memory 390 includes an image control module 3904, an audio control module 3906, an external command recognition module 3907, a communication control module 3908, a light receiving module 3909, an operating system 3911, other application programs 3912, a browser module, and so on.
- the controller 310 executes various software programs in the memory 390 to implement functions such as: image control, display control, audio control, external command recognition, communication control, light signal receiving, and power control, a software control platform supporting these functions, and browser functions.
- the external command recognition module 2907 of the N chip and the external command recognition module 3907 of the A chip can recognize different commands.
- the external command recognition module 3907 of the A chip may include a graphic recognition module 3907-1. The graphic recognition module 3907-1 stores a graphics database; when the camera receives an external graphics command, the command is matched against the instructions in the graphics database to control the display device.
- since the voice receiving device and the remote controller are connected to the N chip, the external command recognition module 2907 of the N chip may include a voice recognition module 2907-2. The voice recognition module 2907-2 stores a voice database; external voice commands received by the voice receiving device are matched against the commands in the voice database to control the display device.
- a control device 100 such as a remote controller is connected to the N chip, and the key command recognition module interacts with the control device 100.
- Fig. 6a exemplarily shows a configuration block diagram of the software system in the display device 200 according to an exemplary embodiment.
- the operating system 2911 includes operating software for processing various basic system services and for implementing hardware-related tasks.
- part of the operating system kernel may include a series of software to manage the hardware resources of the display device and provide services for other programs or software codes.
- part of the operating system kernel may include one or more device drivers, and the device drivers may be a set of software codes in the operating system to help operate or control devices or hardware associated with the display device.
- the driver may contain code to operate video, audio, and/or other multimedia components; examples include displays, cameras, Flash, WiFi, and audio drivers.
- the accessibility module 2911-1 is used to modify or access the application program, so as to realize the accessibility of the application program and the operability of its display content.
- the communication module 2911-2 is used to connect to other peripherals via related communication interfaces and communication networks.
- the user interface module 2911-3 is used to provide objects that display the user interface for access by various applications, and can realize user operability.
- the control application 2911-4 is used to control process management, including runtime applications.
- the event transmission system 2914 can be implemented in the operating system 2911 or in the application 2912. In some embodiments, it is implemented partly in the operating system 2911 and partly in the application program 2912, for monitoring various user input events, responding to the recognition results of various events or sub-events, and implementing one or more sets of pre-defined operation procedures.
- the event monitoring module 2914-1 is used to monitor input events or sub-events of the user input interface.
- the event recognition module 2914-2 is used to apply the definitions of the various events to the various user input interfaces, recognize the events or sub-events, and dispatch them to the handlers that execute the corresponding one or more groups of processing programs.
- the event or sub-event refers to input detected by one or more sensors in the display device 200 and input from an external control device (such as the control device 100). For example: various sub-events of voice input, gesture-recognition sub-events of gesture input, and sub-events of remote-control key command input. These take multiple forms, including but not limited to one or a combination of pressing the up/down/left/right keys, the confirm key, and other key presses, as well as operations of non-physical keys, such as move, press, and release.
- the interface layout management module 2913, which directly or indirectly receives the various user input events or sub-events monitored by the event transmission system 2914, is used to update the layout of the user interface, including but not limited to the position of each control or sub-control in the interface, and the size, position, level, and other layout-related properties of the containers, together with the associated execution operations.
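- As a minimal illustrative sketch (not the patent's actual implementation), the monitor/recognize/dispatch flow of the event transmission system described above can be modeled as a registry of handlers keyed by event type; the class and method names below are assumptions.

```python
class EventTransmissionSystem:
    """Toy model of the flow described above: input events are monitored,
    recognized by type, and dispatched to predefined handlers (compare the
    event monitoring module 2914-1 and event recognition module 2914-2)."""

    def __init__(self) -> None:
        self._handlers = {}  # event type -> list of handler callables

    def register(self, event_type: str, handler) -> None:
        """Predefine an operation procedure for one event type."""
        self._handlers.setdefault(event_type, []).append(handler)

    def dispatch(self, event_type: str, payload) -> None:
        """Recognize the event type and run its registered handlers."""
        for handler in self._handlers.get(event_type, []):
            handler(payload)

# usage sketch:
# system = EventTransmissionSystem()
# system.register("key_press", lambda key: print("pressed", key))
# system.dispatch("key_press", "up")
```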
- the application layer of the display device includes various applications that can be executed on the display device 200.
- the application layer 2912 of the N chip may include, but is not limited to, one or more applications, such as video-on-demand applications, application centers, and game applications.
- the application layer 3912 of the A chip may include, but is not limited to, one or more applications, such as a live TV application, a media center application, and so on. It should be noted that which application programs reside on the A chip and which on the N chip is determined by the operating system and other design choices; this application does not specifically limit or divide them.
- Live TV applications can provide live TV through different sources.
- a live TV application may use input from cable TV, wireless broadcasting, satellite services, or other types of live TV services to provide TV signals.
- the live TV application can display the video of the live TV signal on the display device 200.
- Video-on-demand applications can provide videos from different storage sources. Unlike live TV applications, VOD provides video playback from certain storage sources; for example, video on demand can come from a cloud-storage server or from local hard disk storage containing stored video programs.
- Media center applications can provide various multimedia content playback applications.
- the media center can provide services that are different from live TV or video on demand, and users can access various images or audio through the media center application.
- Application center can provide storage of various applications.
- the application program may be a game or some other program that is related to a computer system or other device but can be run on a display device.
- the application center can obtain these applications from different sources, store them in the local storage, and then run on the display device 200.
- the operating system includes a kernel, a database, a framework layer, and an application layer, and the above-mentioned applications are all applications in the application layer.
- FIG. 7 exemplarily shows a schematic diagram of a user interface in the display device 200 according to an exemplary embodiment.
- the user interface includes multiple view display areas, for example, a first view display area 201 and a play screen 202, where the play screen includes layout of one or more different items.
- the user interface also includes a selector indicating that the item is selected, and the position of the selector can be moved through user input to change the selection of different items.
- multiple view display areas can present display screens of different levels.
- the first view display area can present video chat item content
- the second view display area can present application layer item content (eg, webpage video, VOD display, application program screen, etc.).
- the presentation of different view display areas has priority differences, and view display areas with different priorities differ in display behavior. For example, the priority of the system layer is higher than that of the application layer: the screen display in the view display area of the system layer is not blocked, and when the size and position of the application-layer view display area are changed according to the user's choice, the size and position of the system-layer view display area are not affected.
- display screens of the same level can also be presented. In this case, the selector can switch between the first view display area and the second view display area, and when the size and position of the first view display area change, the size and position of the second view display area can change accordingly.
- both the A chip and the N chip can be independently installed with Android and various APPs, so that each chip can realize a certain function, and the A chip and the N chip can cooperate to realize a certain function.
- the controller of the display device may be located in a single physical chip, or may comprise two chips as in the above embodiments, with the controllers in the A chip and the N chip together serving as the control module.
- the first aspect of some embodiments of the present application shows a display device, including:
- a display screen, for displaying a first display interface and a second display interface, where the first display interface includes a playback control for controlling the playback of a demonstration video, and the second display interface includes a first play window for playing the demonstration video and a second play window for playing the local image collected by the camera;
- the first interface of the display device in the fitness environment is shown in Figure 8-1.
- Figure 8-1 is a schematic diagram of the first interface 200A, where the first interface 200A can display multiple demonstration videos in a scrolling manner, so that the user can select a target demonstration video among them.
- the display window is used to display the demonstration video selected by the user.
- the controller may obtain and load the corresponding demonstration video source from the server through an API (Application Programming Interface).
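- As one hedged sketch of this step (the patent does not specify the endpoint, fields, or transport), fetching the video source could look like the following; DEMO_API and the "source_url" field are assumptions.

```python
import json
from urllib import request

DEMO_API = "https://example.com/api/demo-videos"  # assumed endpoint

def load_demo_video_source(video_id: str) -> str:
    """Fetch metadata for the selected demonstration video and return
    a playable source URL (field name assumed)."""
    with request.urlopen(f"{DEMO_API}/{video_id}") as resp:
        meta = json.loads(resp.read().decode("utf-8"))
    return meta["source_url"]
```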
- Figure 8-2 is a schematic diagram of the first interface of a fitness video according to some embodiments.
- the first interface page of the fitness video may be called a detail interface, where the first interface can display multiple coaching videos in a scrolling manner so that the user can select a target demonstration video among them. For example: squat high leg lift, backward lunge step, four-point rear kick... The user selects a target demonstration video among the multiple coaching videos.
- the play window is used to play the default training video or the last training video in the play history.
- on the right side of the play window, at least one of an introduction display control, a "start training" control (i.e., the play control), and a "favorite" control is set.
- the interface also includes a training list control. In the training list control, display controls for multiple training videos are displayed.
- the demonstration video may also be obtained through a check code. Specifically, the demonstration video is downloaded and stored in advance, and a mapping relationship between the demonstration video and the check code is established.
- a check code is generated based on the demonstration video selected by the user.
- the controller may obtain, based on the check code, the demonstration video corresponding to that check code among the stored demonstration videos. Since the demonstration video is pre-stored, the controller can directly retrieve it after obtaining the check code. Retrieving the demonstration video in this way avoids stuttering caused by factors such as the network: because the video was downloaded in advance, playback smoothness is improved.
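- A minimal sketch of this pre-download-plus-lookup idea follows; the class, the use of an MD5 digest as the check code, and the video_id parameter are illustrative assumptions, not details given in the patent.

```python
import hashlib

class DemoVideoStore:
    """Demonstration videos downloaded in advance, keyed by check code,
    so retrieval is a local lookup with no network stall."""

    def __init__(self):
        self._videos = {}  # check code -> local file path

    def preload(self, video_id: str, local_path: str) -> str:
        """Download-time step: store the video and record its check code."""
        code = hashlib.md5(video_id.encode("utf-8")).hexdigest()
        self._videos[code] = local_path
        return code

    def retrieve(self, check_code: str):
        """Playback-time step: map the check code straight to the
        pre-stored file path (None if unknown)."""
        return self._videos.get(check_code)
```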
- the camera is used to collect local images or local video. When not turned on, the camera sits in a hidden position to keep the edge of the display device smooth; after it is turned on, the camera rises and protrudes above the edge of the display device, obtaining image data without obscuring the display screen.
- in response to the user's selection of the start training control, the camera is raised and started to obtain image data.
- in some embodiments, the camera is always on, collecting local video in real time and sending the collected video to the controller, so as to display the user's actions on the follow-along interface. In this way, the user can watch his own actions and the actions of the demonstration video in real time.
- in other embodiments, in response to the user's selection of the start training control, the camera is raised but kept in a standby state. Whenever the demonstration video plays to a preset time point, a local image is collected and sent to the controller. This reduces the load on the processor, and the local image is kept on the display until the next time point.
- a controller, configured to: receive an input confirmation operation on the playback control, start the camera, and load the video data of the demonstration video;
- the confirmation operation may be a selection of the start training control.
- the controller is further configured to control the display to display a prompt interface 200C (also called a guide interface) for instructing the user to enter a predetermined area.
- FIG. 9-1 is a schematic diagram showing a prompt interface according to some embodiments, and the user adjusts his position according to the prompt interface.
- the prompt is shown before the controller controls the display to display the second interface, because the camera's collection area has boundaries. In order to better collect local data, the camera acquires the current image, and during display a new floating layer is created above the layer displaying the current image.
- when the character in the prompt interface 200C is located on the left side of the box area 200C1, the user is correspondingly prompted to move to the right; if the character in the display screen is located on the right side of the box area, the user is prompted to move to the left, so that the user enters the predetermined area and can be captured by the camera. The embodiment of the present application instructs the user to enter the predetermined area through the foregoing method.
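- A small sketch of the left/right prompting logic, assuming the detected person's horizontal center and the box area's bounds are available as coordinates (function and parameter names are illustrative):

```python
def movement_hint(person_center_x: float, box_left: float, box_right: float) -> str:
    """Choose the movement prompt from where the person stands
    relative to the box area 200C1."""
    if person_center_x < box_left:
        return "Please move to the right"
    if person_center_x > box_right:
        return "Please move to the left"
    return "Position OK"

# e.g. a person centered at x=200 with the box spanning 500..900:
assert movement_hint(200, 500, 900) == "Please move to the right"
```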
- the prompt interface is also used to display prompt information.
- Figure 9-2 is a schematic diagram of a prompt interface according to some embodiments.
- the prompt message may be a reminder such as "Please face the screen" or "Keep your body upright".
- the message that prompts the user to move can be a text displayed on the floating layer, a voice reminder, or an indication mark pointing to the best position frame.
- the controller may also directly display the second interface in response to the confirmation operation, playing the demonstration video in the first play window and the local image in the second video window. The user can adjust his position according to the image displayed in the second video window in the second interface.
- the controller may also, in response to the confirmation operation, determine how many times the location guidance interface has been displayed: the guidance interface is displayed when its display count does not reach a preset value, and when the preset value is reached, the second interface is displayed directly, with the demonstration video played in the first play window and the local image played in the second video window. The user can adjust his position according to the image displayed in the second video window in the second interface.
- FIG. 10 is a schematic diagram of a second display interface 200B according to some embodiments, wherein the second display interface 200B includes a first play window 200B1 for playing the demonstration video and a second play window 200B2 for playing the local image collected by the camera.
- in some embodiments, the demonstration video is played in the first play window without joint points shown, and playing the local image data in the second play window includes: the controller obtains the positions of the joint points accompanying the local image data according to the local image data, superimposes the joint point marks on the local image data according to those positions, and displays the result in the second play window.
- superimposing the local image data and the joint point markers can be done in two ways: the joint point markers can be added to the local image according to the positions of the joint points in the local image data and output in a single layer displaying the superimposed joint points; alternatively, the local image obtained by the camera can be displayed in one layer, a floating layer added above it, joint point markers added to the floating layer according to the joint positions, and the two layers displayed superimposed.
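- A toy sketch of the floating-layer variant, with pixels as nested lists and None standing in for transparency (a real implementation would use the platform's layer compositor; everything here is illustrative):

```python
def joint_overlay(width, height, joints, color=(0, 255, 0)):
    """Build a transparent floating layer (None = transparent) with a
    marker pixel at each joint position."""
    layer = [[None] * width for _ in range(height)]
    for x, y in joints:
        if 0 <= x < width and 0 <= y < height:
            layer[y][x] = color
    return layer

def composite(base, overlay):
    """Superimpose the floating layer on the camera-image layer."""
    return [[o if o is not None else b for b, o in zip(brow, orow)]
            for brow, orow in zip(base, overlay)]
```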
- the second playback window directly plays the local video collected by the camera.
- the embodiment of the present application shows a display device, which includes a display, a camera, and a controller.
- the controller is configured to, in response to the user's selection of the start training control in the display interface, acquire a demonstration video and raise and turn on the camera, which is used to collect local images; and to control the first play window of the display to play the demonstration video and the second play window of the display to display the local image.
- the technical solution shown in the embodiment of this application presents the demonstration video in the first play window and the local picture in the second play window, so the user can compare the contents displayed in the two windows and adjust his own actions in time, improving the user experience.
- the camera is used to collect local video, which is a collection of continuous local images.
- in the comparison process, if the comparison were performed for every frame of image, the data processing load on the controller would be relatively large. Instead, the controller can compare the local image with a demonstration video frame to generate a comparison result. After exercising, the user can view the comparison result through the display interface, which helps the user understand his own movement defects so that they can be overcome in subsequent fitness sessions.
- the demonstration video frame is the image in the demonstration video corresponding to the local image.
- the controller can control the camera to collect a local image when the demonstration video is played to a preset time point; and then compare the local image collected by the camera with the pre-stored demonstration video frame to obtain the comparison result.
- the controller when the demonstration video is played to a preset time point, the controller will control the camera to collect a local image.
- the preset time points may take the appearance of the first image in the demonstration video as the starting point, with one preset time point every interval T, until the last image frame of the demonstration video.
- the preset time point may also be generated based on the content of the demonstration video, and each action node in the content of the demonstration video serves as a preset time point.
- for example, if the first image appears at 3S, the interval T is 10S, and the length of the demonstration video is 53S, the corresponding preset time points are: 3S, 13S, 23S, 33S, 43S, and 53S. The controller will control the camera to collect a local image when the video is played to 3S, 13S, 23S, 33S, 43S, and 53S.
- in some embodiments, tags are added at the preset time nodes, and a local image is collected when playback reaches a tag.
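- A minimal sketch of how such evenly spaced capture points could be generated, using the 3S/10S/53S example above (the function name and defaults are illustrative):

```python
def preset_time_points(first_frame_s: int = 3, interval_s: int = 10,
                       video_len_s: int = 53):
    """Preset capture points: the first image's appearance time, then
    one point every interval T until the end of the demonstration video."""
    return list(range(first_frame_s, video_len_s + 1, interval_s))

assert preset_time_points() == [3, 13, 23, 33, 43, 53]
```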
- the camera is always on, recording local video in real time, and sending the local video to the controller.
- the controller may extract the corresponding local image in the collected local video at a preset time point; then compare the extracted local image with the pre-stored demonstration video frame to obtain the comparison result.
- in a specific implementation, when the demonstration video is played to a preset time point, the controller extracts one or more local images from the local video collected by the camera.
- the preset time points may take the appearance of the first image in the demonstration video as the starting point, with one preset time point every interval T (i.e., each time a demonstration action occurs), until the last image frame of the demonstration video.
- the preset time point may also be generated based on the content of the demonstration video or pre-marked, and each action node in the content of the demonstration video serves as a preset time point.
- for example, if the first image appears at 3S and the preset time points are 3S, 16S, 23S, 45S, and 53S, the controller extracts local images when the video is played to 3S, 16S, 23S, 45S, and 53S.
- the appearance time of the demonstration action is arbitrary, and the acquisition of the image to be compared is triggered according to the tag or time point identifying the demonstration action.
- the user imitates the coaching action to produce the corresponding action.
- the technical solution shown in the embodiment of the present application shows a "delayed image acquisition method".
- the technical solutions shown in the embodiments of this application draw on statistics over a large amount of experimental data: the response time from when the user sees a demonstration action to when he produces the corresponding action is about 1S, so the preset response time is correspondingly configured to 1S.
- for example, if the preset time points are 3S, 13S, 23S, 33S, 43S, and 53S, the corresponding delayed collection time points are 4S, 14S, 24S, 34S, 44S, and 54S. Taking the first appearance of the image frame in the demonstration video as the starting point, the controller controls the camera to collect local images at 4S, 14S, 24S, 34S, 44S, and 54S after that starting point.
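- The delayed points are just the preset points shifted by the reaction time; a one-function sketch (names illustrative):

```python
def delayed_capture_points(preset_points, reaction_time_s: int = 1):
    """Shift every preset point by the statistically derived user
    reaction time (1S in the embodiment above)."""
    return [t + reaction_time_s for t in preset_points]

assert delayed_capture_points([3, 13, 23, 33, 43, 53]) == [4, 14, 24, 34, 44, 54]
```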
- the controller compares the local image with the demonstration video frame to generate a comparison result, wherein the demonstration video frame is the image corresponding to the local image in the demonstration video, or a standard image corresponding to the preset time point in the demonstration video.
- the image in the demonstration video may be an image with a logo.
- the mark may be a time mark, but is not limited to a time mark.
- the correspondence between the local image and the demonstration video frame may be determined based on the mark. For example, when the demonstration video is played to 4S, the controller controls the camera to collect a local image; the time mark corresponding to that local image is 3S, and the local image is compared with the demonstration video frame whose time mark is 3S.
- the joint point data of the demonstration video frames is stored with the demonstration video and is preset in advance. Since the image frames other than the demonstration video frames are not needed for comparison, joint point data need not be preset for them.
- in the above example the preset reaction time is configured to 1S. 1S is a statistical value: the typical user's response time is about 1S, but 1S is not suitable for all users, so the preset response time can be set according to requirements.
- action comparison through whole-image comparison would impose a heavy processing burden, so the technical solution shown in this embodiment may compare the local image with only certain "key parts" of the demonstration video; in a specific implementation, the comparison of actions is completed through the comparison of joint points.
- the controller is further configured to: identify the joint points in the local image; compare the joint points in the local image with the joint points in the demonstration video.
- the controller is configured to, in response to the user selecting the start training control in the first display interface, control the camera to start, so that the camera collects local images.
- the camera transmits the captured local image to the controller.
- the controller recognizes the joint points in the local image. In some embodiments, the controller recognizes them according to a preset model; the joint points are the points corresponding to the human joints and the head. In some embodiments, the human body includes 13 joint positions, and the controller marks these 13 important bone joints on the whole body; a local image with the 13 joint positions can be seen in Figure 11.
- the 13 joint positions are: left wrist, left elbow, left shoulder, chest, waist, left knee, left ankle, head, right wrist, right elbow, right shoulder, right knee, and right ankle.
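- For illustration only, the 13 positions can be held in a simple list (one assumed naming; "chest" renders the thoracic point):

```python
JOINT_POSITIONS = [
    "left wrist", "left elbow", "left shoulder", "chest", "waist",
    "left knee", "left ankle", "head", "right wrist", "right elbow",
    "right shoulder", "right knee", "right ankle",
]
assert len(JOINT_POSITIONS) == 13
```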
- the human body in the image is sometimes partially missing; in this case, only the human body parts present in the image are identified.
- the controller is also used to compare the joint points in the local image with the joint points in the demonstration video/demonstration video frame, and determine the degree of difference between the human motion in the local image and the human motion in the demonstration video;
- the recognized joint points are marked in the image, and different colors are used to mark joint points with different degrees of motion difference.
- the comparison method may be to compare the position of the joint points of the human body in the local image with the relative position of the joint points of the human body in the demonstration video.
- the comparison result is obtained based on the difference in relative position. Different comparison results are marked with different colors.
- for example, if the position of the left wrist in the local image differs from the position of the left wrist in the demonstration video by 10 standard units, the left wrist joint point can be marked in red; if the position of the right wrist in the local image differs from that in the demonstration video by 1 standard unit, the right wrist joint point can be marked in green.
- the comparison method may calculate the matching degree of the two joint positions and generate corresponding results according to the matching degree; alternatively, the matching degree of the action may be determined according to the relative positional relationships among the user's own joint points.
- the recognition and matching of joint points can also be replaced by other achievable means in related technologies.
- the demonstration joint positions are marked in the demonstration video in advance, and the demonstration joint positions are stored together with the demonstration video in a local data list.
- the marking process of the exemplary joint position is similar to the marking process shown in some of the foregoing embodiments, and will not be described in detail here.
- the controller compares the first angle in the local image with the corresponding standard angle to generate a comparison result.
- the first angle is the angle, in the local image, between the line connecting each joint position to its adjacent joint position and the trunk line; the standard angle is the corresponding angle in the demonstration video.
- the corresponding relationship between the first angle and the standard angle can be generated based on the time stamp.
- for example, if the local image acquisition time is 10S, the standard angle corresponding to the first angle of the left ankle is the angle between the left-ankle-to-adjacent-joint line and the torso line in the image that appears at 10S in the demonstration video.
- FIG. 12 is a local image with joint annotations according to some embodiments.
- the joint position adjacent to the left wrist 1A is the left elbow 1B, and the corresponding angle between the connecting line of the left wrist 1A and the left elbow 1B and the trunk is called the first angle 1a.
- the first angles corresponding to the left elbow, left shoulder, left knee, left ankle, head, right wrist, right elbow, right shoulder, right knee, and right ankle can be calculated respectively.
- the generation method of the standard angle can refer to the generation method of the first angle, which will not be repeated here.
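- As one illustrative way to compute such angles from 2D joint coordinates (not the patent's specified algorithm; the point layout and names are assumptions):

```python
import math

def angle_between(p, q, r, s):
    """Angle in degrees between line p->q and line r->s (2D points)."""
    v1 = (q[0] - p[0], q[1] - p[1])
    v2 = (s[0] - r[0], s[1] - r[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    if norm == 0:  # degenerate case: coincident points
        return 0.0
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def first_angle(joint, neighbor, trunk_top, trunk_bottom):
    """Angle between the joint-to-adjacent-joint line and the trunk line,
    e.g. left wrist 1A to left elbow 1B versus the chest-to-waist line."""
    return angle_between(joint, neighbor, trunk_top, trunk_bottom)
```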
- the controller calculates the degree of matching between the first angle and the corresponding standard angle; the degree of matching can be used to evaluate the degree of completion of the user's action.
- the technical solution shown in the embodiment of the present application can calculate the deviation between each joint position and its standard position, thereby helping the user understand the completion of each part of the action and improving the user experience.
- the controller calculates the matching degree between the first angle and the corresponding standard angle, and marks the corresponding joint point with a color according to the matching degree.
- the matching degree may be represented by an angular deviation, with the result determined against preset deviation thresholds. For an angle deviation greater than 15 degrees, the corresponding joint position can be marked in red; for a deviation of 10 to 15 degrees, yellow; and for a deviation below 10 degrees, green.
- for example, if the first angle of the left wrist joint in the local image collected at 10S differs from the standard angle corresponding to 10S in the demonstration video by 20 degrees, the left wrist joint can be marked in red; if the first angle of the left ankle joint in that image differs from its standard angle by 12 degrees, the left ankle joint can be marked in yellow; and if the first angle of the head differs from its standard angle by 6 degrees, the head can be marked in green.
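- The three-band color rule reads directly as a small function; the thresholds follow the 15/10-degree example above (the function name is illustrative):

```python
def deviation_color(angle_deviation_deg: float) -> str:
    """Map an angle deviation to the marking color in the example above."""
    if angle_deviation_deg > 15:
        return "red"
    if angle_deviation_deg >= 10:
        return "yellow"
    return "green"

assert deviation_color(20) == "red"     # left wrist example
assert deviation_color(12) == "yellow"  # left ankle example
assert deviation_color(6) == "green"    # head example
```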
- the marked local image can be seen in Figure 13.
- the technical solutions shown in the embodiments of the present application also show a "range" comparison method: when the demonstration video is played to a demonstration video frame, the display device obtains, from the local video, multiple image frames adjacent to that time point.
- when the demonstration video is played to the preset time point, the controller selects, from the local video, multiple image frames adjacent to that time point as a first image set. The first image set includes at least a first local image and a second local image, where the first local image is the local image corresponding to the preset time point and the second local image is a local image corresponding to a time near the preset time point. The controller calculates the degree of matching between each local image in the first image set and the demonstration video frame, uses the comparison result of the best-matching local image as the comparison result for that time point, and takes the best-matching local image as the local image corresponding to that time point.
- the controller calculates the degree of matching between the first local image and the demonstration video frame (also referred to as the degree of difference in human motion).
- the controller selects, from the first image set, the image with the highest degree of matching with the demonstration video frame as the replacement image, and marks the replacement image according to its comparison result with the demonstration video frame.
- for example, suppose the matching degree of the local image at 10S is 20% while the preset matching degree (preset threshold) is 25%. The controller determines the first image set of the target data set, i.e., the local images contained in the target data set in the period 1S-13S. If the calculation shows that the matching degree between the data corresponding to 8S and the standard angle of the wrist joint in the 10S demonstration video frame is 80%, which is the highest matching degree, the comparison result of the wrist joint corresponding to 10S is adjusted to 80%, the wrist joint is marked with the color for an 80% match, and the controller caches the marked local video.
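- A compact sketch of the "pick the best frame in the window" rule; match_fn stands in for whatever joint-point matching routine is used and is an assumption:

```python
def best_replacement(first_image_set, demo_frame, match_fn):
    """Return (image, score) for the local image in the first image set
    that best matches the demonstration video frame; its comparison
    result replaces the one at the preset time point."""
    scored = [(match_fn(img, demo_frame), img) for img in first_image_set]
    best_score, best_img = max(scored, key=lambda pair: pair[0])
    return best_img, best_score
```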
- the controller may control the display to display an exercise evaluation interface, the exercise evaluation interface being used to display the annotated local pictures.
- the exercise evaluation interface can display the user's rating level, user actions and standardized actions at the same time.
- the scoring level can be generated based on the matching degree between the local image and the demonstration video frame.
- the exercise evaluation interface may display multiple corresponding user actions and standardized actions in a scrolling manner.
- the order of display can be: display in order from low to high score.
- the display form of the exercise evaluation interface can be seen in Figure 14-1.
- the exercise evaluation interface is equipped with two display windows: one displays the local image of the user action corresponding to the time point, and the other displays the demonstration video frame corresponding to the standard action.
- the "joint point comparison process" can also be placed on the server side for execution.
- before the local image is played in the second video window, the controller is further configured to: identify the joint points in the local image, and send the joint points in the local image to the server, so that the server compares them with the joint points of the demonstration video frame in the demonstration video, determines the degree of difference between the human motion in the local image and that in the demonstration video frame, and generates feedback information for the display device.
- the joint point recognition unit in the display device recognizes and marks the joint points of all images collected by the camera, and displays the joint points in the second playback window.
- the display device uploads the joint point data of the local image collected at this moment and/or the joint point data of the local image collected at an adjacent time to the server to determine the matching degree.
- the comparison method of the human body motion in the local image and the human body motion difference degree in the demonstration video can refer to the above-mentioned embodiment, which will not be repeated here.
- the controller is further configured to receive a feedback message sent by the server, and mark the identified joint points in the local image according to the feedback message, wherein different colors mark joint points with different degrees of motion difference.
- the technical solutions shown in the embodiments of the present application use different colors to mark the completion of each joint position. Different colors distinguish the completion of each of the user's joints and are visually striking, further helping the user understand the completion of each part of the action.
- a training progress control is also provided above the second play window to show the degree of completion of the user action.
- when the controller detects that the matching degree between the user action and the demonstration action frame is higher than a preset value, the completion value displayed in the training progress control is increased; when the matching degree is detected to be lower than the preset value, the completion value displayed in the training progress control remains unchanged.
- the server may process the local image corresponding to the preset time point.
- the specific implementation may be: the controller sending the joint points in the local image to the server is specifically that, when the playing time of the demonstration video reaches a preset time point, the controller buffers the local images collected within a predetermined time period before and after the preset time point, identifies the joint point data of the cached local images, and sends the identified joint point data to the server.
- the caching process of the local image can be referred to the above implementation and will not be repeated here.
- the solution shown in this embodiment sends the joint points of the local video to the server.
- the controller may be further configured to send the identified joint point data to the server and, at the same time, send the preset time point to the server, so that the server can determine, according to the preset time point, the image frame of the demonstration video (that is, the target image) used for comparison.
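- One assumed shape for that upload message (the patent does not define a wire format; the field names here are invented for illustration):

```python
import json

def joint_upload_payload(preset_time_point_s: int, joints) -> str:
    """Bundle the identified joint point data with the preset time point
    so the server can pick the matching demonstration frame."""
    return json.dumps({
        "time_point": preset_time_point_s,  # assumed field name
        "joints": [{"name": n, "x": x, "y": y} for n, x, y in joints],
    })

# e.g. joint_upload_payload(10, [("left wrist", 120, 340)])
```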
- when the degree of difference in human motion is greater than a preset threshold, the controller marks the local image and caches the picture of the marked local image together with the demonstration video frame corresponding to the preset time point, so that after the demonstration video ends, the local images with large action differences can be retrieved.
- after playback ends, the controller controls the display to show the exercise evaluation interface, and displays on it the cached, annotated local image pictures and the demonstration video frames corresponding to the preset time points.
- the demonstration video frames at the preset time points and the corresponding local images are sorted according to the matching degree (or score), and after the demonstration video is played, a preset number of time points with low matching degree (or score) are selected, and the demonstration video frames and corresponding local images at those time points are displayed.
- for example, the demonstration video frames and corresponding local images at 5 time points are cached together with their matching degrees, and after the demonstration video is played, the demonstration video frames and corresponding local images at the 3 time points with the lowest matching degree (or score) are displayed.
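- That selection step is a simple sort-and-slice; a sketch assuming cached entries of (score, demo_frame, local_image):

```python
def evaluation_entries(cached, keep: int = 3):
    """From cached (score, demo_frame, local_image) tuples, keep the
    `keep` lowest-scoring time points for the exercise evaluation
    interface (3 of 5 in the example above)."""
    return sorted(cached, key=lambda entry: entry[0])[:keep]
```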
- the embodiment of this application also shows a display device, including:
- a display screen, for displaying a first display interface and a second display interface, where the first display interface includes a playback control for controlling the playback of a demonstration video, and the second display interface includes a first play window for playing the demonstration video and a second play window for playing the local image collected by the camera;
- a camera, used to collect local images;
- a controller configured to:
- the current video frame or neighboring video frame whose annotated degree of human motion difference is lower than the difference threshold, together with the demonstration video frame, is cached for display on the exercise evaluation interface.
- the embodiment of the present application also shows a display device, including:
- a display screen, for displaying a first display interface and a second display interface, where the first display interface includes a playback control for controlling the playback of a demonstration video, and the second display interface includes a first play window for playing the demonstration video and a second play window for playing the local image collected by the camera;
- a camera, used to collect local images;
- a controller configured to:
- the second interface is displayed, the demonstration video is played in the first play window, and the local image with joint points marked is played in the second video window, where the first joint point is marked in a first color, the second joint point is marked in a second color, and the degree of motion difference between the body part where the first joint point is located and the corresponding body part in the demonstration video is greater than the degree of motion difference between the body part where the second joint point is located and the corresponding body part in the demonstration video.
- the embodiment of the present application also shows an interface display method, including:
- the second interface is displayed, the demonstration video is played in the first play window in the second interface, and the local image is played in the second video window in the second interface .
- the embodiment of the present application also shows an interface display method, including:
- the current video frame of the collected local video and the neighboring video frames are captured (a video frame may also be called an image in the solutions shown in the embodiments of the present application);
- an exercise evaluation interface is displayed, wherein the exercise evaluation interface shows the annotated current video frame or neighboring video frame with the lower degree of human motion difference, together with the demonstration video frame.
- the embodiment of the present application also shows an interface display method, including:
- the second interface is displayed, the demonstration video is played in the first play window in the second interface, and the local image with joint points marked is played in the second video window in the second interface, where the first joint point is marked in a first color, the second joint point is marked in a second color, and the degree of motion difference between the body part where the first joint point is located and the corresponding body part in the demonstration video is greater than the degree of motion difference between the body part where the second joint point is located and the corresponding body part in the demonstration video.
- the method provided in this embodiment may be applicable to the dual-chip display device provided in the foregoing embodiments, and may also be applicable to non-dual-chip display devices.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- General Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Databases & Information Systems (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Social Psychology (AREA)
- Human Computer Interaction (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Claims (12)
- A display device, characterized by comprising: a display screen, configured to display a first display interface and a second display interface, the first display interface including a playback control for controlling the playback of a demonstration video, and the second display interface including a first play window for playing the demonstration video and a second play window for playing a local image collected by a camera; a camera, configured to collect the local image; and a controller, configured to: receive an input confirmation operation on the playback control, start the camera, and load the video data of the demonstration video; and in response to the confirmation operation, display the second interface, play the demonstration video in the first play window, and play the local image in the second video window.
- The display device according to claim 1, wherein before the local image is played in the second video window, the controller is further configured to: identify joint points in the local image; compare the joint points in the local image with joint points in the demonstration video to determine the degree of difference between the human motion in the local image and the human motion in the demonstration video; and mark the identified joint points in the collected local image, wherein different colors mark joint points with different degrees of motion difference.
- The display device according to claim 1, wherein before the local image is played in the second video window, the controller is further configured to: identify joint points in the local image; send the joint points in the local image to a server, the server being configured to compare the joint points in the local image with joint points in the demonstration video, determine the degree of difference between the human motion in the local image and the human motion in the demonstration video, and generate feedback information for the display device; and receive the feedback message sent by the server and mark the identified joint points in the local image according to the feedback message, wherein different colors mark joint points with different degrees of motion difference.
- The display device according to claim 3, wherein the controller sending the joint points in the local image to the server is specifically: the controller, when the playing time of the demonstration video reaches a preset time point, buffers the local images collected within a predetermined time period before and after the preset time point, identifies the joint point data of the cached local images, and sends the identified joint point data to the server.
- The display device according to claim 4, wherein the controller, while sending the identified joint point data to the server, also sends the preset time point to the server, so that the server determines, according to the preset time point, the image frame of the demonstration video used for comparison.
- The display device according to claim 2 or 3, wherein the controller, when the degree of difference in human motion is greater than a preset threshold, marks the local image and caches the picture of the marked local image and the demonstration video frame corresponding to the preset time point.
- The display device according to claim 6, wherein the controller, after playback ends, controls the display to show an exercise evaluation interface, and displays on the exercise evaluation interface the cached pictures of the marked local images and the demonstration video frames corresponding to the preset time points.
- A display device, characterized by comprising: a display screen, configured to display a first display interface and a second display interface, the first display interface including a playback control for controlling the playback of a demonstration video, and the second display interface including a first play window for playing the demonstration video and a second play window for playing a local image collected by a camera; a camera, configured to collect local video; and a controller, configured to: receive an input confirmation operation on the playback control, start the camera, and load the video data of the demonstration video; in response to the confirmation operation, display the second interface; during playback of the demonstration video, upon detecting a tag indicating that the playing time of the demonstration video has reached a preset time point, capture the current video frame of the collected local video and neighboring video frames temporally adjacent to the current video frame; identify the joint points of the current video frame and of the neighboring video frames; compare the joint points of the current video frame with the joint points of the demonstration video frame corresponding to the preset time point in the demonstration video; compare the joint points of the neighboring video frames with the joint points of that demonstration video frame; annotate, according to the comparison results, the degree of human motion difference of the current video frame or the neighboring video frames; and cache the current video frame or neighboring video frame whose annotated degree of human motion difference is lower than a difference threshold, together with the demonstration video frame, for display on an exercise evaluation interface.
- A display device, characterized by comprising: a display screen, configured to display a first display interface and a second display interface, the first display interface including a playback control for controlling the playback of a demonstration video, and the second display interface including a first play window for playing the demonstration video and a second play window for playing a local image collected by a camera; a camera, configured to collect the local image; and a controller, configured to: receive an input confirmation operation on the playback control, start the camera, and load the video data of the demonstration video; and in response to the confirmation operation, display the second interface, play the demonstration video in the first play window, and play the local image with joint points marked in the second video window, wherein in the local image with marked joint points, a first joint point is marked in a first color and a second joint point is marked in a second color, and the degree of motion difference between the body part where the first joint point is located and the corresponding body part in the demonstration video is greater than the degree of motion difference between the body part where the second joint point is located and the corresponding body part in the demonstration video.
- An interface display method, characterized by comprising: while a first interface is displayed, receiving an input confirmation operation on a playback control in the first interface, starting a camera, and loading the video data of a demonstration video; and in response to the confirmation operation, displaying a second interface, playing the demonstration video in a first play window of the second interface, and playing the local image in a second video window of the second interface.
- An interface display method, characterized by comprising: while a first interface is displayed, receiving an input confirmation operation on the playback control, starting a camera so that the camera collects local video, and loading the video data of a demonstration video; in response to the confirmation operation, displaying a second interface; during playback of the demonstration video, upon detecting a tag indicating that the playing time of the demonstration video has reached a preset time point, capturing the current video frame of the collected local video and neighboring video frames temporally adjacent to the current video frame; identifying the joint points of the current video frame and of the neighboring video frames; comparing the joint points of the current video frame with the joint points of the demonstration video frame corresponding to the preset time point in the demonstration video; comparing the joint points of the neighboring video frames with the joint points of that demonstration video frame; annotating, according to the comparison results, the degree of human motion difference of the current video frame or the neighboring video frames; caching the current video frame or neighboring video frame whose annotated degree of human motion difference is lower than a difference threshold, together with the demonstration video frame; and in response to the end of playback of the demonstration video, displaying an exercise evaluation interface, wherein the exercise evaluation interface displays the annotated current video frame or neighboring video frame with the lower degree of human motion difference, together with the demonstration video frame.
- An interface display method, characterized by comprising: while a first interface is displayed, receiving an input confirmation operation on a playback control in the first interface, starting a camera, and loading the video data of a demonstration video; and in response to the confirmation operation, displaying a second interface, playing the demonstration video in a first play window of the second interface, and playing the local image with joint points marked in a second video window of the second interface, wherein in the local image with marked joint points, a first joint point is marked in a first color and a second joint point is marked in a second color, and the degree of motion difference between the body part where the first joint point is located and the corresponding body part in the demonstration video is greater than the degree of motion difference between the body part where the second joint point is located and the corresponding body part in the demonstration video.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910761455 | 2019-08-18 | ||
CN201910761455.8 | 2019-08-18 | ||
CN202010386547.5A CN112399234B (zh) | 2019-08-18 | 2020-05-09 | Interface display method and display device
CN202010386547.5 | 2020-05-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021031809A1 true WO2021031809A1 (zh) | 2021-02-25 |
Family
ID=74603787
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/105277 WO2021031809A1 (zh) | 2019-08-18 | 2020-07-28 | 一种界面显示方法及显示设备 |
Country Status (2)
Country | Link |
---|---|
CN (2) | CN116074564B (zh) |
WO (1) | WO2021031809A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114513694A (zh) * | 2022-02-17 | 2022-05-17 | 平安国际智慧城市科技股份有限公司 | Score determination method and apparatus, electronic device, and storage medium |
CN115883748A (zh) * | 2022-11-28 | 2023-03-31 | 中汽创智科技有限公司 | Data playback synchronization method and apparatus, electronic device, and storage medium |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113051432B (zh) * | 2021-04-25 | 2022-07-19 | 聚好看科技股份有限公司 | Display device and media asset playing method |
CN112272324B (zh) * | 2020-10-15 | 2023-03-14 | 聚好看科技股份有限公司 | Follow-along training mode control method and display device |
CN114296668B (zh) * | 2021-03-11 | 2024-08-23 | 海信视像科技股份有限公司 | Display device |
CN113099308B (zh) * | 2021-03-31 | 2023-10-27 | 聚好看科技股份有限公司 | Content display method, display device, and image collector |
CN113556599A (zh) * | 2021-07-07 | 2021-10-26 | 深圳创维-Rgb电子有限公司 | Video teaching method and apparatus, television, and storage medium |
CN113794917A (zh) * | 2021-09-15 | 2021-12-14 | 海信视像科技股份有限公司 | Display device and display control method |
CN115202531A (zh) * | 2022-05-27 | 2022-10-18 | 当趣网络科技(杭州)有限公司 | Interface interaction method, system, and electronic device |
WO2023240973A1 (zh) * | 2022-06-16 | 2023-12-21 | 聚好看科技股份有限公司 | Display device and screen projection method |
CN116320583A (zh) * | 2023-03-20 | 2023-06-23 | 抖音视界有限公司 | Video call method and apparatus, electronic device, and storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9440134B2 (en) * | 2010-12-06 | 2016-09-13 | Full-Swing Golf, Inc. | Microsoft kinect |
CN107909060A (zh) * | 2017-12-05 | 2018-04-13 | 前海健匠智能科技(深圳)有限公司 | 基于深度学习的健身房健身动作识别方法及装置 |
CN108853946A (zh) * | 2018-07-10 | 2018-11-23 | 燕山大学 | 一种基于Kinect的健身指导训练系统及方法 |
CN109214231A (zh) * | 2017-06-29 | 2019-01-15 | 深圳泰山体育科技股份有限公司 | 基于人体姿态识别的体育教学辅助系统和方法 |
Family Cites Families (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8620146B1 (en) * | 2008-03-28 | 2013-12-31 | Theresa Coleman | Picture-in-picture video system for virtual exercise, instruction and entertainment |
US9011293B2 (en) * | 2011-01-26 | 2015-04-21 | Flow-Motion Research And Development Ltd. | Method and system for monitoring and feed-backing on execution of physical exercise routines |
JP2013103010A (ja) * | 2011-11-15 | 2013-05-30 | Sony Corp | Image processing apparatus, image processing method, and program |
WO2014162787A1 (ja) * | 2013-04-02 | 2014-10-09 | Necソリューションイノベータ株式会社 | Body movement scoring device, dance scoring device, karaoke device, and game device |
CN105898133A (zh) * | 2015-08-19 | 2016-08-24 | 乐视网信息技术(北京)股份有限公司 | Video shooting method and device |
CN106919890A (zh) * | 2015-12-25 | 2017-07-04 | 中国移动通信集团公司 | Method and device for evaluating the standardness of user actions |
CN106131611A (zh) * | 2016-06-29 | 2016-11-16 | 乐视控股(北京)有限公司 | Method, terminal, and server for identifying a program channel |
CN106228143A (zh) * | 2016-08-02 | 2016-12-14 | 王国兴 | Method for motion comparison scoring between a teaching video and camera video |
CN108960002A (zh) * | 2017-05-17 | 2018-12-07 | 中兴通讯股份有限公司 | Action adjustment information prompting method and device |
CN107968921B (zh) * | 2017-11-23 | 2020-02-28 | 香港乐蜜有限公司 | Video generation method and apparatus, and electronic device |
CN107952238B (zh) * | 2017-11-23 | 2020-11-17 | 香港乐蜜有限公司 | Video generation method and apparatus, and electronic device |
CN107943291B (zh) * | 2017-11-23 | 2021-06-08 | 卓米私人有限公司 | Human action recognition method and apparatus, and electronic device |
CN108537284A (zh) * | 2018-04-13 | 2018-09-14 | 东莞松山湖国际机器人研究院有限公司 | Posture evaluation and scoring method and system based on computer-vision deep learning algorithms |
CN109144247A (zh) * | 2018-07-17 | 2019-01-04 | 尚晟 | Video interaction method and interactive-video-based exercise assistance system |
CN109068081A (zh) * | 2018-08-10 | 2018-12-21 | 北京微播视界科技有限公司 | Video generation method and apparatus, electronic device, and storage medium |
CN109432753B (zh) * | 2018-09-26 | 2020-12-29 | Oppo广东移动通信有限公司 | Action correction method and apparatus, storage medium, and electronic device |
CN109376705A (zh) * | 2018-11-30 | 2019-02-22 | 努比亚技术有限公司 | Dance training scoring method and apparatus, and computer-readable storage medium |
CN109451178B (zh) * | 2018-12-27 | 2021-03-12 | 维沃移动通信有限公司 | Video playing method and terminal |
CN110008814A (zh) * | 2019-01-25 | 2019-07-12 | 阿里巴巴集团控股有限公司 | Video processing method, video processing apparatus, and electronic device |
CN110087106A (zh) * | 2019-04-18 | 2019-08-02 | 杰哈思文化创意(杭州)有限公司 | Playing method and apparatus for tooth-brushing videos, storage medium, and terminal |
- 2020-05-09 CN CN202310079527.7A patent/CN116074564B/zh active Active
- 2020-05-09 CN CN202010386547.5A patent/CN112399234B/zh active Active
- 2020-07-28 WO PCT/CN2020/105277 patent/WO2021031809A1/zh active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9440134B2 (en) * | 2010-12-06 | 2016-09-13 | Full-Swing Golf, Inc. | Microsoft kinect |
CN109214231A (zh) * | 2017-06-29 | 2019-01-15 | 深圳泰山体育科技股份有限公司 | Sports teaching assistance system and method based on human posture recognition |
CN107909060A (zh) * | 2017-12-05 | 2018-04-13 | 前海健匠智能科技(深圳)有限公司 | Deep-learning-based gym fitness action recognition method and device |
CN108853946A (zh) * | 2018-07-10 | 2018-11-23 | 燕山大学 | Kinect-based fitness guidance and training system and method |
Also Published As
Publication number | Publication date |
---|---|
CN116074564A (zh) | 2023-05-05 |
CN112399234B (zh) | 2022-12-16 |
CN116074564B (zh) | 2024-10-01 |
CN112399234A (zh) | 2021-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021031809A1 (zh) | Interface display method and display device | |
CN113330736B (zh) | Display and image processing method | |
WO2021088320A1 (zh) | Display device and content display method | |
WO2021032092A1 (zh) | Display device | |
CN110708581B (zh) | Display device and method for presenting multimedia screensaver information | |
WO2021189358A1 (zh) | Display device and volume adjustment method | |
CN111464840B (zh) | Display device and method for adjusting the screen brightness of the display device | |
CN112068741B (zh) | Display device and method for displaying the Bluetooth switch state of the display device | |
CN112333499A (zh) | Method for finding a target device, and display device | |
CN112788422A (zh) | Display device | |
CN112073662A (zh) | Display device | |
WO2021031598A1 (zh) | Adaptive adjustment method for the position of a video chat window, and display device | |
CN112463267B (zh) | Method for presenting screensaver information on a display device screen, and display device | |
WO2020248699A1 (zh) | Sound processing method and display device | |
WO2020248654A1 (zh) | Display device and method for jointly displaying applications | |
CN112839254A (zh) | Display device and content display method | |
CN112073666B (zh) | Power control method for a display device, and display device | |
CN112073777B (zh) | Voice interaction method and display device | |
CN112399245A (zh) | Playing method and display device | |
CN113301404A (zh) | Display device and control method | |
CN112073773A (zh) | Screen interaction method and apparatus, and display device | |
WO2020248788A1 (zh) | Voice control method and display device | |
CN112073779B (zh) | Display device and fault-tolerant method for key-press transmission | |
CN112995762B (zh) | Display device and network state synchronization method | |
CN112995113B (zh) | Display device, port control method, and storage medium | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 20855353; Country of ref document: EP; Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 20855353; Country of ref document: EP; Kind code of ref document: A1 |
 | 32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established | Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 16/09/2022) |