CN116033227A - Video playing method, device, electronic equipment and readable storage medium

Publication number: CN116033227A
Authority: CN (China)
Application number: CN202211691362.0A
Original language: Chinese (zh)
Inventors: 王晓雷, 刘杨
Assignee: Vivo Mobile Communication Co Ltd
Legal status: Pending
Classification: User Interface Of Digital Computer (AREA)
Abstract

The application discloses a video playing method, a video playing apparatus, an electronic device, and a readable storage medium, belonging to the technical field of display. The video playing method provided by the embodiments of the application is applied to a first device that is in wireless communication connection with a second device, and comprises: displaying a video playing interface on the first device; receiving a first input; and, in the case that the first input is an input for switching from the video playing interface to another interface, responding to the first input by displaying the other interface on the first device and sending video data of the video played by the video playing interface to the second device, so as to play the video on the second device.

Description

Video playing method, device, electronic equipment and readable storage medium
Technical Field
The application belongs to the technical field of display, and particularly relates to a video playing method, a video playing device, electronic equipment and a readable storage medium.
Background
With the development of mobile terminals such as mobile phones and tablet computers, and the popularization of application software, users' daily office work and entertainment have become more convenient. Users can work and entertain themselves on a mobile terminal through a wide variety of feature-rich application software.
In the prior art, when a user watches a video on a mobile terminal and needs to operate other application software, the user is forced to exit video playback and switch to the other application's interface to perform the corresponding operation. Switching away from video playback in this way prevents the user from continuing to watch the video. One prior-art solution to this problem plays the video through a small floating window; however, on the one hand the small display area of the window impairs viewing, and on the other hand the floating window partially occludes the display interface, reducing the user's efficiency in handling tasks on other interfaces.
Disclosure of Invention
An object of the embodiments of the present application is to provide a video playing method, apparatus, electronic device, and readable storage medium that solve the problem that a user cannot continue watching a video while switching to other application software to perform an operation.
In a first aspect, an embodiment of the present application provides a video playing method, which is applied to a first device, where the first device is connected to a second device in a wireless communication manner, and the video playing method includes:
displaying a video playing interface on the first device;
receiving a first input;
and, in the case that the first input is an input for switching from the video playing interface to another interface, responding to the first input by displaying the other interface on the first device and sending video data of the video played by the video playing interface to the second device, so as to play the video on the second device.
Optionally, the method further comprises:
if the first input is detection result information sent by the second device, closing or pausing the video playing interface in response to the first input, and sending video data of the video played by the video playing interface to the second device so as to play the video on the second device;
the detection result information indicates that the second device has detected that the video playing interface has left the user's field of view.
Optionally, the method further comprises:
receiving a second input;
outputting wearing reminding information in response to the second input;
the second input is wearing state information sent by the second device upon detecting that it is in an unworn state, and the wearing reminding information prompts the user to wear the second device so as to simultaneously watch, through the second device, the playing interface on which the second device plays the video and the other interface displayed by the first device.
Optionally, in the case that the first input is an input for switching from the video playing interface to another interface, the playing interface on which the second device plays the video does not overlap the other interface;
and in the case that the first input is the detection result information sent by the second device, the playing interface on which the second device plays the video does not overlap the user operation area.
Optionally, before sending the video data of the video played by the video playing interface to the second device, the method further includes:
sending a control instruction to the second device, where the control instruction instructs the second device to establish a virtual layer of a specified size at a preset position, so as to display the video frames included in the video data on the virtual layer.
In a second aspect, an embodiment of the present application provides another video playing method, which is applied to a second device, where the second device is connected to the first device in a wireless communication manner, and the video playing method includes:
receiving video data sent by the first device, the video data being the video data of the video played by the video playing interface of the first device; the first device, upon receiving a first input while displaying the video playing interface, responds to the first input by displaying another interface and sending the video data to the second device in the case that the first input is an input for switching from the video playing interface to the other interface;
playing the video based on the video data.
Optionally, the first device is further configured to close or pause the video playing interface in response to the first input in the case that the first input is detection result information sent by the second device, and the method further includes:
acquiring an image within the user's field of view via a camera of the second device;
and, in the case that the video playing interface is not detected in the image, determining that the video playing interface has left the user's field of view, and sending the detection result information to the first device.
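The second device's detection step above can be sketched as follows. This is an illustrative assumption, not the patent's implementation: `FakeLink`, the recognized-object set, and the message format stand in for the real camera pipeline and wireless link.

```python
class FakeLink:
    """Stand-in for the wireless link back to the first device."""
    def __init__(self):
        self.sent = []

    def send(self, message):
        self.sent.append(message)


def report_if_out_of_view(detected_objects, link):
    # The second device looks for the video playing interface among the
    # objects recognized in the camera image; when it is absent, the
    # detection result information is sent to the first device.
    if "video_playing_interface" not in detected_objects:
        link.send({"type": "detection_result", "left_view": True})
        return True
    return False


link = FakeLink()
left = report_if_out_of_view({"keyboard", "desk"}, link)
```

Here the recognizer output is modeled as a plain set of labels; a real implementation would run image detection on each camera frame instead.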
Optionally, the first device is further configured to receive a second input and to output wearing reminding information in response to the second input, the wearing reminding information reminding the user to wear the second device so as to simultaneously watch, through the second device, the playing interface on which the second device plays the video and the other interface displayed on the first device; the method further includes:
detecting whether the second device is currently in a worn state;
and sending wearing state information to the first device in the case that the second device is currently in an unworn state, the wearing state information serving as the second input.
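The wearing-state check on the second device can be sketched as below. The sensor reading, the `FakeLink` helper, and the message fields are all assumptions; the patent only specifies that the unworn state triggers wearing state information being sent to the first device.

```python
class FakeLink:
    """Stand-in for the wireless link back to the first device."""
    def __init__(self):
        self.sent = []

    def send(self, message):
        self.sent.append(message)


def report_wearing_state(sensor_detects_user, link):
    # A built-in sensor tells the second device whether it is being worn;
    # only the unworn state is reported, so the first device can output
    # wearing reminding information (message format is assumed).
    if not sensor_detects_user:
        link.send({"type": "wearing_state", "worn": False})


link = FakeLink()
report_wearing_state(False, link)   # unworn: state information is sent
report_wearing_state(True, link)    # worn: nothing is sent
```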
Optionally, in the case that the first input is an input for switching from the video playing interface to another interface, the method further includes:
in the case that the playing interface on which the video is played is detected to overlap the other interface, adjusting the display position of the playing interface so that the playing interface no longer overlaps the other interface;
and in the case that the first input is the detection result information sent by the second device, the method further includes:
detecting the user's current operation area via a camera of the second device;
and, in the case that the playing interface on which the video is played overlaps the user operation area, adjusting the display position of the playing interface so that the playing interface no longer overlaps the user operation area.
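The overlap check and repositioning can be sketched with axis-aligned rectangles. The displacement strategy (sliding the playing interface to the right of the blocked region) is one simple assumption; the patent does not specify how the new position is chosen.

```python
from dataclasses import dataclass


@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int


def overlaps(a, b):
    # Standard axis-aligned rectangle intersection test.
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)


def adjust_position(play, blocked):
    # Move the playing interface past the blocked region (the other
    # interface, or the detected user operation area) when they overlap.
    if overlaps(play, blocked):
        return Rect(blocked.x + blocked.w, play.y, play.w, play.h)
    return play


play = adjust_position(Rect(0, 0, 100, 80), Rect(50, 10, 100, 80))
```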
Optionally, playing the video based on the video data includes:
establishing a virtual layer of a specified size at a preset position, and displaying the video frames included in the video data on the virtual layer.
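The virtual-layer playback step can be sketched roughly as below. The `VirtualLayer` class, the default position, and the 1280×720 size are illustrative assumptions; the patent only specifies that a layer of a specified size is created at a preset position and that the video frames are displayed on it.

```python
class VirtualLayer:
    def __init__(self, x, y, width, height):
        # Layer created at a preset position with a specified size.
        self.x, self.y = x, y
        self.width, self.height = width, height
        self.frames = []

    def display(self, frame):
        self.frames.append(frame)


def play_video(video_frames, x=0, y=0, width=1280, height=720):
    # Establish the virtual layer, then display each video frame on it.
    layer = VirtualLayer(x, y, width, height)
    for frame in video_frames:
        layer.display(frame)
    return layer


layer = play_video(["frame-1", "frame-2"])
```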
In a third aspect, an embodiment of the present application provides a video playing apparatus, which is applied to a first device, where the first device is connected to a second device in a wireless communication manner, and the video playing apparatus includes:
the first playing module is used for displaying a video playing interface on the first equipment;
a first receiving module for receiving a first input;
and a first execution module configured to, in the case that the first input is an input for switching from the video playing interface to another interface, respond to the first input by displaying the other interface on the first device and sending video data of the video played by the video playing interface to the second device, so as to play the video on the second device.
In a fourth aspect, an embodiment of the present application provides another video playing apparatus, applied to a second device, where the second device is connected to the first device in a wireless communication manner, and the video playing apparatus includes:
a receiving module configured to receive video data sent by the first device, the video data being the video data of the video played by the video playing interface of the first device; the first device, upon receiving a first input while displaying the video playing interface, responds to the first input by displaying another interface and sending the video data to the second device in the case that the first input is an input for switching from the video playing interface to the other interface;
And the playing module is used for playing the video based on the video data.
In a fifth aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the video playing method according to the first or second aspect.
In a sixth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which, when executed by a processor, implement the steps of the video playing method according to the first or second aspect.
In a seventh aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the video playing method according to the first aspect or the second aspect.
In an eighth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the video playback method of the first or second aspects.
In the embodiments of the application, a video playing interface is displayed on the first device; a first input is received; and, in the case that the first input is an input for switching from the video playing interface to another interface, the first input is responded to by displaying the other interface on the first device and sending video data of the video played by the video playing interface to the second device, so as to play the video on the second device. The user can thus watch the video played by the second device while operating the other interface on the first device. Because the video is not played through a small floating window, the display area is not reduced, the display interface remains easy for the user to view, partial occlusion of the display interface is avoided, and the user's efficiency in handling tasks on the other interface is improved.
Drawings
Fig. 1 is a schematic diagram of AR technology in the prior art;
Fig. 2 is a schematic diagram of the steps of a video playing method according to an embodiment of the present application;
Fig. 3 is a schematic diagram of playing a video in a floating window in the prior art;
Fig. 4 is a schematic diagram of a user viewing a video playing interface and another interface through a second device according to an embodiment of the present application;
Fig. 5 is a schematic diagram of the steps of another video playing method according to an embodiment of the present application;
Fig. 6 is a schematic diagram of a first gesture motion according to an embodiment of the present application;
Fig. 7 is a schematic diagram of a second gesture motion according to an embodiment of the present application;
Fig. 8 is a schematic flow chart of switching from the video playing interface to another interface according to an embodiment of the present application;
Fig. 9 is a schematic flow chart of the video playing interface leaving the user's field of view according to an embodiment of the present application;
Fig. 10 is a schematic structural diagram of a video playing apparatus according to an embodiment of the present application;
Fig. 11 is a schematic structural diagram of another video playing apparatus according to an embodiment of the present application;
Fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 13 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application. It is apparent that the described embodiments are some, but not all, of the embodiments of the present application; all other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application fall within the scope of protection of the present application.
The terms "first", "second", and the like in the description and in the claims are used to distinguish between similar objects and do not necessarily describe a particular sequence or chronological order. It is to be understood that the data so used may be interchanged where appropriate, so that embodiments of the present application may be implemented in sequences other than those illustrated or described herein. The objects identified by "first", "second", etc. are generally of one type, and the number of objects is not limited; for example, the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally means that the associated objects are in an "or" relationship.
The video playing method provided by the embodiment of the application is described in detail below through specific embodiments and application scenes thereof with reference to the accompanying drawings.
Augmented reality (AR) technology seamlessly integrates real-world information with virtual-world information: physical information that is difficult to experience within a certain region of space and time in the real world, such as visual information, sound, taste, and touch, is simulated and superimposed by computer technology, so that virtual information is applied to the real world and perceived by the human senses, achieving a sensory experience that exceeds reality. The real environment and virtual objects are superimposed in real time onto the same picture or space and exist simultaneously; the scenes and people seen are partly real and partly virtual. AR technology thus brings virtual information into the real world.
AR technology is developing rapidly in the field of consumer electronics, and user demand for AR smart terminal products is increasing. AR technology can play an important role in application scenarios such as games, bringing innovative experiences to users.
As shown in fig. 1, the micro-projection system 101 of the AR glasses 10 projects virtual information such as text and images onto the optical display 102, which then directs the virtual information into the human eye through reflection and total reflection, while real information from the real world enters the human eye directly through the optical display 102. The user thus sees the virtual and the real superimposed, realizing augmented reality.
The embodiment of the application provides a video playing method, which is applied to a first device, wherein the first device is in wireless communication connection with a second device, as shown in fig. 2, and the video playing method comprises the following steps:
step S1, a video playing interface is displayed on the first device.
Optionally, the first device may be a mobile communication device such as a mobile phone or a tablet computer, and the video playing interface may be the screen displayed when a video application on the first device plays a video. The second device may be an electronic device that applies AR technology, such as AR glasses or an in-vehicle AR device. These are examples only, and the embodiments of the present application are not limited thereto.
Optionally, the wireless communication connection between the first device and the second device may be established in advance. Specifically, the first device may send a connection instruction to the second device to instruct the second device to establish a connection with the first device; for example, the first device sends a Bluetooth connection instruction to the second device to instruct it to establish a Bluetooth connection with the first device.
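This connection-instruction exchange can be sketched minimally as below. The message fields (`transport`, `device_id`) and the class names are assumptions for illustration; the patent only states that an instruction such as a Bluetooth connection instruction is sent.

```python
from dataclasses import dataclass


@dataclass
class ConnectInstruction:
    # Hypothetical instruction format sent by the first device;
    # field names are assumed, not specified by the source.
    transport: str   # e.g. "bluetooth"
    device_id: str   # identifier of the first device


class SecondDevice:
    def __init__(self):
        self.connected_to = None

    def handle_instruction(self, instr):
        # Establish the requested wireless link with the first device.
        if instr.transport == "bluetooth":
            self.connected_to = instr.device_id
            return True
        return False


glasses = SecondDevice()
accepted = glasses.handle_instruction(ConnectInstruction("bluetooth", "phone-01"))
```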
Step S2, a first input is received.
In this embodiment of the present application, the first input may be an operation performed by the user on the first device that is playing the video, causing the video playing interface to change. For example, the first input may be an operation by which the user exits the video playing interface; the embodiments of the present application are not limited in this respect. Specifically, the first input may be a touch-screen operation such as a click or a slide, or a control operation such as a voice input or a gesture input.
In this embodiment of the present application, the first device may receive the user's touch-screen operation through the touch screen, capture the user's gesture through a camera, or receive the user's voice input through an audio device; the embodiments of the present application are not limited in this respect.
Step S3: in the case that the first input is an input for switching from the video playing interface to another interface, responding to the first input by displaying the other interface on the first device and sending video data of the video played by the video playing interface to the second device, so as to play the video on the second device.
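The handoff logic of steps S1-S3 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: `FakeLink`, the input dictionary format, and the return values stand in for the real wireless link and input events.

```python
class FakeLink:
    """Stand-in for the wireless (e.g. Bluetooth) link to the second device."""
    def __init__(self):
        self.sent = []

    def send(self, payload):
        self.sent.append(payload)


class FirstDevice:
    def __init__(self, link):
        self.link = link
        self.current_interface = "video_playing"  # step S1: interface shown

    def on_first_input(self, user_input, video_data):
        # Step S3: if the first input switches away from the video playing
        # interface, show the other interface locally and hand the video
        # data off to the second device over the wireless link.
        if user_input.get("type") == "switch_interface":
            self.current_interface = user_input["target"]
            self.link.send(video_data)
            return "handed_off"
        return "ignored"


link = FakeLink()
phone = FirstDevice(link)
result = phone.on_first_input(
    {"type": "switch_interface", "target": "chat"}, b"video-bytes")
```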
In this embodiment of the present application, the first device may include an interface switching control. In this case the first input, that is, the input for switching from the video playing interface to another interface, may be, for example, the user's triggering operation on the interface switching control; in response to this triggering operation, the first device switches from the video playing interface to the other interface and displays the other interface.
Optionally, the other interface may be a chat interface of social software, a mailbox interface of office software, an information display interface such as weather, clock, or calendar, an interface of short-video application software, and so on. These are examples only, and the embodiments of the present application are not limited thereto.
Optionally, switching from the video playing interface to the other interface may mean that the user moves the current video playing process to the background and brings the other interface to the foreground. Alternatively, the user may resize the video playing interface via split screen or a floating window while opening the other interface, so that the first device displays the resized video playing interface and the other interface at the same time; the embodiments of the present application are not limited in this respect. For example, as shown in fig. 3, in the prior art, when the user needs to operate on the chat interface 201 of social application software (APP), the video continues to play in a floating window 202; in this embodiment of the present application, this may also be regarded as switching from the video playing interface to another interface. It should be noted that when the user resizes the video playing interface via split screen or a floating window, the playback effect is worse than in the video playing window before the switch.
In this embodiment, the first device may send the video data of the video played by the video playing interface to the second device through a wireless communication connection, such as bluetooth, so that the second device plays the video based on the video data.
Optionally, the video data sent by the first device may be all of the video data of the video, or only the portion that has not yet been played at the moment the video playing interface is switched to the other interface; the embodiments of the present application are not limited in this respect.
In the embodiment of the application, a video playing interface is displayed on the first device; a first input is received; and, in the case that the first input is an input for switching from the video playing interface to another interface, the first input is responded to by displaying the other interface on the first device and sending video data of the video played by the video playing interface to the second device, so as to play the video on the second device. The user can thus watch the video played by the second device while operating the other interface on the first device. Because the video is not played through a small floating window, the display area is not reduced, the display interface remains easy for the user to view, partial occlusion of the display interface is avoided, and the user's efficiency in handling tasks on the other interface is improved.
Optionally, the method further comprises:
step S4, in the case that the first input is the detection result information sent by the second device, closing or suspending the video playing interface in response to the first input, and sending video data of the video played by the video playing interface to the second device so as to play the video on the second device; the detection result information is used for representing that the second device detects that the video playing interface leaves the visual field range of the user.
In this embodiment of the present application, the second device may detect whether the video playing interface is within the user's field of view. For example, the second device may detect, through the camera of the AR glasses, whether the video playing interface is within the camera's lens range, and accordingly determine whether the video playing interface is within the user's field of view. Alternatively, if the second device is an in-vehicle AR device, its camera may detect whether the first device is within the user's field of view, and thereby determine whether the video playing interface on the first device is within the user's field of view.
In this embodiment of the present application, the first device receives the detection result information sent by the second device upon detecting that the video playing interface has left the user's field of view, and may determine accordingly that the video playing interface has left the user's field of view. The second device may determine, upon detecting that the first device has left the user's field of view, that the video playing interface on the first device has also left the user's field of view, and thereupon send the detection result information to the first device.
Alternatively, the detection result information may be the time at which the video playing interface left the user's field of view. Upon receiving the detection result information, the first device may determine whether the video playing interface has left the user's field of view based on the departure time and the current time. Specifically, if the time difference between the departure time and the current time is less than a preset threshold, for example one minute, it may be determined that the video playing interface has left the user's field of view.
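The departure-time check can be expressed as a small predicate; the one-minute threshold comes from the example above, and timestamps are assumed to be in seconds on a shared clock.

```python
def interface_left_view(departure_time_s, current_time_s, threshold_s=60.0):
    # The detection result carries the departure time; the interface is
    # considered to have (just) left the user's field of view when the
    # elapsed time is below the preset threshold (one minute by default).
    return (current_time_s - departure_time_s) < threshold_s
```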
In this embodiment of the present application, in response to the first input, that is, in response to the detection result information sent by the second device, the first device may directly close the video playing interface, or pause the video played on the video playing interface, and send the video data of the video played by the video playing interface to the second device so as to play the video on the second device. Specifically, the first device may directly terminate the video playing process, or control the video playing APP to pause the video; the embodiments of the present application are not limited in this respect. If playback is paused, then when the user later watches the video on the first device again, the video resumes from the paused progress, making it convenient for the user to choose how the video is played.
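The pause-or-close choice and the resume-from-progress behavior can be sketched as below; the state names and the stored progress value are illustrative assumptions.

```python
class VideoPlayer:
    def __init__(self):
        self.state = "playing"
        self.progress_s = 42.0  # arbitrary current playback position

    def on_detection_result(self, mode="pause"):
        # On receiving the detection result information, the first device
        # either pauses the video or closes the playing process entirely.
        self.state = "paused" if mode == "pause" else "closed"

    def resume(self):
        # Pausing preserves the progress, so watching on the first device
        # again continues from the same position.
        if self.state == "paused":
            self.state = "playing"
        return self.progress_s


player = VideoPlayer()
player.on_detection_result(mode="pause")
```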
In this embodiment, the operation of sending the video data of the video played by the video playing interface to the second device to play the video on the second device may refer to the description related to step S3, which is not repeated herein.
In the embodiment of the application, in the case that the first input is the detection result information sent by the second device, the video playing interface is closed or paused in response to the first input, and video data of the video played by the video playing interface is sent to the second device so as to play the video on the second device; the detection result information indicates that the second device has detected that the video playing interface has left the user's field of view. The user can therefore continue watching the video through the second device when the video playing interface of the first device leaves the user's field of view, so that no content of the video is missed, continuous viewing of the played video is ensured, and the video playing effect is improved.
Optionally, the method further comprises:
step S5, receiving a second input.
Step S6: outputting wearing reminding information in response to the second input. The second input is wearing state information sent by the second device upon detecting that it is in an unworn state, and the wearing reminding information prompts the user to wear the second device so as to simultaneously watch, through the second device, the playing interface on which the second device plays the video and the other interface displayed on the first device.
In an embodiment of the present application, the second device may be a wearable device such as AR glasses. The second device can determine, via a built-in sensor, whether it is currently being worn by the user, and send wearing state information to the first device upon detecting that it is in an unworn state.
In the embodiment of the application, the first device outputs the wearing reminding information in response to the second input. The wearing reminding information may be generated from the wearing state information; for example, the wearing state information may include location information of the second device, and the wearing reminding information generated from it then includes the name identifier and location of the second device. The wearing reminding information may be text information displayed on the current interface, or voice information played through a loudspeaker of the first device, so as to prompt the user to find and wear the second device according to the wearing reminding information.
In this embodiment, when the user wears the second device, the user can simultaneously watch the playing interface on which the second device plays the video and the other interface displayed by the first device. As shown in fig. 4, the mobile phone 20 and the AR glasses 30 are connected through the Bluetooth module 40; 201 is the chat interface of a social APP, 301 is the video playing interface of the AR glasses 30, and 302 is the camera of the AR glasses 30. Referring to fig. 1 and fig. 4, in this scenario, the information on the chat interface 201 displayed by the first device is real information and enters the human eye through the optical display 102, while the image information on the video playing interface 301 of the AR glasses 30 is virtual information, projected onto the optical display 102 and directed into the human eye by reflection or total reflection. The user can thus watch the video on the video playing interface 301 through the AR glasses 30 while operating on the chat interface 201, for example replying to a chat message. The camera 302 may detect whether the mobile phone 20 is within the user's field of view.
In an embodiment of the present application, the first device receives a second input and outputs wearing reminding information in response to the second input. The second input is the wearing state information sent by the second device upon detecting that it is currently in an unworn state, and the wearing reminding information is used for reminding the user to wear the second device. In this way, the user can conveniently be reminded to wear the second device, so that the user can watch, through the second device, both the playing interface on which the second device plays the video and the other interfaces displayed on the first device.
Optionally, in the case that the first input is an input for switching from the video playing interface to another interface, the playing interface when the second device plays the video is not overlapped with the other interface.
In the embodiment of the application, when switching from the video playing interface to other interfaces, the first device displays the other interfaces, the second device displays the video playing interface, and the position of the playing interface on which the second device plays the video can be adjusted through the second device so that the video playing interface does not overlap with the other interfaces on the first device. The gesture with which the user adjusts the playing interface can be captured through the camera of the second device, and the position of the video playing interface is adjusted accordingly, according to the moving direction and distance of the user's gesture, so that the video playing interface does not overlap with the other interfaces.
In the prior art, when a user switches to other interfaces, the video is played through a split screen or a floating window, so the video playing interface either becomes smaller or overlaps with the other interfaces, which interferes with the user's operations on the other interfaces and also degrades the video playing effect.
Optionally, in the case that the first input is detection result information sent by the second device, a playing interface when the second device plays the video is not overlapped with the user operation area.
In this embodiment of the present application, the user operation area may be a spatial area related to an operation currently performed by a user, for example, when the user learns to cook while watching a cooking video through the second device, the user cooking area may be used as the user operation area, and specifically, the user operation area may be a spatial area corresponding to a movement range of both hands of the user. This is by way of example only, and the embodiments of the present application are not limited thereto.
When the video playing interface leaves the user's field of view, the second device displays the video playing interface, and the user can perform other operations while watching the video. The position of the playing interface on which the second device plays the video can be adjusted through the second device, so that the video playing interface does not overlap with the user operation area. The user operation area may be captured by the camera of the second device. Further, it is judged whether the user operation area and the video playing interface have an overlapping part. When there is an overlapping part, the second device automatically adjusts the position of the video playing interface so that the video playing interface does not overlap with the user operation area.
In the embodiment of the application, the playing interface on which the second device plays the video does not overlap with the other interfaces, or does not overlap with the user operation area, so that watching the video and performing other operations do not interfere with each other. Therefore, the user can conveniently perform other operations while obtaining a better video playing effect through the playing interface of the second device.
Optionally, before step S3, the method further includes:
step S7, a control instruction is sent to the second device; the control instruction is used for instructing the second device to establish a virtual layer at a preset position and with a specified size, so as to display video frames included in the video data on the virtual layer.
Alternatively, the preset positions may be distinguished according to the user's posture, such as standing or sitting, each posture corresponding to a preset position. The preset position for any posture may be a fixed position at a certain distance and height in front of the user, for example, a position 0.5 m in front of the user and flush with eye height when sitting. If the second device is AR glasses and the first device is a mobile phone, the preset position may be a position 5 cm above the mobile phone. This is by way of example only, and the embodiments of the present application are not limited thereto.
Optionally, the specified size may be determined according to the size of the display screen of the first device, and specifically, the specified size may be N times the size of the display screen of the first device, where N is a natural number greater than or equal to 1, that is, the specified size is not less than the size of the playing interface of the first device. Alternatively, the specified size may be a predetermined size, for example, the size of the video playing interface of the first device is 7 inches, and the specified size may be 9 inches. This is by way of example only, and the embodiments of the present application are not limited thereto.
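The sizing rule above can be put in a few lines; this is an illustrative sketch under the stated assumption that the specified size is N times the first device's display size, and the function name is not from the patent:

```python
def specified_size(display_w: float, display_h: float, n: int = 1) -> tuple:
    """Return the virtual layer's size as n times the first device's
    display size; n must be a natural number >= 1, so the virtual
    playing interface is never smaller than the first device's."""
    if n < 1:
        raise ValueError("n must be a natural number >= 1")
    return (display_w * n, display_h * n)
```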
In this embodiment of the present application, the first device may send a control instruction to the second device through the wireless communication connection, such as a Bluetooth connection, established with the second device, so as to instruct the second device to establish the virtual layer at the preset position and with the specified size. The virtual layer refers to a virtual medium used by the second device to display a virtual picture. In this embodiment of the present application, the second device may project the video frame information in the video data onto the optical display and then send it to the human eye through reflection and total reflection, so that the human eye perceives a virtual picture on the virtual layer, that is, the video frames included in the video data are displayed at the preset position and with the specified size.
In the embodiment of the application, the control instruction for instructing the second device to establish the virtual layer at the preset position and the specified size is sent to the second device, so that the second device can be conveniently controlled to display the video frames included in the video data on the virtual layer, and the video is played through the second device for the user to watch.
It should be noted that, when the first device establishes a wireless communication connection, such as a Bluetooth connection, with the second device, the first device and the second device need to be within the allowed communication range; for example, the communication distance is typically 10 meters, and the first device and the second device may transmit instructions and data within that distance. Once the communication distance is exceeded, the first device can output prompt information to notify the user of the disconnection and prompt the user to adjust the distance between the first device and the second device, so as to restore a normal communication connection.
The embodiment of the application provides another video playing method, which is applied to a second device, wherein the second device is in wireless communication connection with a first device, and the video playing method comprises the following steps:
step A1, receiving video data sent by the first device; the video data is the video data of the video played by the video playing interface, which the first device sends to the second device when, while displaying the video playing interface, it receives a first input that is an input for switching from the video playing interface to other interfaces and, in response to the first input, displays the other interfaces.
Alternatively, the wireless communication connection of the second device with the first device may be a bluetooth connection, the second device receiving video data transmitted by the first device via bluetooth.
And step A2, playing video based on the video data.
In this embodiment of the present application, the second device may process the video data to obtain virtual image information, and project the virtual image information to an optical display of the second device through a micro-projection system of the second device to display the virtual image information, so as to play the video.
Optionally, the second device may adjust the size of the playing interface when playing the video, that is, the size of the virtual playing interface, so that the size of the virtual playing interface is not smaller than the size of the playing interface when playing the video on the first device.
In the embodiment of the application, the second device receives the video data sent by the first device, where the video data is the video data of the video played by the video playing interface, which the first device sends to the second device when, while displaying the video playing interface, it receives a first input for switching from the video playing interface to other interfaces and, in response, displays the other interfaces; the second device then plays the video based on the video data. In this way, the user can conveniently watch the video played by the second device while operating other interfaces on the first device. Because the video is not played through a floating small window, the display area is not reduced and the display interface is not partially blocked, which is convenient for the user to watch and improves the efficiency with which the user handles transactions on the other interfaces.
Optionally, the first device is further configured to close or pause the video playing interface in response to the first input if the first input is detection result information sent by the second device, and the method further includes:
and step A3, acquiring an image in the field of view of the user based on the camera of the second device.
Alternatively, the second device may be AR glasses on which a camera is mounted. When the user wears the AR glasses, an image within the field of view of a lens can be captured directly through the camera, that is, an image within the user's field of view is acquired, obtaining a target image.
Alternatively, the second device may be a vehicle-mounted second device provided with a camera. An image within a certain range in front of the user, that is, an image within the user's field of view, is acquired through the camera of the vehicle-mounted second device to obtain a target image. For example, an image of a sector-shaped area within 3 meters directly in front of the user may be captured. This is by way of example only, and the embodiments of the present application are not limited thereto.
And step A4, under the condition that the video playing interface is not detected from the image, determining that the video playing interface leaves the user visual field range, and sending the detection result information to the first device.
In this embodiment of the present application, the second device may detect whether a video playing interface exists in the collected target image in the user field of view. Specifically, whether an object conforming to the feature of the pre-stored video playing interface exists in the target image or not can be detected through an image analysis technology, so that whether the video playing interface exists in the target image or not is determined.
In this embodiment of the present application, the second device may determine that the video playing interface leaves the user field of view under the condition that the video playing interface is not detected from the image. The second device may send the detection result information to the first device via a wireless communication connection, such as a bluetooth connection, established with the first device. The detection result information characterizes that the video playing interface leaves the visual field range of the user.
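As a toy stand-in for the image-analysis step above (a real system would use template matching or an object detector; all names here are hypothetical), the decision and the detection-result message sent to the first device might look like:

```python
def interface_in_view(detected_objects: set, interface_id: str = "video_player") -> bool:
    """True if an object matching the pre-stored playing-interface
    features was found in the captured frame."""
    return interface_id in detected_objects

def detection_result(detected_objects: set) -> dict:
    """Build the message sent to the first device; it characterizes
    whether the playing interface has left the user's field of view."""
    return {"interface_left_view": not interface_in_view(detected_objects)}
```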
In the embodiment of the application, whether the video playing interface is in the user visual field range can be conveniently known by collecting the image in the user visual field range based on the camera of the second device. Under the condition that the video playing interface is not detected from the image, determining that the video playing interface leaves the user visual field range, and sending detection result information to the first device, the first device can conveniently close or pause the video playing interface according to the detection result information.
Optionally, the first device is further configured to receive a second input, and output wearing reminding information in response to the second input; the wearing reminding information is used for reminding a user to wear the second device so as to watch a playing interface when the second device plays the video and the other interfaces displayed on the first device through the second device at the same time, and the method further comprises the following steps:
and step A5, detecting whether the second equipment is currently in a wearing state.
In this embodiment of the present application, the second device may be a wearable device, for example, AR glasses, and may open a camera of the AR glasses, and determine, through a height, an angle, etc. of a field of view of the camera, whether the user wears the AR glasses. Alternatively, it may be determined that the user wears the second device by a sensor in the second device, such as a motion sensor or a temperature sensor, in case the pulse or body temperature of the user is detected. This is merely an illustration of the embodiments of the present application and is not intended to be limiting.
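A minimal sketch of the sensor-based check above, assuming the second device infers "worn" from a pulse or body-temperature reading; the thresholds and names are illustrative assumptions, not from the patent:

```python
from typing import Optional

def is_worn(pulse_bpm: Optional[float], body_temp_c: Optional[float]) -> bool:
    """Infer the wearing state: a plausible human pulse or a body
    temperature in the human range indicates the device is worn."""
    if pulse_bpm is not None and 30.0 < pulse_bpm < 220.0:
        return True
    if body_temp_c is not None and 34.0 <= body_temp_c <= 42.0:
        return True
    return False
```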
And step A6, transmitting the wearing state information to the first device when the second device detects that the second device is in the unworn state currently.
In this embodiment of the present application, in a case where it is determined that the user is not currently wearing the second device, the second device may send wearing state information to the first device through a wireless communication connection, such as a bluetooth connection, established with the first device.
In this embodiment, as shown in fig. 4, the AR glasses 30 may detect whether the AR glasses 30 are currently in a wearing state through the camera 302. When the user wears the AR glasses 30, the user can view the play interface 301 of the AR glasses 30 when playing the video, and at the same time, view and reply to the chat message on the chat interface 201 displayed on the mobile phone 20.
In the embodiment of the application, whether the second device is in the wearing state currently is detected; and sending wearing state information to the first device under the condition that the second device detects that the second device is in an unworn state currently. Therefore, the user can be conveniently reminded of wearing the second device, so that the user can continue watching the video through the second device while operating other interfaces of the first device.
Optionally, in the case that the first input is an input for switching from the video playing interface to another interface, the method further includes:
And step A7, when it is detected that the playing interface on which the video is played overlaps with the other interfaces, adjusting the display position of the playing interface so that the playing interface does not overlap with the other interfaces.
In this embodiment of the present application, when the first input is an input for switching from the video playing interface to another interface, the first device displays the other interface, and the second device displays the video playing interface. Optionally, the second device may detect, through the camera, whether the video playing interface of the second device coincides with other interfaces of the first device, and when detecting that the playing interface coincides with the other interfaces when the video is played, may adjust a display position of the playing interface through the second device. For example, the video playing interface may be controlled to move a certain distance to the outside of the first device, so that the video playing interface is not overlapped with other interfaces.
Optionally, the second device may start the gesture recognition function when the camera detects that the playing interface overlaps with other interfaces when the video is played, capture a gesture action of the user through the camera, and adjust a display position of the playing interface according to the gesture action. Referring to fig. 4, 6 and 7, 201 is a social APP chat interface of mobile phone 20, 301 is a video playing interface of AR glasses 30, 501 is a gesture action one of the user, and 502 is a gesture action two of the user.
As shown in fig. 6, when the user places a finger on the video playing interface 301 and makes a movement gesture, the camera 302 on the AR glasses 30 detects the gesture 501 and moves the video playing interface 301 to a new display position along the movement path of the gesture 501, so that the playing interface does not overlap with other interfaces.
As shown in fig. 7, when the user places fingers on the video playing interface 301 and makes a zooming gesture, the camera 302 on the AR glasses 30 detects the gesture 502 and zooms the video playing interface 301 in or out according to the range of the zoom-in or zoom-out action of the gesture 502, so that the playing interface does not overlap with other interfaces and meets the user's requirement on the size of the video playing interface 301.
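The move and zoom adjustments of figs. 6 and 7 reduce to simple geometry; the following is an illustrative model (all names assumed) in which the virtual playing interface is an axis-aligned rectangle adjusted by the recognized gesture:

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Rect:
    x: float  # center x of the virtual playing interface
    y: float  # center y
    w: float  # width
    h: float  # height

def apply_move(r: Rect, dx: float, dy: float) -> Rect:
    """Shift the playing interface along the gesture's movement path."""
    return replace(r, x=r.x + dx, y=r.y + dy)

def apply_zoom(r: Rect, scale: float) -> Rect:
    """Enlarge or shrink the playing interface about its center."""
    return replace(r, w=r.w * scale, h=r.h * scale)
```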
Optionally, the second device may capture, through the camera, a video playing control action of the user on the video playing interface, for example, pause, play, fast forward, fast backward, and the like, and analyze, through an image processing technology, a gesture action of the user, and further perform corresponding control on the played video according to the gesture of the user.
In the case that the first input is detection result information sent by the second device, the method further includes:
And step A8, detecting the current user operation area of the user based on the camera of the second device.
In this embodiment of the present application, when the video playing interface leaves the field of view of the user, the second device displays the video playing interface, and the user may perform other operations while watching the video. The second device may detect a current user operation area of the user through the camera. Specifically, a gesture action of a user can be captured through a camera, an image in the current user visual field range is collected, the operation range of the user is analyzed according to the gesture action and the image, and a user operation area is generated according to an analysis result. For example, the user operation area may be a cube area in front of the user.
And step A9, when the playing interface on which the video is played overlaps with the user operation area, adjusting the display position of the playing interface so that the playing interface does not overlap with the user operation area.
In the embodiment of the application, the current motion of the user can be captured through the camera of the second device, and the user operation area is determined according to the captured user motion. Further, whether the user operation area and the video playing interface have a coincident part is judged.
Optionally, when there is an overlapping portion, the position of the video playing interface may be adjusted by the second device; for example, the video playing interface may be controlled to move a certain distance to the outside of the operation area, so that the video playing interface does not overlap with the user operation area.
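The overlap judgment and automatic repositioning described above can be sketched with axis-aligned rectangles; the shift direction and margin are illustrative choices, not specified by the patent:

```python
def overlaps(a, b) -> bool:
    """True if two axis-aligned rectangles (x0, y0, x1, y1) share area."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def move_outside(play, area, margin: float = 0.1):
    """Shift the playing interface just past the operation area's
    right edge so the two rectangles no longer overlap."""
    shift = (area[2] + margin) - play[0]
    return (play[0] + shift, play[1], play[2] + shift, play[3])
```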
Optionally, when there is a coincident portion, the second device may activate a gesture recognition function, capture a gesture action of the user through the camera, and adjust a display position of the playing interface according to the gesture action. The gesture action includes moving a position or zooming in or out of the video playing interface, and the specific implementation manner may refer to the implementation manner of step A7, which is not described herein.
In this embodiment of the present application, when the first input is an input for switching from the video playing interface to other interfaces, the display position of the playing interface is adjusted; when the first input is the detection result information sent by the second device, the current user operation area is detected based on the camera of the second device, and the display position of the playing interface is adjusted when the playing interface overlaps with the user operation area. In this way, the display position of the video playing interface can be adjusted whenever it overlaps with other interfaces on the first device or with the user operation area, so that watching the video and performing other operations do not affect each other, the user can conveniently perform other operations, and a better video playing effect is obtained through the playing interface of the second device.
Optionally, step A2 may include the steps of:
and step A21, establishing a virtual layer at a preset position in a specified size, and displaying video frames included in the video data on the virtual layer.
In this embodiment of the present application, the second device establishes, at a preset position and with a specified size, a virtual layer for displaying a virtual picture. Further, the video frame information in the video data is projected onto the optical display of the second device and then sent to the human eye through reflection and total reflection, so that the human eye perceives a virtual picture on the virtual layer, that is, the video frames included in the video data are displayed at the preset position and with the specified size.
In the embodiment of the application, the virtual layer is established at a preset position and with a specified size, and the video frames included in the video data are displayed on the virtual layer. In this way, the second device can conveniently play the video at the preset position and with the specified size, so that a better video playing effect is obtained.
Fig. 8 is a flow chart illustrating switching from the video playing interface to other interfaces according to the embodiment of the present application. As shown in fig. 8, the mobile phone and the AR glasses interact through a Bluetooth connection while the video is played, and the mobile phone does not interact with the AR glasses when no picture switching is detected. When the mobile phone detects that the user switches the APP while playing a video, it sends an instruction to the AR glasses to instruct them to establish a virtual layer, and transmits, frame by frame, the playing picture information of the video source selected by the user, such as the video currently being played on the mobile phone. The AR glasses display a virtual picture of a default size at a default position based on the virtual layer, and present the video information on that virtual picture to the user. In addition, the camera of the AR glasses is started, the gesture recognition function is activated, and the user's gesture actions are monitored in real time; when the user is detected making a specified gesture action, the position, size, and the like of the virtual picture are adjusted according to the gesture.
Fig. 9 is a schematic flow chart of the video playing interface leaving the user's field of view according to an embodiment of the present application. As shown in fig. 9, when the mobile phone plays a video and the user wears the AR glasses, the camera of the AR glasses is opened to identify whether the mobile phone is within the user's line of sight in the real world; if it is, the mobile phone continues playing the video. It should be noted that, when wearing AR glasses, the user's eyes still receive real information from the real world, so the experience differs very little from not wearing them. If the mobile phone is not in the user's line of sight, that is, the video playing interface has left the user's field of view, it is determined whether the user plays the video through the AR glasses. If the user chooses to play the video using the virtual picture of the AR glasses, the detailed implementation is as described with reference to fig. 8 and is not repeated here.
It should be noted that the interaction between the first device and the second device may be adaptively adjusted according to their use states. For example, when the battery of the mobile phone is low, the mobile phone plays the video through the AR glasses and only needs to send video data to the second device, which prolongs the standby time of the mobile phone while the user still obtains a better video playing effect. The second device does not need to stay connected to the mobile phone at all times: the connection is established and the video is played only when the mobile phone switches to other interfaces or leaves the user's field of view, and the user can disconnect the second device when returning to the mobile phone to watch the video, which reduces the power consumption of the second device and saves battery to improve endurance.
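The adaptive policy above can be summarized as a small decision function; the 20% battery threshold and all names are illustrative assumptions, not values from the patent:

```python
def should_offload(battery_pct: int, switched_interface: bool, phone_in_view: bool) -> bool:
    """Decide whether the phone should hand video playback off to the
    glasses: low battery, a switch to another interface, or the phone
    leaving the user's field of view all trigger the handoff."""
    return battery_pct < 20 or switched_interface or not phone_in_view
```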
The embodiment of the application provides a video playing method, which is applied to a system, wherein the system comprises a first device and a second device, and the video playing method comprises the following steps:
the first device displays a video playing interface, receives a first input, responds to the first input when the first input is the input of switching from the video playing interface to other interfaces, displays the other interfaces on the first device, and sends video data of video played by the video playing interface to the second device so as to play the video on the second device;
the second device receives the video data sent by the first device and plays the video based on the video data.
For the implementation steps of this video playing method, reference may be made to the implementation steps of the video playing methods described above, which are not repeated here. This video playing method has the same advantages over the prior art as the video playing methods described above, which are likewise not detailed here.
The embodiment of the application provides a video playing apparatus, which is applied to a first device. As shown in fig. 10, the video playing apparatus 60 includes:
A first playing module 601, configured to display a video playing interface on the first device;
a first receiving module 602 for receiving a first input;
and the first execution module 603 is configured to, in response to the first input, display other interfaces on the first device and send video data of a video played by the video playing interface to the second device, so as to play the video on the second device, where the first input is an input for switching from the video playing interface to the other interfaces.
Optionally, the apparatus 60 further includes:
the second execution module is used for responding to the first input, closing or suspending the video playing interface and sending video data of the video played by the video playing interface to the second device so as to play the video on the second device when the first input is the detection result information sent by the second device; the detection result information is used for representing that the second device detects that the video playing interface leaves the visual field range of the user.
Optionally, the apparatus 60 further includes:
a second receiving module for receiving a second input;
The output module is used for responding to the second input and outputting wearing reminding information; the second input is wearing state information sent by the second device when the second device is detected to be in an unworn state, and the wearing reminding information is used for prompting a user to wear the second device so as to simultaneously watch a playing interface when the second device plays the video and the other interfaces displayed on the first device through the second device.
Optionally, in the case that the first input is an input for switching from the video playing interface to another interface, the playing interface when the second device plays the video is not overlapped with the other interface;
and in the case that the first input is the detection result information sent by the second device, the playing interface on which the second device plays the video does not overlap with the user operation area.
Optionally, the apparatus 60 further includes:
a sending module, configured to send a control instruction to the second device before the first executing module 603 sends video data of the video played by the video playing interface to the second device; the control instruction is used for indicating the second equipment to establish a virtual layer at a preset position with a specified size so as to display video frames included in the video data on the virtual layer.
The video playing apparatus has the same advantages over the prior art as the video playing method described above, and details are not repeated herein.
The embodiment of the application provides another video playing apparatus, which is applied to a second device, where the video playing apparatus 70 includes:
a receiving module 701, configured to receive video data sent by the first device; the video data is the video data of the video played by the video playing interface sent to the second device, wherein the first device is used for displaying other interfaces when the first device receives a first input under the condition that the video playing interface is displayed, and responds to the first input when the first input is the input of switching from the video playing interface to the other interfaces;
a playing module 702, configured to play video based on the video data.
Optionally, the first device is further configured to close or pause the video playing interface in response to the first input if the first input is detection result information sent by the second device, and the apparatus 70 further includes:
an acquisition module, configured to acquire images within the visual field range of the user based on a camera of the second device;
and a determining module, configured to determine that the first device has left the visual field range of the user in the case that the first device is not detected in the images, and to send the detection result information to the first device.
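The cooperation of the acquisition and determining modules can be sketched as a simple check over captured frames. In this illustration, `detect_first_device` is a hypothetical stand-in for an on-device object detector; a real implementation would debounce the decision over time rather than act on a single batch of frames:

```python
from typing import Callable, List, Tuple

Box = Tuple[int, int, int, int]  # hypothetical (x, y, w, h) detection box

def check_left_field_of_view(frames: List[object],
                             detect_first_device: Callable[[object], List[Box]]) -> bool:
    """Return True if the first device is absent from every captured frame,
    i.e. it appears to have left the user's visual field range."""
    return all(len(detect_first_device(frame)) == 0 for frame in frames)

# Stub detector for illustration: pretend no first device is visible.
frames = ["frame0", "frame1", "frame2"]
left_view = check_left_field_of_view(frames, lambda frame: [])
if left_view:
    # The second device would now send the detection result information
    # back to the first device over the wireless communication connection.
    print("send detection_result to first device")
```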
Optionally, the first device is further configured to receive a second input and output wearing reminding information in response to the second input, where the second input is wearing state information sent by the second device when the second device detects that it is currently not worn, and the wearing reminding information is used for prompting the user to wear the second device, so that the user can simultaneously watch, through the second device, the playing interface on which the second device plays the video and the other interface displayed on the first device. The apparatus 70 further includes:
a first detection module, configured to detect whether the second device is currently in a worn state;
and a sending module, configured to send the wearing state information to the first device when the second device detects that it is currently not worn.
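The wearing-reminder handshake between the two devices reduces to a short message flow. A minimal sketch, where the message strings are assumptions used only to make the sequence concrete:

```python
def wearing_reminder_flow(is_worn: bool) -> list:
    """Sketch of the handshake: the second device reports its wearing
    state, and the first device reacts by prompting the user to wear it."""
    events = []
    if not is_worn:
        # Second device -> first device: wearing state information.
        events.append("second_device -> first_device: wearing_state=not_worn")
        # First device outputs the wearing reminding information.
        events.append("first_device: remind user to wear the second device")
    return events

print(wearing_reminder_flow(is_worn=False))
```

When the device is already worn, no reminder is produced and playback proceeds directly.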
Optionally, the apparatus 70 further includes:
a first adjusting module, configured to, in the case that the first input is an input for switching from the video playing interface to another interface, adjust the display position of the playing interface when it is detected that the playing interface overlaps the other interface while the video is played, so that the playing interface does not overlap the other interface.
Optionally, the apparatus 70 further includes:
a second detection module, further configured to detect the current user operation area of the user based on the camera of the second device in the case that the first input is the detection result information sent by the second device;
and a second adjusting module, further configured to adjust the display position of the playing interface in the case that the playing interface overlaps the user operation area while the video is played, so that the playing interface does not overlap the user operation area.
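Both adjusting modules reduce to the same geometric check: test whether two axis-aligned rectangles overlap and, if so, move the playing interface clear of the blocked region. A minimal sketch; the shift-below strategy is an assumption, since the application does not specify how the new position is chosen, and a real adjuster would also clamp to the display bounds:

```python
from typing import NamedTuple

class Rect(NamedTuple):
    x: int
    y: int
    w: int
    h: int

def overlaps(a: Rect, b: Rect) -> bool:
    # Standard axis-aligned rectangle intersection test.
    return (a.x < b.x + b.w and b.x < a.x + a.w and
            a.y < b.y + b.h and b.y < a.y + a.h)

def adjust_position(playing: Rect, blocked: Rect) -> Rect:
    """If the playing interface overlaps the blocked region (another
    interface, or the user operation area), shift it just below that
    region; otherwise leave it where it is."""
    if not overlaps(playing, blocked):
        return playing
    return playing._replace(y=blocked.y + blocked.h)

play = Rect(0, 0, 640, 360)
ops = Rect(100, 100, 300, 300)   # e.g. detected user operation area
moved = adjust_position(play, ops)
print(overlaps(moved, ops))  # → False
```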
Optionally, the playing module 702 is specifically configured to establish a virtual layer of a specified size at a preset position, and display the video frames included in the video data on the virtual layer.
The video playing apparatus has the same advantages over the prior art as the video playing method described above, which are not repeated here.
The video playing device in the embodiment of the application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device may be a mobile electronic device or a non-mobile electronic device. By way of example, the mobile electronic device may be a cell phone, tablet computer, notebook computer, palm computer, vehicle-mounted electronic device, wearable device, ultra-mobile personal computer (ultra-mobile personal computer, UMPC), netbook or personal digital assistant (personal digital assistant, PDA), etc., and the non-mobile electronic device may be a server, network attached storage (Network Attached Storage, NAS), personal computer (personal computer, PC), television (TV), teller machine or self-service machine, etc., and the embodiments of the present application are not limited in particular.
The video playing device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android operating system, an ios operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present application.
The video playing device provided in the embodiment of the present application can implement each process implemented by the embodiments of the methods of fig. 2 to 9, and in order to avoid repetition, a detailed description is omitted here.
Optionally, as shown in fig. 12, the embodiment of the present application further provides an electronic device 80, including a processor 801 and a memory 802, where the memory 802 stores a program or instructions that can run on the processor 801. When executed by the processor 801, the program or instructions implement the steps of the above video playing method embodiments and can achieve the same technical effects, which are not repeated here to avoid repetition.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 13 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 90 includes, but is not limited to: radio frequency unit 901, network module 902, audio output unit 903, input unit 904, sensor 905, display unit 906, user input unit 907, interface unit 908, memory 909, and processor 910.
Those skilled in the art will appreciate that the electronic device 90 may further include a power source (e.g., a battery) for powering the various components, and the power source may be logically connected to the processor 910 through a power management system, so as to perform functions such as charge management, discharge management, and power consumption management through the power management system. The electronic device structure shown in fig. 13 does not constitute a limitation of the electronic device, and the electronic device may include more or fewer components than shown, combine certain components, or use a different arrangement of components, which are not described in detail herein.
It should be appreciated that in embodiments of the present application, the input unit 904 may include a graphics processor (Graphics Processing Unit, GPU) 9041 and a microphone 9042, where the graphics processor 9041 processes image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 907 includes at least one of a touch panel 9071 and other input devices 9072. The touch panel 9071 is also referred to as a touch screen, and may include two parts: a touch detection device and a touch controller. Other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 909 may be used to store software programs as well as various data. The memory 909 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, application programs or instructions (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like. Further, the memory 909 may include a volatile memory or a nonvolatile memory, or the memory 909 may include both volatile and nonvolatile memories. The nonvolatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be random access memory (Random Access Memory, RAM), static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), or direct rambus RAM (DRRAM). The memory 909 in embodiments of the present application includes, but is not limited to, these and any other suitable types of memory.
The processor 910 may include one or more processing units. Optionally, the processor 910 integrates an application processor and a modem processor, where the application processor mainly processes operations involving the operating system, user interface, application programs, etc., and the modem processor mainly processes wireless communication signals, such as a baseband processor. It will be appreciated that the modem processor may alternatively not be integrated into the processor 910.
The embodiment of the present application further provides a readable storage medium, where a program or instructions are stored on the readable storage medium. When executed by a processor, the program or instructions implement each process of the above video playing method embodiments and can achieve the same technical effects, which are not repeated here to avoid repetition.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or instructions to implement each process of the above video playing method embodiments and achieve the same technical effects, which are not repeated here to avoid repetition.
It should be understood that the chip referred to in the embodiments of the present application may also be called a system-on-chip, a chip system, or a system-on-a-chip, etc.
The embodiments of the present application further provide a computer program product stored in a storage medium, where the program product is executed by at least one processor to implement the respective processes of the above video playing method embodiments and achieve the same technical effects, which are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprises a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, they may also be performed in a substantially simultaneous manner or in the reverse order. For example, the described methods may be performed in an order different from that described, and various steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solutions of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk), including several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those of ordinary skill in the art without departing from the spirit of the present application and the scope of the claims, which are also within the protection of the present application.

Claims (10)

1. A video playing method, applied to a first device, where the first device is connected to a second device in a wireless communication manner, the video playing method comprising:
displaying a video playing interface on the first device;
receiving a first input;
and under the condition that the first input is the input of switching from the video playing interface to other interfaces, responding to the first input, displaying the other interfaces on the first device, and sending video data of the video played by the video playing interface to the second device so as to play the video on the second device.
2. The method according to claim 1, wherein the method further comprises:
if the first input is detection result information sent by the second device, closing or pausing the video playing interface in response to the first input, and sending video data of the video played by the video playing interface to the second device so as to play the video on the second device;
the detection result information is used for representing that the second device detects that the video playing interface leaves the visual field range of the user.
3. The method according to claim 1, wherein the method further comprises:
receiving a second input;
outputting wearing reminding information in response to the second input;
wherein the second input is wearing state information sent by the second device when the second device detects that it is currently not worn, and the wearing reminding information is used for prompting the user to wear the second device, so as to simultaneously watch, through the second device, the playing interface when the second device plays the video and the other interface displayed on the first device.
4. The method according to any one of claims 1-3, wherein in the case that the first input is an input for switching from the video playing interface to another interface, the playing interface when the second device plays the video does not overlap the other interface;
and in the case that the first input is the detection result information sent by the second device, the playing interface when the second device plays the video does not overlap a user operation area.
5. A method according to any one of claims 1-3, wherein before said sending video data of video played by said video playing interface to said second device, said method further comprises:
sending a control instruction to the second device; the control instruction is used for instructing the second device to establish a virtual layer of a specified size at a preset position, so as to display video frames included in the video data on the virtual layer.
6. A video playing method, applied to a second device, where the second device is connected to a first device in a wireless communication manner, the video playing method comprising:
receiving video data sent by the first device; the video data is the video data, sent to the second device, of the video played by a video playing interface of the first device, where the first device is configured to display another interface in response to a first input in the case that the first input is received while the video playing interface is displayed and the first input is an input for switching from the video playing interface to the other interface;
playing the video based on the video data.
7. The method of claim 6, wherein the first device is further configured to close or pause the video playback interface in response to the first input if the first input is detection result information sent by the second device, the method further comprising:
acquiring an image within the visual field range of the user based on a camera of the second device;
and in the case that the video playing interface is not detected from the image, determining that the video playing interface has left the visual field range of the user, and sending the detection result information to the first device.
8. A video playback apparatus applied to a first device, the first device being connected to a second device by wireless communication, the video playback apparatus comprising:
the first playing module is used for displaying a video playing interface on the first equipment;
a first receiving module for receiving a first input;
and the first execution module is used for responding to the first input, displaying other interfaces on the first equipment and sending video data of the video played by the video playing interface to the second equipment so as to play the video on the second equipment when the first input is the input for switching from the video playing interface to the other interfaces.
9. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the video playback method of any one of claims 1 to 7.
10. A readable storage medium, wherein a program or instructions is stored on the readable storage medium, which when executed by a processor, implements the steps of the video playback method of any one of claims 1-7.
CN202211691362.0A 2022-12-26 2022-12-26 Video playing method, device, electronic equipment and readable storage medium Pending CN116033227A (en)


Publications (1)

Publication Number: CN116033227A; Publication Date: 2023-04-28; Family ID: 86090722


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination