CN114860178A - Screen projection method and electronic equipment - Google Patents

Screen projection method and electronic equipment

Info

Publication number
CN114860178A
CN114860178A (application CN202110061435.7A)
Authority
CN
China
Prior art keywords
electronic device
camera
preset
image
electronic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110061435.7A
Other languages
Chinese (zh)
Inventor
胡靓
徐杰
吴思举
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202110061435.7A
Priority to PCT/CN2022/071643 (published as WO2022152174A1)
Publication of CN114860178A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/1454: Digital output to display device; Cooperation and interconnection of the display device with other functional units involving copying of the display data of a local workstation or window to a remote workstation or window so that an actual copy of the data is displayed simultaneously on two or more displays, e.g. teledisplay
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26: Power supply means, e.g. regulation thereof
    • G06F 1/32: Means for saving power
    • G06F 1/3203: Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3206: Monitoring of events, devices or parameters that trigger a change in power modality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/26: Power supply means, e.g. regulation thereof
    • G06F 1/32: Means for saving power
    • G06F 1/3203: Power management, i.e. event-based initiation of a power-saving mode
    • G06F 1/3234: Power saving characterised by the action undertaken
    • G06F 1/3293: Power saving characterised by the action undertaken by switching to a less power-consuming processor, e.g. sub-CPU
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14: Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16: Human faces, e.g. facial parts, sketches or expressions

Abstract

The application provides a screen projection method, characterized by comprising the following steps: a first electronic device plays multimedia content while its camera runs in a low-power-consumption mode; in the low-power-consumption mode the camera collects at least one first image frame; when the at least one first image frame is detected to include a preset object, the camera of the first electronic device switches to a normal working mode and collects at least one second image frame; when the preset object is detected to be a preset second electronic device, the first electronic device establishes a connection with the second electronic device and sends the multimedia content to it, so that the second electronic device plays the multimedia content. This screen projection method helps to improve user experience.

Description

Screen projection method and electronic equipment
Technical Field
The present application relates to the field of electronic devices, and more particularly, to a method of projecting a screen and an electronic device.
Background
With the proliferation of consumer electronic devices, data sharing among devices has become more frequent. A mobile phone can play videos and can also display documents, pictures, application interfaces, or web pages. Because the display screen of a mobile phone is small, when content displayed on the phone needs to be seen by others, it can be projected onto another electronic device (such as a television, a computer, or another mobile phone) through screen projection technology. For example, a user can project a video from the phone onto a television, or project a song onto a speaker.
Disclosure of Invention
The application provides a screen projection method and an electronic device, which help to improve user experience.
In a first aspect, a screen projection method is provided, the method comprising: a first electronic device plays multimedia content while its camera runs in a low-power-consumption mode; the camera of the first electronic device collects at least one first image frame in the low-power-consumption mode; when the at least one first image frame is detected to include a preset object, the camera of the first electronic device switches to a normal working mode and collects at least one second image frame; when the preset object is detected to be a preset second electronic device, the first electronic device establishes a connection with the second electronic device and sends the multimedia content to the second electronic device, so that the second electronic device plays the multimedia content.
Because the first electronic device uses a low-power camera to collect images and triggers screen projection when a preset electronic device is detected in those images, a user who wants to share content between devices can interact with the target device directly. This solves the problem that a user cannot find the target device without knowing its name, and makes cross-device content sharing more direct and natural. It also makes sharing a one-step operation at any distance, short or long: content sharing is achieved without walking over to the target device.
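For illustration only, the trigger flow described above can be sketched as a simple state loop. Every function below is a hypothetical placeholder for device-side capability, not a disclosed Huawei or Android API:

```kotlin
enum class CameraMode { LOW_POWER, NORMAL }

// Hypothetical stubs standing in for vendor-specific capabilities.
fun setCameraMode(mode: CameraMode): Unit = TODO()
fun captureFrame(): ByteArray = TODO()
fun isPlayingMultimedia(): Boolean = TODO()
fun detectPresetObject(frame: ByteArray): Boolean = TODO() // "is there a device-like object?"
fun matchPresetDevice(frame: ByteArray): String? = TODO()  // mapping-table lookup, see below
fun connectAndProject(deviceName: String): Unit = TODO()

fun projectionTriggerLoop() {
    setCameraMode(CameraMode.LOW_POWER)              // camera idles at low power while content plays
    while (isPlayingMultimedia()) {
        val frame = captureFrame()                   // first image frame(s): low rate, low resolution
        if (!detectPresetObject(frame)) continue     // nothing device-like in view yet
        setCameraMode(CameraMode.NORMAL)             // switch to the normal working mode
        val hiRes = captureFrame()                   // second image frame(s): high resolution
        val device = matchPresetDevice(hiRes)        // is it a *preset* second device?
        if (device != null) {
            connectAndProject(device)                // connect and hand over the multimedia content
            break
        }
        setCameraMode(CameraMode.LOW_POWER)          // no match: drop back to low power
    }
}
```

The essential design point is that the expensive steps (high-resolution capture, full recognition, connection) run only after the cheap low-power detection has found something device-like.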
With reference to the first aspect, in some possible implementation manners of the first aspect, the camera is a front camera.
In most scenarios the front camera faces the user's face, so collecting images with the front camera in the low-power-consumption mode allows the camera already used for face recognition to be reused. By contrast, if the rear camera were used, the environment it captures changes too quickly, so the first electronic device would continuously collect different images and check each for preset devices, placing excessive computational pressure on the device. Using the front camera to collect images therefore improves user experience and saves power.
With reference to the first aspect, in some possible implementation manners of the first aspect, operating the camera of the first electronic device in the normal working mode includes: the camera of the first electronic device acquires image frames in a high-resolution mode.
After a preset electronic device is detected in the collected images, switching to the high-resolution mode for subsequent capture improves the success rate of image recognition.
With reference to the first aspect, in some possible implementation manners of the first aspect, the collecting, by the camera of the first electronic device, of at least one first image frame in the low-power-consumption mode includes: the camera collects the at least one first image frame at a frame rate of no more than 10 frames per second.
Capping the frame rate at 10 frames per second or below in the low-power-consumption mode reduces power consumption.
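The low-power-consumption mode described in this application is vendor camera hardware; on a stock Android Camera2 stack, the closest software analogue is capping the auto-exposure frame-rate range, as in this minimal sketch:

```kotlin
import android.hardware.camera2.CameraDevice
import android.hardware.camera2.CaptureRequest
import android.util.Range

// Builds a preview request whose auto-exposure target is capped at 10 fps.
// This only approximates the dedicated low-power capture mode described above.
fun buildLowPowerRequest(camera: CameraDevice): CaptureRequest.Builder {
    val builder = camera.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW)
    builder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, Range(1, 10)) // no more than 10 fps
    return builder
}
```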
With reference to the first aspect, in some possible implementation manners of the first aspect, the detecting that the preset object is a preset second electronic device includes: performing image recognition on the collected image and recognizing that the object in the collected image is the preset second electronic device.
Detecting the collected images through an image recognition algorithm improves the efficiency of image detection.
With reference to the first aspect, in some possible implementations of the first aspect, the image captured by the camera may be subjected to image recognition by a neural network computing processor of the first electronic device.
Using the neural network computing processor of the first electronic device to process the images captured by the camera enables fast, real-time recognition that works even without a network connection, which improves recognition efficiency.
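As a minimal sketch of NPU-accelerated on-device recognition, assuming a TensorFlow Lite classifier behind Android's NNAPI (the actual model, input layout, and class set are not disclosed; "model.tflite" and NUM_CLASSES are placeholders):

```kotlin
import org.tensorflow.lite.Interpreter
import org.tensorflow.lite.nnapi.NnApiDelegate
import java.io.File
import java.nio.ByteBuffer

const val NUM_CLASSES = 10 // assumed number of recognizable object classes

// Runs one camera frame through a TFLite classifier; the NNAPI delegate lets
// Android route supported operations to the NPU or DSP.
fun classifyFrame(frame: ByteBuffer): FloatArray {
    val options = Interpreter.Options().addDelegate(NnApiDelegate())
    Interpreter(File("model.tflite"), options).use { interpreter ->
        val scores = Array(1) { FloatArray(NUM_CLASSES) }
        interpreter.run(frame, scores) // frame must match the model's input shape
        return scores[0]               // per-class confidence, e.g. "television", "speaker"
    }
}
```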
With reference to the first aspect, in some possible implementations of the first aspect, the acquired image may be uploaded to a server for image recognition, where the server may recognize the image through a neural network computing processor.
Performing image recognition on the server's neural network computing processor exploits the server's computational advantage and reduces the computing load on the first electronic device.
With reference to the first aspect, in some possible implementation manners of the first aspect, the performing image recognition on the collected image and recognizing that the object in the collected image is a preset second electronic device may further include:
matching the image recognition result against an electronic device mapping table to match it to the preset second electronic device.
By establishing an electronic device mapping table and matching image recognition results against it, the probability of misjudgment caused by recognition errors can be reduced.
With reference to the first aspect, in some possible implementations of the first aspect, the electronic device mapping table includes:
an image of the preset second electronic device and device information of the preset second electronic device, where the device information includes the name or the device model of the second electronic device.
Recording the device information of the preset second electronic device in the mapping table makes it convenient for the electronic device to connect to the preset device.
With reference to the first aspect, in some possible implementation manners of the first aspect, the electronic device mapping table may be preset by the user in advance, generated by the electronic device according to the user's screen projection habits, or set by the manufacturer before the electronic device leaves the factory.
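A minimal sketch of what such a mapping table and lookup could look like, assuming the recognition step reduces the reference image to a class label; all field names, labels, and entries here are illustrative assumptions:

```kotlin
// One entry of the assumed electronic device mapping table.
data class DeviceEntry(
    val recognitionLabel: String, // matched against the image recognition result
    val deviceName: String,       // used later when searching for the device
    val deviceModel: String
)

val deviceMappingTable = listOf(
    DeviceEntry("television", "Living-room TV", "Vision S 55"),
    DeviceEntry("smart_speaker", "Bedroom speaker", "Sound X")
)

// A recognition result triggers projection only if it maps to a preset entry,
// which is what reduces misjudgment from one-off recognition errors.
fun matchLabelToDevice(label: String): DeviceEntry? =
    deviceMappingTable.firstOrNull { it.recognitionLabel == label }
```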
With reference to the first aspect, in some possible implementations of the first aspect, before the camera of the first electronic device acquires at least one first image frame in the low power consumption operation mode, the method further includes:
and the second processor of the local machine sends a first message to the first processor of the local machine to control the camera to acquire images.
The first electronic device processor may include a first processor and a second processor, and the first processor may be an auxiliary operation chip, or may also be a coprocessor or an auxiliary processor. By adopting the coprocessor to execute different tasks, the load of an application processor of the first electronic device can be reduced, so that the load of a main processor is reduced, the endurance time is further prolonged, and the low-power-consumption mode image can be received and identified at any time with lower power consumption.
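Purely as an illustration of this division of labor, the "first message" might be shaped as below; real AP-to-coprocessor transports (mailbox, shared memory, sensor-hub channel) are vendor specific, and nothing here is a public API:

```kotlin
// Assumed command set and message layout; all names are invented for illustration.
enum class CameraCommand { START_LOW_POWER_CAPTURE, STOP_CAPTURE }

data class FirstMessage(val command: CameraCommand, val cameraId: Int)

// The application processor (second processor) delegates capture-and-detect duty
// to the coprocessor (first processor) and can then sleep; the transport is
// abstracted as a plain callback here.
fun delegateToCoprocessor(send: (FirstMessage) -> Unit) {
    send(FirstMessage(CameraCommand.START_LOW_POWER_CAPTURE, cameraId = 0))
}
```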
With reference to the first aspect, in some possible implementation manners of the first aspect, before the first electronic device establishes a connection with the second electronic device, the method further includes: the first electronic device searches for and discovers the second electronic device.
With reference to the first aspect, in some possible implementation manners of the first aspect, the searching for and discovering of the second electronic device by the first electronic device includes one or more of the following: discovering a third electronic device through local area network broadcast and comparing the device information of the second electronic device with that of the third electronic device; discovering a third electronic device through Bluetooth and making the same comparison; or discovering a third electronic device through Wi-Fi Direct and making the same comparison.
Depending on the scenario, the first electronic device can dynamically select among these discovery methods, which strengthens data transmission stability and improves search efficiency.
With reference to the first aspect, in some possible implementation manners of the first aspect, the establishing, by the first electronic device, a connection with the second electronic device includes: the first electronic device establishes a P2P connection with the second electronic device.
Establishing a P2P link and carrying the screen projection service on the P2P channel enhances interference resistance and improves the projection experience.
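On stock Android the P2P link corresponds to Wi-Fi Direct; a minimal connection sketch follows (the peer MAC address would come from the discovery step, and the runtime location/nearby-devices permission checks are omitted):

```kotlin
import android.content.Context
import android.net.wifi.p2p.WifiP2pConfig
import android.net.wifi.p2p.WifiP2pManager
import android.os.Looper

// Connects to a previously discovered peer over Wi-Fi Direct.
fun connectP2p(context: Context, peerAddress: String) {
    val manager = context.getSystemService(Context.WIFI_P2P_SERVICE) as WifiP2pManager
    val channel = manager.initialize(context, Looper.getMainLooper(), null)
    val config = WifiP2pConfig().apply { deviceAddress = peerAddress }
    manager.connect(channel, config, object : WifiP2pManager.ActionListener {
        override fun onSuccess() { /* start the projection stream on the P2P channel */ }
        override fun onFailure(reason: Int) { /* retry or fall back to the LAN path */ }
    })
}
```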
With reference to the first aspect, in some possible implementations of the first aspect, the multimedia content includes at least one of video, audio, or pictures.
In a second aspect, an electronic device is provided, comprising: a camera having a low-power-consumption mode and a normal working mode; at least one processor; and a memory storing instructions that, when executed by the at least one processor, cause the electronic device to perform the screen projection method of the first aspect or any of its possible implementations.
In a third aspect, a screen projection method is provided, the method comprising: a camera of a first electronic device runs in a low-power-consumption mode; the camera collects at least one first image frame in the low-power-consumption mode; when the at least one first image frame is detected to include a preset object, the camera switches to a normal working mode and collects at least one second image frame; when the preset object is detected to be a preset second electronic device, the first electronic device establishes a connection with the second electronic device; the first electronic device receives a multimedia file sent by the second electronic device, where the content of the multimedia file is the multimedia content being played by the second electronic device; and the first electronic device plays the multimedia file.
In a fourth aspect, an electronic device is provided, comprising: a camera having a low-power-consumption mode and a normal working mode; at least one processor; and a memory storing instructions that, when executed by the at least one processor, cause the electronic device to perform the screen projection method of the third aspect.
In a fifth aspect, a screen projection system is provided, comprising a first electronic device and a second electronic device. The camera of the first electronic device runs in a low-power-consumption mode and collects at least one first image frame; when the at least one first image frame is detected to include a preset object, the camera switches to a normal working mode and collects at least one second image frame; when the preset object is detected to be a preset second electronic device, the first electronic device establishes a connection with the second electronic device; the second electronic device sends the multimedia content it is playing to the first electronic device; and the first electronic device plays the multimedia content.
In a sixth aspect, a computer storage medium is provided that includes computer instructions that, when executed on an electronic device, cause the electronic device to perform a method of screen projection in any of the possible designs of any of the above aspects.
In a seventh aspect, a computer program product is provided which, when run on an electronic device, causes the electronic device to perform the screen projection method in any of the possible designs of the above aspects.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device provided in an embodiment of the present application.
Fig. 2 is a block diagram of a software structure of an electronic device according to an embodiment of the present application.
FIG. 3 is a schematic system block diagram provided by an embodiment of the present application.
Fig. 4A is a schematic diagram of a graphical user interface provided by an embodiment of the present application.
Fig. 4B is a schematic diagram of a graphical user interface provided by an embodiment of the present application.
Fig. 4C is a schematic diagram of a graphical user interface provided by an embodiment of the present application.
Fig. 5 is a schematic diagram of a graphical user interface provided by an embodiment of the present application.
Fig. 6 is a schematic diagram of a graphical user interface provided by an embodiment of the present application.
Fig. 7A-7C are schematic diagrams of a screen projection method provided by an embodiment of the present application.
Fig. 8 is a schematic diagram of a screen projection method provided by an embodiment of the present application.
Fig. 9 is a schematic diagram of a screen projection method provided in an embodiment of the present application.
Fig. 10-11 are schematic diagrams of graphical user interfaces provided by embodiments of the present application.
Fig. 12-14 are schematic flow charts provided by embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, in the description of the embodiments of the present application, "a plurality" means two or more.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present embodiment, "a plurality" means two or more unless otherwise specified.
The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Fig. 1 illustrates a schematic structural diagram of an electronic device 100 having a display screen and at least one camera (e.g., a front camera and/or a rear camera).
The electronic device 100 may include at least one of a mobile phone, a foldable electronic device, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a Personal Digital Assistant (PDA), an Augmented Reality (AR) device, a Virtual Reality (VR) device, an Artificial Intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, or a smart city device. The embodiment of the present application does not particularly limit the specific type of the electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) connector 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In some embodiments, the processors may include a first processor 1101 (e.g., a co-processor), a second processor 1102 (e.g., an application processor).
The processor can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 may be a cache memory. This memory may store instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thus improves system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc. The processor 110 may be connected to modules such as a touch sensor, an audio module, a wireless communication module, a display, a camera, etc. through at least one of the above interfaces.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The USB connector 130 is an interface conforming to the USB standard specification, and may be used to connect the electronic device 100 and a peripheral device, and specifically may be a Mini USB connector, a Micro USB connector, a USB Type C connector, and the like. The USB connector 130 may be used to connect a charger to charge the electronic device 100, or may be used to connect other electronic devices to transmit data between the electronic device 100 and other electronic devices. Or may be used to connect a headset through which audio stored in the electronic device is output. The connector can also be used to connect other electronic devices, such as VR devices and the like. In some embodiments, the standard specifications for the universal serial bus may be USB1.x, USB2.0, USB3.x, and USB 4.
The charging management module 140 is used for receiving charging input of the charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Bluetooth Low Energy (BLE), Ultra Wide Band (UWB), Global Navigation Satellite System (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency-modulate and amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other electronic devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 may implement display functions via the GPU, the display screen 194, and the application processor, among others. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or more display screens 194.
The electronic device 100 may implement a camera function through the camera module 193, the ISP, the video codec, the GPU, the display screen 194, the application processor AP, the neural network processor NPU, and the like.
The camera module 193 can be used to collect color image data and depth data of a subject. The ISP can be used to process color image data collected by the camera module 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera module 193.
In some embodiments, the camera module 193 may be composed of a color camera module and a 3D sensing module.
In some embodiments, the light sensing element of the camera of the color camera module may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats.
In some embodiments, the 3D sensing module may be a time-of-flight (TOF) 3D sensing module or a structured light 3D sensing module. Structured light 3D sensing is an active depth sensing technology whose basic components may include an infrared (IR) emitter, an IR camera module, and the like. The structured light 3D sensing module works by projecting light spots with a specific pattern (light coding) onto the photographed object, receiving the light-spot pattern on the object's surface, comparing it with the originally projected pattern, and calculating the three-dimensional coordinates of the object using the triangulation principle. The three-dimensional coordinates include the distance from the electronic device 100 to the photographed object. TOF 3D sensing is likewise an active depth sensing technology whose basic components may include an infrared emitter, an IR camera module, and the like. The TOF 3D sensing module works by calculating the distance (i.e., depth) between the module and the photographed object from the round-trip time of the infrared light, so as to obtain a 3D depth-of-field map.
The structured light 3D sensing module can also be applied to the fields of face recognition, motion sensing game machines, industrial machine vision detection and the like. The TOF 3D sensing module can also be applied to the fields of game machines, Augmented Reality (AR)/Virtual Reality (VR), and the like.
In other embodiments, the camera module 193 may also be composed of two or more cameras. The two or more cameras may include color cameras that may be used to collect color image data of the object being photographed. The two or more cameras may employ stereo vision (stereo vision) technology to acquire depth data of a photographed object. The stereoscopic vision technology is based on the principle of human eye parallax, and obtains distance information, i.e., depth information, between the electronic device 100 and an object to be photographed by photographing images of the same object from different angles through two or more cameras under a natural light source and performing calculations such as triangulation.
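For the common rectified two-camera case, the triangulation mentioned above reduces to the textbook disparity relation (general stereo geometry, not specific to this application):

```latex
% Depth of a point from a rectified stereo pair:
%   Z = depth, f = focal length (in pixels),
%   B = baseline between the two cameras,
%   d = disparity, the horizontal shift of the same point between the two images.
Z = \frac{f \, B}{d}
```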
In some embodiments, the electronic device 100 may include 1 or more camera modules 193. Specifically, the electronic device 100 may include 1 front camera module 193 and 1 rear camera module 193. The front camera module 193 can be generally used to collect the color image data and depth data of the photographer facing the display screen 194, and the rear camera module can be used to collect the color image data and depth data of the photographed object (such as people and scenery) facing the photographer.
In some embodiments, a CPU, GPU, or NPU in the processor 110 may process the color image data and depth data acquired by the camera module 193. In some embodiments, the NPU may identify the color image data collected by the camera module 193 (specifically, the color camera module) through a neural network algorithm, such as a convolutional neural network (CNN) algorithm, on which the skeletal point identification technique is based, to determine the skeletal points of the photographed person. The CPU or GPU may also run a neural network algorithm to determine skeletal points from the color image data. In some embodiments, the CPU, GPU, or NPU may further determine the figure of the photographed person (for example, body proportions and the fullness of body parts between skeletal points) from the depth data collected by the camera module 193 (which may be a 3D sensing module) and the identified skeletal points, determine body beautification parameters for the person, and finally process the captured image according to those parameters so that the person's body shape in the image is beautified. How to perform body beautification based on the color image data and depth data acquired by the camera module 193 will be described in detail in later embodiments and is not expanded here.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform on the frequency-bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in the external memory card. Or files such as music, video, etc. are transferred from the electronic device to the external memory card.
The internal memory 121 may be used to store computer executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a Universal Flash Storage (UFS), and the like. The processor 110 performs various functional methods or data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or output an audio signal for handsfree phone call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking the user's mouth near the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect wired headphones. The headphone interface 170D may be the USB connector 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many types of pressure sensors, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is below a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the icon, an instruction to create a new short message is executed.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, controls a lens to move in a reverse direction to counteract the shake of the electronic device 100, and thus achieves anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation based on barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip holster. When the electronic device is a foldable device, the magnetic sensor 180D may be used to detect its folding or unfolding, or the folding angle. In some embodiments, when the electronic device 100 is a flip phone, it can detect the opening and closing of the flip cover through the magnetic sensor 180D, and features such as automatic unlocking upon flip-open can then be set according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The electronic device 100 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects. When the intensity of the detected reflected light is greater than a threshold, the electronic device 100 can determine that an object is nearby; when it is less than the threshold, the electronic device 100 can determine that no object is nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the device close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L may be used to sense ambient light levels. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is obscured, such as when the electronic device is in a pocket. When the electronic equipment is detected to be shielded or in a pocket, part of functions (such as a touch function) can be in a disabled state to prevent misoperation.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature detected by the temperature sensor 180J exceeds a threshold, the electronic device 100 performs a reduction in the performance of the processor in order to reduce the power consumption of the electronic device to implement thermal protection. In other embodiments, electronic device 100 heats battery 142 when the temperature detected by temperature sensor 180J is below another threshold. In other embodiments, the electronic device 100 may boost the output voltage of the battery 142 when the temperature is below a further threshold.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, and the heart rate detection function is realized.
The keys 190 may include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. A SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or more SIM card interfaces. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a standard SIM card, and so on. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the cards may be of the same or different types. The SIM card interface 195 is also compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the electronic device 100.
Fig. 2 is a block diagram of a software structure of the electronic device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, from top to bottom: an application layer, an application framework layer, Android Runtime (ART) and native C/C++ libraries, a Hardware Abstraction Layer (HAL), and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, resource manager, notification manager, activity manager, input manager, and the like.
The Window Manager provides a Window Manager Service (WMS), which may be used for window management, window animation management, and surface management, and serves as a relay station for the input system.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example notifications of download completion or message alerts. The notification manager may also present notifications that appear in the system's top status bar in the form of a chart or scroll-bar text, such as a notification of an application running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light flashes.
The Activity Manager may provide an Activity Manager Service (AMS), which may be used for the start-up, switching, and scheduling of system components (e.g., activities, services, content providers, broadcast receivers), and for the management and scheduling of application processes.
The Input Manager may provide an Input Manager Service (IMS) that may be used to manage inputs to the system, such as touch screen inputs, key inputs, sensor inputs, and the like. The IMS takes the event from the input device node and assigns the event to the appropriate window by interacting with the WMS.
The Android runtime layer includes the core library and the Android runtime (ART). ART is responsible for converting bytecode into machine code, mainly by means of Ahead-Of-Time (AOT) compilation and Just-In-Time (JIT) compilation.

The core library mainly provides the basic functions of the Java class library, such as basic data structures, mathematics, IO, tools, database, and network functions. The core library provides the API that developers use for Android application development.

The native C/C++ libraries may include a plurality of functional modules, for example: a surface manager, a Media Framework, libc, OpenGL ES, SQLite, Webkit, etc.

The surface manager is used to manage the display subsystem and provides fusion of 2D and 3D layers for multiple applications. The media framework supports playback and recording of a variety of commonly used audio and video formats, as well as still image files, and may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG. OpenGL ES provides drawing and manipulation of 2D and 3D graphics in applications. SQLite provides a lightweight relational database for the applications of the electronic device 100.
The hardware abstraction layer runs in a user space (user space), encapsulates the kernel layer driver, and provides a calling interface for an upper layer.
The kernel layer is the layer between hardware and software. The kernel layer includes at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes an exemplary workflow of software and hardware of the electronic device 100 in conjunction with capturing a scene of a front camera shot.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and a timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation as a click operation whose corresponding control is the camera application icon as an example: the camera application calls an interface of the application framework layer to start, which in turn starts the camera driver by calling the kernel layer, and a still image or a video is captured through the camera 193.
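As an illustration of that last step, an application invoking a framework-layer interface that ultimately drives the camera through the HAL and kernel driver, the following Java snippet uses the standard Android intent mechanism. It is a generic sketch, not code from this patent.

```java
// Hypothetical sketch: an application-layer component asks the framework
// layer to start the camera and capture a still image; the framework in
// turn drives the camera through the HAL and the kernel camera driver.
import android.app.Activity;
import android.content.Intent;
import android.provider.MediaStore;

public class CameraLauncher {
    private static final int REQUEST_IMAGE_CAPTURE = 1;

    // Called once the framework has resolved the touch event to the
    // camera application icon's control.
    public static void startCapture(Activity activity) {
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        if (intent.resolveActivity(activity.getPackageManager()) != null) {
            activity.startActivityForResult(intent, REQUEST_IMAGE_CAPTURE);
        }
    }
}
```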
As users own more and more electronic devices, data sharing among electronic devices has become more frequent. For example, a user may project a video on a mobile phone to a television, or project audio to a speaker, bringing a different user experience.
Before proceeding with the explanation of the embodiments, terms related to screen projection will be introduced.
Source end and Sink end: for example, projecting resources on a mobile phone to a television involves two electronic devices, one being the mobile phone and the other being the television. In this process, the originating end of the projection (the mobile phone) may be referred to as the Source end, and the receiving end (the television) may be referred to as the Sink end.

Same-source screen projection: a screen projection mode in which the interface of the Source end and the interface of the Sink end are the same.

Different-source screen projection: a screen projection mode in which the interface of the Source end and the interface of the Sink end are different.

Mirror projection: a same-source projection mode that projects the screen of the Source end to the Sink end like a mirror. Specifically, the Source end may record its screen and send the recording result to the Sink end in real time, where it is played.

Resource projection: a different-source projection mode in which the Source end (for example, a mobile phone) transmits local resources (media resources such as audio, video, and pictures) to the Sink end in the form of a file or a stream, to be played by the Sink end's player. It can be divided into online resource projection and local resource projection. In online resource projection, the Source end sends the URL of an online resource to the Sink end, and the Sink end acquires the content of the media resource according to the URL and plays it. In local resource projection, the Source end sends the local resource itself to the Sink end, which plays it. During projection, the Source end can control and adjust the Sink end through playback controls such as play/pause, volume adjustment, brightness adjustment, and language adjustment; playback control means that the play state of the Sink end is controlled through the Source end (for example, a mobile phone) in the resource projection mode. A minimal sketch of online resource projection is given after these definitions.
Reverse control: in a mirror image screen projection scene, a Source end (for example, a mobile phone) screen is projected to a Sink end, and the projected Source end (for example, the mobile phone) screen can be directly operated at the Sink end through a mouse, a keyboard or a touch screen, so that the effect of operating the Source end is achieved.
Local area network connection: one connection mode of screen projection requires that a Source terminal and a Sink terminal are connected to the same local area network, and the local area network can be a wired local area network or a wireless local area network.
P2P (Peer-To-Peer) connection: also called device direct connection, a connection mode in which a WiFi (Wireless Fidelity) direct connection for data exchange is established between the Source end and the Sink end without passing through a router.
Commonly used screen-casting protocols include Miracast, DLNA, Cast+, AirPlay, Chromecast, etc.
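For illustration only, the following Java sketch shows what an online resource projection request might look like over a hypothetical JSON-over-TCP control channel. The real protocols listed above (DLNA, Cast+, AirPlay, and so on) each define their own message formats, so the host, port, and every field name here are assumptions.

```java
// Minimal sketch of online resource projection: the Source end pushes the
// URL of an online resource to the Sink end, which then fetches the media
// itself and plays it. The message format is an assumption.
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class ResourceProjection {
    public static void castUrl(String sinkHost, int sinkPort, String mediaUrl)
            throws Exception {
        String request = "{\"action\":\"play\",\"url\":\"" + mediaUrl + "\"}";
        try (Socket socket = new Socket(sinkHost, sinkPort);
             OutputStream out = socket.getOutputStream()) {
            out.write(request.getBytes(StandardCharsets.UTF_8));
        }
    }
}
```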
It should be understood that the screen projection method described in the present application can be applied to large-screen projection (for example, a mobile phone projecting to a television, a projector, or a set-top box), PC (personal computer) collaboration (for example, after the mobile phone discovers a device through near field communication, it establishes a P2P connection with the PC, projects the mobile phone screen in mirror mode, and supports reverse control, that is, controlling the mobile phone through the PC), pad (portable android device) collaboration, vehicle screen projection, and the like.
Fig. 3 is an exemplary system diagram of some embodiments. The system 30 may include an electronic device 100, an electronic device 102, an electronic device 103, and an electronic device 104. It should be understood that in some embodiments, the system 30 may include more or fewer electronic devices. It should be understood that in some embodiments, the electronic device 104 may be a wireless router, a piece of customer premises equipment (CPE) that can provide wireless network access, or any other wireless access point (WAP) or wired network access point. In some embodiments, the electronic devices 100, 102, and 103 may be connected to the same local area network (wired or wireless) through the electronic device 104.
The screen projection method provided by the embodiment of the application can be applied to electronic devices such as a mobile phone including a camera, a tablet personal computer, a wearable device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and the like, and the embodiment of the application does not limit the specific type of the electronic devices at all.
Fig. 4A, 4B, 4C, and 5 illustrate exemplary graphical user interfaces in which a user manually clicks a screen projection control, thereby triggering a screen projection.
The electronic device 100 displays a Graphical User Interface (GUI) on a screen, wherein the GUI includes a screen projection control. It should be understood that the graphical user interface may be an application interface of an application (e.g., the video application shown in FIG. 4B or FIG. 4C), or may be a drop-down menu bar of the electronic device 100 as shown in FIG. 4A (it should be understood that the term "drop-down menu bar" is used here for illustrative purposes and does not limit the name of the graphical user page). Illustratively, in FIG. 4C, the GUI4002 displays a video 402 (the video may be in a paused state or a playing state). It should be understood that the content displayed on the GUI4002 may also include controls 404 (e.g., volume adjustment controls, play/pause controls, fast forward controls, fast reverse controls), a video title 406, a video list 408, and the like. It should be understood that the content on the GUI4002 is illustrative.
The electronic device 100 detects that the screen-casting control is activated (e.g., the user clicks the screen-casting control), detects the network where the local device resides, and searches that local network for electronic devices that can receive a screen projection. It should be appreciated that the screen-casting controls can differ on different graphical user interfaces (e.g., the screen-casting control can be the "multi-screen interaction" control 412 shown in FIG. 4B, the "wireless screen casting" item shown in FIG. 4A, or the control 406 shown in FIG. 4C). Illustratively, the electronic device 100 is in the same local area network as the electronic devices 102 and 103; the electronic device 100 may search for the electronic devices 102 and 103 in that local area network and obtain their device information to display on the electronic device 100 (for example, in the GUI4004 in FIG. 4B, the device information of the electronic devices 102 and 103 in the local area network is displayed in a list on the electronic device 100), where the device information may include the names of the devices.
When the electronic device 100 detects that the user selects the electronic device 102, the electronic device 100 establishes a connection with the electronic device 102, and the electronic device 100 transmits content (at least one of video, text, or pictures) displayed locally to the electronic device 102 for display, or the electronic device 100 transmits audio played locally to the electronic device 102 for playing.
In some embodiments, before receiving the content displayed locally by the electronic device 100, the electronic device 102 may receive a connection request from the electronic device 100, and the electronic device 102 may pop up a window asking whether to approve the connection.
Optionally, after the electronic device 100 establishes a connection with the electronic device 102, the electronic device 102 may display only a part of the content displayed by the electronic device 100 (for example, the screen projection mode may be the resource projection mode). Illustratively, as shown in fig. 5, the graphical user interface displayed by the electronic device 100 includes a video 402 (the video may be in a paused state or a playing state), and after the electronic device 100 establishes a connection with the electronic device 102, the electronic device 102 may display only the video 402. Further, after the connection is established, the electronic device 102 may display only the video 402 while the electronic device 100 stops displaying it. Further, after the electronic device 100 stops displaying the video 402, the user may control the video 402 played on the electronic device 102 through the electronic device 100.
Optionally, after the electronic device 100 transmits the locally played audio to the electronic device 102 for playing, the electronic device 100 may stop playing the audio. Further, after the electronic device 100 stops playing the audio, the user may still control the audio played by the electronic device 102 from the electronic device 100.
Optionally, the electronic device 100 may transmit a locally displayed graphical user interface image to the electronic device 102, where the graphical user interface displayed by the electronic device 102 is the same as the electronic device 100 (for example, the screen projection mode may be a mirror projection mode). Further, after the user makes an instruction to switch the page on the electronic device 100, the electronic device 100 switches to display a second graphical user interface, and the electronic device 102 receives data sent by the electronic device 100 and switches to display a third graphical user interface, where the second graphical user interface is the same as the third graphical user interface. For example, the electronic device 100 switches to display the native desktop, and the electronic device 102 also switches to display the desktop of the electronic device 100.
However, triggering screen projection by clicking a control brings inconvenience when the user must manually select another electronic device for projection: the user needs to perform multi-step operations to share the content on the electronic device 100 with another electronic device. For example, the user first needs to click the screen-projection icon and then search the list of projectable electronic devices for the device to project to. A particularly annoying problem is that, when there are many projectable electronic devices, it is difficult for the user to find the desired device in the list, because it is hard to distinguish different electronic devices by name alone; in other words, when there are many similar devices, the user may not know which device name corresponds to the device to project to.

Secondly, for screen projection triggered from an application interface, additional controls need to be arranged on that interface. Since the layout of content in the limited screen space is often compact, a screen-projection control placed in that limited space is difficult for the user to find, which burdens the user. In addition, too many icons arranged on the application interface are unsuitable for an immersive page, which affects the user experience.
Fig. 6 is a schematic diagram illustrating the electronic device 100 and another electronic device connected through a near field to trigger the screen projection. Electronic device 100 may establish a near field connection with a near field tag of electronic device 102 (e.g., electronic device 102 is a television, and the near field tag is located within a remote control) to initiate the screen projection.
After the electronic device 100 establishes a connection with the electronic device 102, the electronic device 100 transmits the content displayed locally (video, text, pictures, etc.) or the audio played locally to the electronic device 102. In some embodiments, the electronic device 102 may receive a connection request from the electronic device 100 before receiving the content, and the electronic device 102 may pop up a window asking whether to approve the connection.
After the electronic device 100 detects the near-field connection and transmits the content (video, audio, text, picture, etc.) displayed locally to the electronic device 102, the display modes of the electronic device 100 and the electronic device 102 receiving the screen-projected content may refer to the content in the embodiments illustrated in fig. 4A, 4B, 4C, and 5.
However, near field interaction is better suited to short-distance application scenarios, such as a mobile phone projecting to a notebook computer, and is not well suited to scenarios where the mobile phone interacts at a distance with electronic devices such as a television or a speaker. For example, to project to a television, the user needs to find the television's remote control or walk up to the television to establish the near field connection. Similarly, to synchronize audio on the mobile phone to a speaker for playback, the user has to walk up to the speaker to establish a near field connection.
Fig. 7A-7C are a set of schematic diagrams illustrating the electronic device 100 capturing an image through a camera (e.g., a front camera or a rear camera), performing image recognition on the captured image, and triggering screen projection after recognizing a preset electronic device.
Image Recognition is a technique in which a computer, based on deep learning and big data, analyzes and understands an image to recognize targets and objects of various patterns. Based on deep learning, the visual content in an image can be accurately identified, various object, scene, and concept labels can be provided, and capabilities such as target detection and attribute recognition are available, so that image content is accurately recognized and understood, intelligent systems can be built, and service efficiency improved.
It should be understood that the preset electronic device described herein may be an electronic device in an electronic device mapping table that is pre-established by a user. In some embodiments, the preset electronic device may also be an electronic device in an electronic device mapping table preset when the electronic device 100 is shipped from a factory. The electronic device mapping table may also be generated by the electronic device 100 or the server according to the screen-projection use habit statistics of the user.
When a user shares content by screen projection, whether privacy might be leaked is a real concern. For example, a user may be willing to share content by screen projection at home or in an office, but in a mall, a station, or another public place, the user does not want to share content by screen projection.
Therefore, a user can customize the electronic devices that the user wants to project, for example, the user can establish an electronic device mapping table for the electronic devices such as a television a, a computer B, a stereo C, and a computer D in an office, and the electronic devices in the electronic device mapping table are all preset electronic devices.
Illustratively, the user may establish the electronic device mapping table by taking pictures of the electronic devices. The user can take pictures of different electronic devices and store the taken pictures locally. In some embodiments, the user can also input the model of an electronic device and download a picture of it from the server to establish the mapping table. In some other embodiments, the mapping table of the electronic devices may be obtained by training a neural network model. A data-structure sketch of such a table is given below.
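The following Java sketch shows one minimal way such a preset electronic device mapping table could be organized; the field names and the idea of keying entries by a recognition label are illustrative assumptions, not the patent's schema.

```java
// Hedged sketch of a preset electronic device mapping table: each entry
// associates a recognition label with device info. All names are
// illustrative assumptions.
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class DeviceMappingTable {
    public static final class DeviceInfo {
        final String name;   // e.g., "Living-room TV A"
        final String model;  // e.g., a model number entered by the user
        DeviceInfo(String name, String model) {
            this.name = name;
            this.model = model;
        }
    }

    // Keyed by the label produced by image recognition.
    private final Map<String, DeviceInfo> table = new HashMap<>();

    public void register(String recognitionLabel, String name, String model) {
        table.put(recognitionLabel, new DeviceInfo(name, model));
    }

    // Returns device info only if the recognized object is a preset device.
    public Optional<DeviceInfo> lookup(String recognitionLabel) {
        return Optional.ofNullable(table.get(recognitionLabel));
    }
}
```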
When the electronic device 100 turns on a camera (front or rear) and points it at an object, and the electronic device 100 recognizes through image recognition that the object belongs to a preset electronic device in the electronic device mapping table pre-established by the user, the electronic device 100 starts further verification (e.g., detecting whether the electronic device 100 and the recognized electronic device are in the same local area network). It should be noted that when the camera of the electronic device 100 faces an object, pictures can be captured continuously without the user having to photograph the object. In this case, the electronic device 100 may extract one or more image frames containing the object from the buffer and use them for image recognition.
When the electronic device 100 does not recognize any electronic device in the image collected by the camera, or the recognized electronic device does not belong to the preset electronic devices, the electronic device 100 does not start the verification. The following takes the electronic device 100 as an example for illustration. The electronic device 100 may use its local camera to capture images of objects in the environment (including electronic devices, tables, chairs, faces, etc.).
Step 1: the electronic device 100 captures an image through a camera.
In some embodiments, the electronic device 100 may use a local front-facing camera to continuously (e.g., periodically) capture images. Optionally, the front-facing camera may be an ultra-low power consumption camera carried by the electronic device 100 (the camera operates in a low power consumption mode). Ultra-low power consumption means that its power consumption is far lower than that of a conventional front or rear camera on an existing electronic device such as a mobile phone (for example, a 1-megapixel, 5-megapixel, or 10-megapixel camera), and image frames may ordinarily be acquired at a very low resolution to save power. Because the resolution of the ultra-low power consumption camera is very low, user privacy is not affected. It should be appreciated that the ultra-low power camera may switch between a low power mode (e.g., capturing low resolution images) and a normal mode (e.g., capturing high resolution images).

As shown in the hardware structure diagram of the electronic device 100 in fig. 1, the processor 110 of the electronic device 100 may include a first processor 1101 and a second processor 1102. The first processor 1101 may be an auxiliary computing chip, such as a coprocessor or an auxiliary processor, configured to reduce the load on the application processor of the electronic device 100 by performing predetermined processing tasks, such as processing image or video data or sensing and measuring motion data, thereby also prolonging battery life. The coprocessor in this embodiment can receive and recognize low-power-mode images at any time with low power consumption. It is understood that this is merely an exemplary description: depending on the specific type of the electronic device 100, the first processor 1101 and the second processor 1102 may both be ARM architecture processors, both X86 architecture processors, or processors of other architectures; furthermore, the first processor 1101 and the second processor 1102 may also be processing units with different performance and functions integrated on the same processor component. That is, the combination of different kinds of processors does not specifically limit this embodiment. The second processor 1102 may be an application processor (which may integrate or include an image information processing unit) serving as the main processor of the electronic device 100, responsible for driving the display screen 194, playing audio and video, running various applications, voice calls, data transmission, and the like. When the user presses the power key to start the terminal, the second processor 1102 starts to operate so that the electronic device 100 can be used normally; when the user turns off the power key, to reduce its power consumption the second processor 1102 enters a sleep state, and only the first processor 1101, with its lower power consumption, keeps operating normally.

The screen projection method can be applied to the electronic device 100 comprising the first processor, the second processor, and a front camera 1931, wherein the first processor 1101 and the second processor 1102 are each connected to the front camera 1931 through a front camera interface, and the first processor 1101, the second processor 1102, and the display screen 194 are electrically connected to one another.
Optionally, the second processor may send a first message to the first processor when detecting that the front-facing camera of the electronic device 100 is in the low power consumption mode.
It should be understood that the low power consumption mode described in this embodiment refers to a mode in which the normal photographing or video-recording function of the front-facing camera is not turned on, that is, the user has not entered the normal shooting mode of the front-facing camera through a photographing-related application. The non-low-power mode may be a mode in which the user has started the normal shooting function of the front-facing camera through a related shooting application, or in which the front-facing camera acquires images in a high resolution mode. It can be understood that when the electronic device 100 is turned on from the power-off state, the front camera is obviously still in the low power consumption mode (because the front camera must first be powered on before its normal shooting function can be turned on).
Therefore, when the second processor detects, upon going from the power-off state to the power-on state, that the front camera is in the low power consumption mode, or when the second processor detects that the front camera exits the non-low-power mode (normal operating mode) and enters the low power consumption mode (normal operating mode not started), the second processor sends the first message to the first processor to indicate that the front camera is in the low power consumption mode. It can also be understood that the first processor may be booted together with the second processor upon going from power-off to power-on, or may be booted by the second processor when the first message is sent to it.
Alternatively, the second processor may send the first message to the first processor when the electronic device 100 displays an application interface of some application (e.g., a browser application, a video call application, a game application, etc.). For example, the electronic device 100 detects that the locally displayed application interface is a video chat interface of an instant messaging application, and the second processor may send a first message to the first processor.
Optionally, when the electronic device 100 plays a multimedia file (e.g., video, audio, picture, etc.), and the front camera of the electronic device 100 is in the low power consumption mode, the second processor may send the first message to the first processor.
Optionally, the electronic device 100 is in an operating state, and the second processor may send the first message to the first processor.
And after receiving the first message, the first processor controls the front-facing camera to continuously acquire images.
It should be understood that the second processor may periodically send the first message to the first processor, which controls the front-facing camera to continuously acquire images.

For example, in fig. 7A, the electronic device 100 displays an application interface in which a video is being played, as in the GUI4002 in fig. 4C. The second processor may send a first message to the first processor. After receiving the first message, the first processor controls the front-facing camera to continuously acquire images. The user directs the front camera of the electronic device 100 toward the electronic device 102, so the images captured by the front camera include images of the electronic device 102.

Optionally, the front-facing camera may acquire images at a low frame rate (e.g., no greater than 10 frames per second (fps)). Acquiring images at a low frame rate reduces power consumption. The purpose of the coprocessor receiving images in the low power consumption mode is to judge, from those images, whether the currently acquired image contains a preset electronic device; a clearer image or a higher frame rate is not needed to judge whether the preset characteristic information is present, so wasting system power on unused image resources can be avoided. When the front-facing camera is an ultra-low power consumption camera (operating in the low power consumption mode) and the electronic device 100 detects an object resembling an electronic device in the acquired image frames, the electronic device 100 can start the high resolution mode of the ultra-low power consumption camera to acquire images, or start a conventional camera to acquire high resolution images, making image recognition and analysis easier.
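The division of labor just described, with the application processor noticing the camera state and the coprocessor driving low-rate capture, can be summarized in the following Java sketch. Every interface here is an illustrative stand-in; the actual inter-processor message and camera controls are platform-specific and not specified by this document.

```java
// Illustrative control flow only: Coprocessor and Camera are hypothetical
// interfaces standing in for the platform-specific coprocessor link and
// camera HAL.
public class LowPowerCaptureController {
    interface Coprocessor { void send(String message); }
    interface Camera {
        boolean inLowPowerMode();
        void captureContinuously(int framesPerSecond);
    }

    private final Coprocessor firstProcessor;
    private final Camera frontCamera;

    public LowPowerCaptureController(Coprocessor firstProcessor, Camera frontCamera) {
        this.firstProcessor = firstProcessor;
        this.frontCamera = frontCamera;
    }

    // Second processor (application processor) side: periodically check the
    // camera state and notify the first processor (coprocessor).
    public void poll() {
        if (frontCamera.inLowPowerMode()) {
            firstProcessor.send("FIRST_MESSAGE");
        }
    }

    // First processor side: on receiving the first message, drive the front
    // camera to acquire low-resolution frames at a low frame rate.
    public void onFirstMessage() {
        frontCamera.captureContinuously(10); // e.g., no more than 10 fps
    }
}
```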
In some embodiments, the electronic device 100 may carry only conventional front and rear cameras. The electronic device 100 may determine its posture through a gyroscope and/or an accelerometer. When the posture of the electronic device 100 is detected to be a specific posture, for example a nearly vertical posture (as in fig. 7A), it is presumed that a screen projection is likely intended. At this time, the electronic device 100 determines whether to turn on the front camera or the rear camera according to its own posture. Alternatively, the electronic device 100 may capture image frames using the front camera.
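As a sketch of the posture check described above, the following Java code uses the standard Android accelerometer API to detect a near-vertical hold; the threshold value and the callback wiring are assumptions for illustration.

```java
// Sketch of a near-vertical posture check using the Android sensor API;
// the NEAR_VERTICAL_THRESHOLD value is an assumed number, not from the patent.
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class PostureDetector implements SensorEventListener {
    private static final float NEAR_VERTICAL_THRESHOLD = 8.0f; // m/s^2, assumption

    public interface Callback { void onNearVertical(); }
    private final Callback callback;

    public PostureDetector(Context context, Callback callback) {
        this.callback = callback;
        SensorManager sm =
                (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        sm.registerListener(this,
                sm.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // When the device is held upright, gravity acts mostly along the
        // device's Y axis, so values[1] approaches ~9.8 m/s^2.
        if (event.values[1] > NEAR_VERTICAL_THRESHOLD) {
            callback.onNearVertical(); // candidate moment to start capture
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```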
In some embodiments, the electronic device 100 may also display an application interface of a camera application, detect a predefined gesture input by a user (e.g., the user activates a "take" control to take a picture), and take an image through a camera (front camera or rear camera).
In some embodiments, the electronic device 100 can provide a control on the settings page to turn the smart screen projection function on or off. As shown in fig. 10, the user may click the control 711 to turn the smart screen projection function of the electronic device on or off. It should be understood that "smart screen projection" is an illustrative name for the screen projection method in the embodiments described in fig. 7 to 9 of the present application. Further, the electronic device 100 may also provide different ways of image acquisition. Illustratively, the user may click the control 712 of fig. 10 to enter the image capture mode setup page, as shown in fig. 11. In fig. 11, the user can turn the low power consumption camera image capture mode on or off by clicking the control 811, and can turn the camera application photographing mode on or off by clicking the control 812. It should be understood that the elements on the graphical user interfaces of fig. 10 and 11 are illustrative; in other embodiments, they may contain more or fewer elements.
Step 2: electronic device 100 performs image recognition through recognition algorithm
After the electronic device 100 captures an image through a camera (e.g., an image captured by a front camera or an image captured after a user turns on a camera application), image recognition may be performed through a recognition algorithm.
Optionally, the electronic device 100 may rapidly process the image captured by the camera through a local neural-network (NN) computing processor, so as to perform image recognition.
Optionally, in some embodiments, the electronic device 100 may upload the captured image to a server for image recognition, where the server may recognize the image through a neural-network (NN) computing processor.
Optionally, when the user opens the camera application of the electronic device 100 and an image of the electronic device 102 in the environment is captured, the electronic device 100 may issue a request asking whether to perform image recognition.

Since the user does not necessarily open the camera application and take a picture in order to project a screen, providing the step in which "the electronic device 100 may issue a request asking whether to perform image recognition" lets the decision to project follow the user's needs, which can improve the user experience and increase user trust. Optionally, the user may set a control in the settings of the electronic device 100 (for example, the control 812 shown in fig. 11); when the control is in the on state and the electronic device 100 detects that the user opens the camera application to take a picture, image recognition is performed automatically.
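For the on-device recognition path, a minimal sketch using TensorFlow Lite as a stand-in for the NN computing processor might look as follows. The model file, input tensor shape, and label set are all assumptions; this is not the patent's actual recognition algorithm.

```java
// Minimal on-device recognition sketch; the model and labels are assumed.
import org.tensorflow.lite.Interpreter;
import java.io.File;

public class DeviceRecognizer {
    private final Interpreter interpreter;
    private final String[] labels; // e.g., {"television", "speaker", ...}

    public DeviceRecognizer(File modelFile, String[] labels) {
        this.interpreter = new Interpreter(modelFile);
        this.labels = labels;
    }

    // input: a preprocessed image tensor; returns the best-scoring label.
    public String classify(float[][][][] input) {
        float[][] scores = new float[1][labels.length];
        interpreter.run(input, scores);
        int best = 0;
        for (int i = 1; i < labels.length; i++) {
            if (scores[0][i] > scores[0][best]) best = i;
        }
        return labels[best];
    }
}
```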
And step 3: the electronic device 100 determines whether the electronic device in the captured image is a preset electronic device according to the image recognition result. For example, if the electronic device is recognized, the recognition result may include device information (including a device name, a model number, etc.) of the recognized electronic device.
In some embodiments, the electronic device 100 may first determine whether the recognition result is an electronic device according to the image recognition result, and if so, further determine whether the electronic device in the recognition result is a preset electronic device.
If the electronic device 100 recognizes that the captured image does not contain an electronic device, it may prompt "electronic device not recognized".
After the electronic device 100 identifies an electronic device in the image, it may match the device against the preset electronic device mapping table in the database. It should be understood that the preset electronic device mapping relationship in the database may be established in advance, or may be generated by the electronic device 100 or the server according to the user's screen-projection habits. It should be understood that the database may contain images of the preset electronic devices and their device information (e.g., the names and models of the electronic devices).
Optionally, the electronic device 100 may upload the image to a server, match the image with a database in the server, and the electronic device 100 receives a matching result from the server.
Optionally, the electronic device 100 may also match the image against a local database and determine whether the electronic device in the image is a preset electronic device.
After the preset electronic device is found in the preset electronic device mapping table through database matching, the device information (e.g., the name of the electronic device) of the preset electronic device can be obtained.
If, after database matching, the preset electronic device is not found in the preset electronic device mapping table, the electronic device 100 may ask the user whether to start the screen-casting sharing operation. Giving this choice to the user avoids situations in which projection cannot be started because of an image recognition error, can strengthen user trust, and improves the user experience.
For example, in fig. 7A, when the electronic device 100 is playing a video, a user directs the front camera of the electronic device 100 to the electronic device 102, and after the front camera of the electronic device 100 acquires an image of the electronic device 102, the image is identified by an identification algorithm, so as to determine whether the electronic device 102 is a preset electronic device.
In some other embodiments, the electronic device 100 may directly identify whether the object in the captured image is a preset electronic device. For example, the electronic device 100 recognizes the object in the image as a preset electronic device through image recognition and outputs a recognition result, which may include device information (including a device name, a model, etc.) of the recognized electronic device; there is no need to first judge whether the object in the image is an electronic device and then match it against the database.

Because the electronic device 100 determines through image recognition alone whether the object in the acquired image is the preset electronic device, without matching against a database, before deciding whether to trigger the next screen projection operation, recognition efficiency can be improved.
And step 3: the electronic device 100 searches for the preset electronic device.
The electronic device 100 determines that the electronic device in the captured image is a preset electronic device according to the image recognition result.
In some embodiments, the electronic device 100 may discover the electronic devices that can receive a screen projection through a transmission protocol, and search for the preset electronic device among them. The transmission protocols include, but are not limited to, the AllJoyn protocol, the DLNA (Digital Living Network Alliance) protocol, the AirPlay protocol, the Miracast protocol, HTTP (Hypertext Transfer Protocol), ChromeCast, the Cast+ protocol, etc.
Alternatively, the electronic device 100 may discover electronic devices that can receive a screen projection through a broadcast on the local area network. For example, in some embodiments, the electronic device 100 may be referred to as the Source end and an electronic device that can receive the projection as the Sink end. After the Sink device starts, it registers its device information on the soft bus. After the Source device starts device search, it sends a broadcast message into the local area network through a fixed port (e.g., 5684). When a device in the local area network receives the broadcast message and meets the requirements, it replies to the Source device by unicast. After receiving the reply, the Source device parses the message and displays the Sink electronic device on its interface. A sketch of this exchange is given after the discovery options below.
Alternatively, the electronic device 100 may discover electronic devices that can receive a screen projection via Bluetooth. For example, device discovery may be accomplished by the projection-capable electronic device listening, the electronic device 100 sending a broadcast, and the electronic device 100 receiving the reply message from the projection-capable electronic device.

Optionally, the electronic device 100 may discover electronic devices that can receive a screen projection through WiFi (Wireless Fidelity) direct connection.

Optionally, the electronic device 100 may dynamically select WiFi direct connection, Bluetooth, or the local area network to discover the electronic devices that can receive a screen projection.
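A minimal Java sketch of the LAN broadcast exchange described above follows. The probe and reply payloads are assumptions; only the fixed port number (5684) comes from the example above.

```java
// Sketch of Source-side discovery: broadcast a probe on a fixed port and
// collect unicast replies from Sink devices. Message contents are assumed.
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;
import java.net.SocketTimeoutException;
import java.nio.charset.StandardCharsets;

public class SinkDiscovery {
    private static final int DISCOVERY_PORT = 5684; // fixed port from the example

    public static void discover() throws Exception {
        byte[] probe = "SOURCE_DISCOVERY".getBytes(StandardCharsets.UTF_8);
        try (DatagramSocket socket = new DatagramSocket()) {
            socket.setBroadcast(true);
            // Source end: broadcast the search message into the LAN.
            socket.send(new DatagramPacket(probe, probe.length,
                    InetAddress.getByName("255.255.255.255"), DISCOVERY_PORT));

            // Collect unicast replies from Sink devices for a short window.
            socket.setSoTimeout(2000);
            byte[] buf = new byte[512];
            try {
                while (true) {
                    DatagramPacket reply = new DatagramPacket(buf, buf.length);
                    socket.receive(reply);
                    System.out.println("Sink found: " + reply.getAddress() + " "
                            + new String(reply.getData(), 0, reply.getLength(),
                                    StandardCharsets.UTF_8));
                }
            } catch (SocketTimeoutException end) {
                // Discovery window elapsed; replies were printed above.
            }
        }
    }
}
```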
The electronic device 100 searches the list of electronic devices that can receive a screen projection for the preset electronic device.
Alternatively, the electronic device 100 may search for a preset electronic device in the list of electronic devices that can accept screen projection through electronic device information (e.g., electronic device name).
If the electronic device 100 finds the preset electronic device, the electronic device 100 may establish a connection with it; if the electronic device 100 fails to find the preset electronic device, the user may be prompted that the search failed.
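A sketch of this name-based search follows, with an assumed record type standing in for the entries of the discovered-device list.

```java
// Sketch: pick the preset device out of the discovered, projection-capable
// devices by name. DiscoveredDevice is an assumed record type.
import java.util.List;
import java.util.Optional;

public class PresetDeviceSearch {
    public record DiscoveredDevice(String name, String address) { }

    public static Optional<DiscoveredDevice> find(
            List<DiscoveredDevice> discovered, String presetName) {
        return discovered.stream()
                .filter(d -> d.name().equals(presetName))
                .findFirst();
    }
}
```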
Step 5: the electronic device 100 establishes a connection with the preset electronic device.
Optionally, the electronic device 100 may establish a connection with a preset electronic device through a local area network.
Alternatively, the electronic device 100 may establish a connection with a preset electronic device through P2P.
By establishing a P2P link and carrying the screen projection service on the P2P channel, anti-interference capability can be enhanced and the screen projection experience improved.
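For the P2P option, the following sketch uses the standard Android Wi-Fi P2P API; the peer MAC address would come from discovery, and the success and failure handling shown is an illustrative assumption.

```java
// Sketch of establishing the device-direct (P2P) link with the Android
// Wi-Fi P2P API; peerMacAddress is a placeholder obtained from discovery.
import android.content.Context;
import android.net.wifi.p2p.WifiP2pConfig;
import android.net.wifi.p2p.WifiP2pManager;
import android.os.Looper;

public class P2pConnector {
    public static void connect(Context context, String peerMacAddress) {
        WifiP2pManager manager =
                (WifiP2pManager) context.getSystemService(Context.WIFI_P2P_SERVICE);
        WifiP2pManager.Channel channel =
                manager.initialize(context, Looper.getMainLooper(), null);

        WifiP2pConfig config = new WifiP2pConfig();
        config.deviceAddress = peerMacAddress;

        manager.connect(channel, config, new WifiP2pManager.ActionListener() {
            @Override public void onSuccess() { /* start the projection service */ }
            @Override public void onFailure(int reason) { /* fall back to LAN */ }
        });
    }
}
```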
and 5: electronic device 100 performs data transmission with a predetermined electronic device
After the electronic device 100 establishes a connection with the preset electronic device, data transmission is performed.
Optionally, the electronic device 100 may transmit all contents of a screen displayed locally to a preset electronic device for display (for example, the screen projection mode may be a mirror screen projection mode). Illustratively, as shown in fig. 7A, after the electronic device 100 establishes a connection with the preset electronic device 102, a graphical user interface displayed by the electronic device 102 is the same as a graphical user interface displayed by a display screen of the electronic device 100. Further, after the user makes an instruction to switch pages on the electronic device 100, the electronic device 100 switches pages, and the electronic device 102 also switches the displayed screen.
Optionally, the electronic device 100 may transmit a part of the content (e.g., video, picture) of the locally displayed screen to a preset electronic device (e.g., the screen projection mode is a resource projection mode). Illustratively, the electronic device 100 is playing a video, and the electronic device 100 may only transmit the locally displayed video to the electronic device 102.
Optionally, after the electronic device 100 establishes a connection with the preset electronic device, the electronic device 100 may detect whether the local device is playing video, playing audio, or displaying a picture, and the screen projection mode may be automatically switched between the resource projection mode and the mirror projection mode.
Optionally, after the electronic device 100 establishes a connection with the preset electronic device, the preset electronic device may display a part of content of a graphical user interface displayed by the electronic device 100 (for example, the screen projection mode may be a resource projection mode). Illustratively, the electronic device 100 displays the GUI4002, and after the electronic device 100 establishes a connection with the electronic device 102, the electronic device 102 displays only a video played on the GUI4002 of the electronic device 100, as shown in fig. 7B.
Optionally, after the electronic device 100 is connected to the preset electronic device, the preset electronic device displays part of the content (for example, the screen projection mode may be the resource projection mode) or all of the content (for example, the screen projection mode may be the mirror projection mode) of the graphical user interface of the electronic device 100, and the electronic device 100 may enter a remote controller mode. It should be understood that the remote controller mode means that the user controls the electronic device 102 via the electronic device 100, as shown in fig. 7C. For example, the user may control the play/pause of a video played by the electronic device 102 via a control (e.g., play/pause) on a virtual remote control of the electronic device 100.
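The remote controller mode can be pictured as the Source end sending small playback-control commands over the connection established in step 5. The command format below is purely an assumption.

```java
// Sketch of the remote-controller mode: the Source end sends playback
// control commands over an already-established connection. The JSON
// command format is an illustrative assumption.
import java.io.OutputStream;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class RemoteController {
    private final Socket connection; // connection established in step 5

    public RemoteController(Socket connection) {
        this.connection = connection;
    }

    // command: e.g., "play", "pause", "volume_up", "seek:90"
    public void send(String command) throws Exception {
        OutputStream out = connection.getOutputStream();
        out.write(("{\"control\":\"" + command + "\"}\n")
                .getBytes(StandardCharsets.UTF_8));
        out.flush();
    }
}
```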
Fig. 8 is an exemplary diagram of the electronic device 100 capturing an image by a camera, identifying the electronic device 102, and projecting audio played by the electronic device 100 to the electronic device 102 for playing.
The process of acquiring the image and identifying the electronic device 102 by the electronic device 100 may refer to the technical solutions of the embodiments shown in fig. 7A to 7C, and the process of establishing the connection between the electronic device 100 and the electronic device 102 may also refer to the technical solutions of the embodiments shown in fig. 7A to 7C, which are not described herein again.
Optionally, after the electronic device 100 establishes a connection with the electronic device 102, the audio may be transmitted to the electronic device 102; after transmitting the audio, the electronic device 100 may stop playing it to save power consumption, as shown in fig. 8. In some embodiments, the electronic device 100 may display audio controls (e.g., fast forward, fast reverse, play/pause, and volume adjustment controls) on its screen, through which the user can control the play/pause and other aspects of the audio on the electronic device 102 via the electronic device 100.
By using the camera of the electronic device to collect an image and recognize another electronic device so as to trigger screen projection, the user can interact directly with the device to be shared with when sharing content between different electronic devices. This solves the problem that a user who does not know a device's name cannot find the shared device, and makes cross-device content sharing more natural. In addition, it solves the problem of one-step sharing at short or long distance: content sharing can be achieved without walking up to the shared device.
The embodiment shown in fig. 9 is a schematic diagram of the electronic device 100 capturing and recognizing an image of the electronic device 102 through a camera (e.g., a rear camera), thereby receiving a screen projection from the electronic device 102.

For example, the electronic device 102 is playing a video. The electronic device 100 may capture an image of the electronic device 102 via a camera (e.g., a rear camera), recognize it, search for and discover the electronic device 102, establish a connection, and project the content of the electronic device 102 to the local display.
It should be understood that the methods for identifying the electronic device 102, searching the electronic device 102, and establishing the connection with the electronic device 102 described herein may refer to the related contents of fig. 7A-7C, and are not described herein again.
It should also be understood that after the content of the electronic device 102 is projected to the electronic device 100, the relevant content of fig. 7A-7C may be referred to in the screen projection manner, and will not be described herein again.
FIG. 12 is a schematic flow chart diagram illustrating a method of screen projection in some embodiments.
1201, playing multimedia content by first electronic equipment, and operating a camera of the first electronic equipment in a low power consumption mode;
illustratively, as shown in fig. 7A, a mobile phone is playing a video, and an ultra-low power consumption camera mounted on the mobile phone operates in a low power consumption mode.
Optionally, the video that the mobile phone is playing may be a locally stored video file, or may be a video played online.
Optionally, the camera is a front camera.
Optionally, the camera may periodically acquire images.
Optionally, the frame rate of the images acquired by the camera is not greater than 10 frames per second.
1202, a camera of the first electronic device collects at least one first image frame in a low power consumption operation mode;
for example, as shown in fig. 7A, the camera of the mobile phone may capture a low resolution image in a low power consumption mode. For example, when a user is holding a cell phone to watch a video, the first image frame captured at this time may include a human face.
1203, when it is detected that the at least one first image frame includes a preset object, operating a camera of the first electronic device in a normal mode, and acquiring at least one second image frame;
for example, as shown in fig. 7A, the mobile phone captures an image of a television through a camera. The mobile phone can perform image recognition in the acquired image, judge whether the acquired image is a preset object (for example, a user can set the preset object as electronic equipment), and start the normal mode to acquire the second image frame when the mobile phone detects that the image contains the preset object.
Optionally, in the normal mode, the camera of the mobile phone may acquire an image at a high frame rate, where the high frame rate is greater than an image sampling frame rate in the low power consumption mode.
Optionally, the normal mode may be a high resolution image capturing mode of a mobile phone camera. The camera of the mobile phone can start a high-resolution mode and acquire at least one second image frame, wherein the resolution of the second image frame is higher than that of the first image frame.
1204, when detecting that the preset object is a preset second electronic device, establishing a connection between the first electronic device and the second electronic device;
illustratively, as shown in fig. 7A, the mobile phone detects that the television in the image is a preset electronic device, and establishes a connection with the television.
Optionally, the mobile phone may recognize that the television in the image is the preset electronic device through an image recognition algorithm.
Optionally, the mobile phone may establish a connection with the television through protocols such as Miracast, DLNA, Cast+, AirPlay, and Chromecast.
Optionally, before the mobile phone establishes a connection with the television, the mobile phone may search for electronic devices that can receive a screen projection, and then look for the television among them.

Optionally, before the mobile phone establishes a connection with the television, the mobile phone may search for electronic devices that can receive a screen projection, and then look for the television among them through device information (e.g., device name, device identifier, etc.).
Optionally, the mobile phone may be connected to the television through a local area network;
optionally, the mobile phone may be connected to the television set in a WiFi direct connection manner;
optionally, the mobile phone may be connected to the television via bluetooth;
optionally, before the mobile phone is connected with the television, a connection request may be sent to the television;
1205, sending the multimedia file to the second electronic device, so that the second electronic device plays the multimedia content.
For example, as shown in fig. 7A to 7C, after the mobile phone establishes a connection with the television, the mobile phone sends a video played by the mobile phone to the television, and the television plays the video.
Optionally, after the mobile phone is connected with the television, the screen projection mode may be mirror image screen projection. For example, after the mobile phone sends the video played by the mobile phone to the television, the pictures displayed by the mobile phone and the television are the same, and the television plays the video.
Optionally, after the mobile phone is connected with the television, the screen projection mode may be resource screen projection. For example, after the mobile phone is connected with the television, the mobile phone sends the video played by the mobile phone to the television, the television plays the video, the mobile phone does not play the video, the mobile phone is in a remote controller mode, and the user can control the pause of the video played by the television through the mobile phone.
Optionally, after the mobile phone is connected with the television, the screen projection mode can be dynamically switched between resource screen projection and mirror image screen projection.
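Pulling 1201 through 1205 together, the following self-contained Java sketch shows the overall control flow. Every interface in it is an illustrative stand-in for the components described above, not an API defined by this document.

```java
// Self-contained sketch of the flow 1201-1205; all interfaces are assumed.
public class ScreenProjectionFlow {
    interface Camera { float[] captureLowRes(); float[] captureHighRes(); }
    interface Recognizer { String classify(float[] frame); }
    interface Presets { boolean contains(String label); }
    interface Sink { void play(String mediaUri); }

    public static void run(Camera camera, Recognizer recognizer,
                           Presets presets, Sink sink, String mediaUri) {
        // 1201-1202: the camera runs in low power mode while media plays.
        String coarseLabel = recognizer.classify(camera.captureLowRes());

        // 1203: a candidate electronic device appears, so switch the
        // camera to normal mode and classify a high-resolution frame.
        if (!coarseLabel.equals("electronic_device")) return;
        String label = recognizer.classify(camera.captureHighRes());

        // 1204: connect only if the recognized object is a preset device.
        if (presets.contains(label)) {
            // 1205: after the connection is established, send the media
            // so the Sink end plays the multimedia content.
            sink.play(mediaUri);
        }
    }
}
```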
FIG. 13 is a schematic flow chart diagram illustrating another method of screen projection in some embodiments.
1301, operating a camera of the first electronic device in a low power consumption mode;
the camera of the first electronic device operates in the low power consumption mode, and the contents of 1201 in fig. 12 may be referred to, and are not described herein again.
1302, a camera of the first electronic device collects at least one first image frame in a low power consumption operation mode;
specific contents may refer to those in 1202 in FIG. 12, and are not described herein again.
1303, when it is detected that the at least one first image frame includes a preset object, operating a camera of the first electronic device in a normal mode, and acquiring at least one second image frame;
for details, reference may be made to the contents in embodiment 1203 in fig. 12, which is not described herein again.
1304, when the preset object is detected to be a preset second electronic device, the first electronic device establishes connection with the second electronic device;
for details, reference may be made to the contents of 1204 in the embodiment of fig. 12, which are not described herein again.
1305, the first electronic device receives a multimedia file sent by the second electronic device, where the content of the multimedia file is the multimedia content being played by the second electronic device;
illustratively, the mobile phone may send the television a request for the multimedia content being played, and the television, after receiving the request, sends that content to the mobile phone.
1306, the first electronic device plays the multimedia file.
Exemplarily, after receiving the multimedia file sent by the television, the mobile phone plays it.
Optionally, the mobile phone may play the multimedia file while still receiving it from the television, i.e., play frames as they arrive.
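Illustratively, playing the multimedia file while it is still being received can be sketched with a producer-consumer buffer, as below; the chunk-based transport and the player callback are assumptions for illustration only.

```kotlin
import java.util.concurrent.LinkedBlockingQueue
import java.util.concurrent.TimeUnit
import kotlin.concurrent.thread

// Plays multimedia data while it is still being received, so playback does not
// have to wait for the whole file; transport and decoding are abstracted away.
fun playWhileReceiving(receiveChunk: () -> ByteArray?, playChunk: (ByteArray) -> Unit) {
    val buffer = LinkedBlockingQueue<ByteArray>()
    val receiver = thread {
        while (true) {
            val chunk = receiveChunk() ?: break   // null marks the end of the stream
            buffer.put(chunk)
        }
    }
    // Drain the buffer until the receiver has finished and nothing is left.
    while (receiver.isAlive || buffer.isNotEmpty()) {
        buffer.poll(100, TimeUnit.MILLISECONDS)?.let(playChunk)
    }
}
```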
Fig. 14 is a schematic flowchart of a screen projection system in some embodiments.
1401, a camera of the first electronic device operates in a low power consumption mode;
for details, refer to 1201 in the embodiment of Fig. 12; they are not repeated here.
1402, the camera of the first electronic device collects at least one first image frame in the low power consumption operation mode;
for details, refer to 1202 in the embodiment of Fig. 12; they are not repeated here.
1403, when it is detected that the at least one first image frame includes a preset object, the camera of the first electronic device operates in a normal mode and collects at least one second image frame;
for details, refer to 1203 in the embodiment of Fig. 12; they are not repeated here.
1404, when the preset object is detected to be a preset second electronic device, the first electronic device establishes a connection with the second electronic device;
for details, refer to 1204 in the embodiment of Fig. 12; they are not repeated here.
1405, the second electronic device sends the multimedia content being played to the first electronic device;
for details, refer to 1205 in the embodiment of Fig. 12; they are not repeated here.
1406, the first electronic device plays the multimedia content.
For details, refer to 1206 in the embodiment of Fig. 12; they are not repeated here.
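Illustratively, putting steps 1401-1406 together, the behavior of the first electronic device in the system of Fig. 14 can be sketched end to end as follows; every type and callback here is a hypothetical placeholder restating the flow, not a prescribed interface.

```kotlin
// Hypothetical end-to-end flow on the first electronic device's side.
interface ProjectionPeer {
    fun connect(): Boolean                       // e.g., a P2P connection
    fun currentlyPlayingContent(): ByteArray?    // multimedia content being played
}

fun runProjectionFlow(
    collectLowPowerFrame: () -> ByteArray,       // 1401/1402: low power consumption mode
    collectNormalFrame: () -> ByteArray,         // 1403: normal mode, second image frame
    containsPresetObject: (ByteArray) -> Boolean,
    identifyPresetDevice: (ByteArray) -> ProjectionPeer?,
    play: (ByteArray) -> Unit
) {
    while (!containsPresetObject(collectLowPowerFrame())) {
        // 1402: keep collecting first image frames in the low power consumption mode
    }
    // 1403/1404: collect a second image frame and identify the preset second device.
    val peer = identifyPresetDevice(collectNormalFrame()) ?: return
    // 1404: establish the connection between the two devices.
    if (!peer.connect()) return
    // 1405/1406: receive the content being played and play it locally.
    peer.currentlyPlayingContent()?.let(play)
}
```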
An embodiment of the present invention further provides a computer storage medium storing computer instructions; when the computer instructions run on an electronic device, the electronic device is caused to execute the relevant method steps to implement the screen projection method in the above embodiments.
An embodiment of the present invention further provides a computer program product; when the computer program product runs on a computer, the computer is caused to execute the relevant method steps to implement the screen projection method in the above embodiments.
In addition, an embodiment of the present invention further provides an apparatus, which may specifically be a chip, a component, or a module; the apparatus may include a processor and a memory connected to each other. When the apparatus runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip executes the screen projection method in the above method embodiments.
The electronic device, computer storage medium, computer program product, or chip provided in the embodiments of the present invention is configured to execute the corresponding method provided above; for the beneficial effects it achieves, refer to the beneficial effects of the corresponding method provided above, which are not repeated here.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and brevity of description, the division into the above functional modules is only used as an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above.
In the embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the described apparatus embodiments are merely illustrative: the division into modules or units is only a division by logical function, and other division manners are possible in actual implementation; for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, which may be located in one place or distributed to a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a readable storage medium. Based on such an understanding, the technical solution of the embodiments of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above descriptions are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or substitution that a person skilled in the art could readily conceive of within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (27)

1. A method of screen projection, the method comprising:
playing, by a first electronic device, multimedia content, wherein a camera of the first electronic device operates in a low power consumption mode;
the camera of the first electronic device collects at least one first image frame in the low power consumption operation mode;
when it is detected that the at least one first image frame includes a preset object, the camera of the first electronic device operates in a normal operating mode and collects at least one second image frame;
when the preset object is detected to be a preset second electronic device, the first electronic device establishes a connection with the second electronic device and sends the multimedia content to the second electronic device, so that the second electronic device plays the multimedia content.
2. The method of claim 1, wherein the camera of the first electronic device comprises:
a front-facing camera.
3. The method of claim 1 or 2, wherein the camera of the first electronic device operating in a normal operating mode comprises:
the camera of the first electronic device collecting image frames in a high-resolution mode.
4. The method of any of claims 1-3, wherein the camera of the first electronic device collecting at least one first image frame in the low power consumption operation mode comprises:
the camera collecting the at least one first image frame at a frame rate of not more than 10 frames per second in the low power consumption operation mode.
5. The method according to any one of claims 1-4, wherein the detecting that the preset object is a preset second electronic device comprises:
and carrying out image recognition on the acquired image, and recognizing that the object in the acquired image is a preset second electronic device.
6. The method of claim 5, wherein the performing image recognition on the collected image and recognizing that the object in the collected image is the preset second electronic device further comprises:
matching the image recognition result against an electronic device mapping table, and matching the image recognition result to the preset second electronic device.
7. The method of claim 6, wherein the electronic device mapping table comprises:
an image of the preset second electronic device and device information of the preset second electronic device, wherein the device information includes a name of the second electronic device.
8. The method of any of claims 1-7, wherein, before the camera of the first electronic device collects at least one first image frame in the low power consumption operation mode, the method further comprises:
sending, by a second processor of the first electronic device, a first message to a first processor of the first electronic device to control the camera to collect images.
9. The method of any of claims 1-8, wherein prior to the first electronic device establishing a connection with the second electronic device, the method further comprises:
the first electronic device searches for and discovers the second electronic device.
10. The method of claim 9, wherein the first electronic device searching for and discovering the second electronic device comprises one or more of the following:
discovering a third electronic device through local area network broadcast, and comparing device information of the second electronic device and the third electronic device; or
discovering the third electronic device through Bluetooth, and comparing device information of the second electronic device and the third electronic device; or
discovering the third electronic device through WiFi direct connection, and comparing device information of the second electronic device and the third electronic device.
11. The method of any of claims 1-10, wherein the first electronic device establishing a connection with the second electronic device comprises:
the first electronic device establishes a P2P connection with the second electronic device.
12. The method of any of claims 1-11, wherein the multimedia content comprises:
at least one of video, audio, or pictures.
13. An electronic device supporting screen projection, the electronic device comprising:
a camera having a low power consumption mode and a normal operating mode;
at least one processor; and
a memory storing instructions that, when executed by the at least one processor, cause the electronic device to:
play multimedia content;
control the camera to collect at least one first image frame in the low power consumption operation mode;
when it is detected that the at least one first image frame includes a preset object, control the camera to operate in the normal operating mode and collect at least one second image frame; and
when the preset object is detected to be a preset device, establish a connection with the preset device and send the multimedia content to the preset device, so that the preset device plays the multimedia content.
14. The electronic device of claim 13, wherein the camera is a front-facing camera of the electronic device.
15. The electronic device of claim 13 or 14, wherein the electronic device controlling the camera to operate in the normal operating mode comprises the electronic device performing the following step:
controlling the camera to collect image frames in a high-resolution mode.
16. The electronic device of any of claims 13-15, wherein the electronic device controlling the camera to collect at least one first image frame in the low power consumption operation mode comprises the electronic device performing the following step: controlling a frame rate at which the camera collects the at least one first image frame in the low power consumption operation mode to be not more than 10 frames per second.
17. The electronic device of any of claims 13-16, wherein the electronic device detecting that the preset object is a preset device comprises the electronic device performing the following step: performing image recognition on the collected image and recognizing that the object in the collected image is the preset device.
18. The electronic device of claim 17, wherein, in performing image recognition on the collected image and recognizing that the object in the collected image is the preset device, the electronic device further performs the following step:
matching the image recognition result against an electronic device mapping table, and matching the image recognition result to the preset device.
19. The electronic device of claim 18, wherein the electronic device mapping table comprises: an image of the preset device and device information of the preset device, wherein the device information includes a name of the preset device.
20. The electronic device of any of claims 13-19, wherein, before the electronic device controls the camera to collect at least one first image frame in the low power consumption operation mode, the electronic device performs the following step: a second processor of the electronic device sends a first message to a first processor of the electronic device to control the camera to collect an image.
21. The electronic device of any of claims 13-20, wherein, before the electronic device establishes a connection with the preset device, the electronic device further performs the following step: the electronic device searches for and discovers the preset device.
22. The electronic device of any of claims 13-21, wherein the establishing a connection with the preset device comprises the electronic device performing the following step: the electronic device establishes a P2P connection with the preset device.
23. The electronic device of any of claims 13-22, wherein the playing multimedia content comprises: the electronic device plays at least one of video, audio, or pictures.
24. A method of screen projection, the method comprising:
a camera of a first electronic device operates in a low power consumption mode;
the camera of the first electronic device collects at least one first image frame in the low power consumption operation mode;
when it is detected that the at least one first image frame includes a preset object, the camera of the first electronic device operates in a normal operating mode and collects at least one second image frame;
when the preset object is detected to be a preset second electronic device, the first electronic device establishes a connection with the second electronic device;
the first electronic device receives a multimedia file sent by the second electronic device, wherein the content of the multimedia file is the multimedia content being played by the second electronic device; and
the first electronic device plays the multimedia file.
25. A screen projection system, comprising a first electronic device and a second electronic device, wherein:
a camera of the first electronic device operates in a low power consumption mode;
the camera of the first electronic device collects at least one first image frame in the low power consumption operation mode;
when it is detected that the at least one first image frame includes a preset object, the camera of the first electronic device operates in a normal operating mode and collects at least one second image frame;
when the preset object is detected to be a preset second electronic device, the first electronic device establishes a connection with the second electronic device;
the second electronic device sends the multimedia content being played to the first electronic device; and
the first electronic device plays the multimedia content.
26. A computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform the method of screen projection of any of claims 1-12 or 24.
27. A computer program product, wherein, when the computer program product runs on a computer, the computer is caused to carry out the method of screen projection of any one of claims 1-12 or 24.
CN202110061435.7A 2021-01-18 2021-01-18 Screen projection method and electronic equipment Pending CN114860178A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110061435.7A CN114860178A (en) 2021-01-18 2021-01-18 Screen projection method and electronic equipment
PCT/CN2022/071643 WO2022152174A1 (en) 2021-01-18 2022-01-12 Screen projection method and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110061435.7A CN114860178A (en) 2021-01-18 2021-01-18 Screen projection method and electronic equipment

Publications (1)

Publication Number Publication Date
CN114860178A (en)

Family

ID=82446935

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110061435.7A Pending CN114860178A (en) 2021-01-18 2021-01-18 Screen projection method and electronic equipment

Country Status (2)

Country Link
CN (1) CN114860178A (en)
WO (1) WO2022152174A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116048245A (en) * 2022-08-09 2023-05-02 Honor Device Co Ltd Control method and device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5766625B2 (en) * 2012-01-16 2015-08-19 株式会社東芝 Camera device
JP6304145B2 (en) * 2015-06-30 2018-04-04 京セラドキュメントソリューションズ株式会社 Information processing apparatus and image forming apparatus setting condition designation method
CN110517034A (en) * 2018-05-22 2019-11-29 维沃移动通信有限公司 A kind of object identifying method and mobile terminal
JP2020008750A (en) * 2018-07-10 2020-01-16 セイコーエプソン株式会社 Display unit and display processing method
CN110109636B (en) * 2019-04-28 2022-04-05 华为技术有限公司 Screen projection method, electronic device and system
CN111901896A (en) * 2020-07-14 2020-11-06 维沃移动通信有限公司 Information sharing method, information sharing device, electronic equipment and storage medium

Also Published As

Publication number Publication date
WO2022152174A1 (en) 2022-07-21
WO2022152174A9 (en) 2022-10-20

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination