WO2022213937A1 - Control method for a wearable device and electronic device
- Publication number
- WO2022213937A1 (PCT/CN2022/085118)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- diopter
- wearable device
- state
- optical lens
- lens group
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements for lenses
- G02B7/04—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
- G02B7/08—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted to co-operate with a remote control mechanism
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B25/00—Eyepieces; Magnifying glasses
- G02B25/001—Eyepieces
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/02—Mountings, adjusting means, or light-tight connections, for optical elements for lenses
- G02B7/04—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification
- G02B7/09—Mountings, adjusting means, or light-tight connections, for optical elements for lenses with mechanism for focusing or varying magnification adapted for automatic focusing or varying magnification
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/163—Wearable computers, e.g. on a belt
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
Definitions
- the embodiments of the present application relate to the field of wearable devices, and in particular, to a control method of a wearable device and an electronic device.
- the optical imaging module in a virtual reality (VR) helmet device generally consists of a display screen and an optical lens group. As shown in Figure 1, when the device is in normal use, the light emitted by the display screen (i.e., the screen) is refracted by the optical lens group and then enters the human eye, so that the user can see the picture displayed on the screen in the VR headset.
- the optical lenses in the device are exposed to the environment.
- the optical lens group in this type of equipment has the effect of converging light
- part of the ambient light will enter the optical imaging module through the human eye side of the optical lens group, and then focus on the display screen after passing through the lens group.
- resulting in a large amount of energy concentrated at the focal point of the light on the screen, which degrades the display effect of the virtual reality helmet device and may even damage the screen so that it can no longer display a picture.
- a method adopted in the prior art is to add a polarizing device.
- based on the polarization principle, a polarizing device is added whose polarization direction is consistent with that of the screen light in the virtual reality helmet device.
- because the polarizing device transmits light in only a single polarization direction, it attenuates the ambient light entering the virtual reality helmet device while letting the polarized light emitted by the screen pass through, reducing the intensity of the ambient light focused on the screen without affecting the display effect, and thereby preventing the screen of the virtual reality headset from being sunburned.
- however, the transmittance of a polarizer for light in its own polarization direction does not exceed 90%, so using a polarizer reduces the display brightness of the VR headset. Moreover, if the polarization direction of the polarizer is inconsistent with that of the screen light, the display brightness is reduced further, so this solution requires high assembly accuracy.
- the present application provides a control method of a wearable device and an electronic device.
- the electronic device can adjust the diopter of the optical lens group to enlarge the spot formed on the screen of the wearable device by ambient light focused through the lens, dispersing the light's energy and thereby preventing the screen from being sunburned.
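- the geometry behind this can be sketched with a simple thin-lens model; the helper below and all of its numbers (aperture, screen distance, diopter values) are illustrative assumptions, not figures from the application:

```python
# Hedged illustration (not part of the patent): a thin-lens model of why
# raising the diopter spreads ambient light over a larger spot on the screen.
# All numeric values below are assumed for the example.

def spot_diameter_mm(aperture_mm: float, screen_distance_mm: float,
                     diopter: float) -> float:
    """Approximate defocus-spot diameter on the screen for collimated
    ambient light passing through a thin lens of the given diopter.

    Collimated light focuses at f = 1000 / diopter (in mm); if the screen
    sits at screen_distance_mm behind the lens, similar triangles give the
    blur-spot size.
    """
    focal_mm = 1000.0 / diopter
    return aperture_mm * abs(screen_distance_mm - focal_mm) / focal_mm

# With a 40 mm aperture and the screen 50 mm behind the lens:
# at 20 D the focus (50 mm) lands exactly on the screen (spot ~ 0, burn risk);
# at 25 D the focus moves to 40 mm, spreading the light over a ~10 mm spot.
print(spot_diameter_mm(40, 50, 20))  # 0.0
print(spot_diameter_mm(40, 50, 25))  # 10.0
```

under this model, any increase in diopter that pulls the focal plane away from the screen grows the spot and lowers the energy density at any one point.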
- an embodiment of the present application provides a control method for a wearable device.
- the method includes: acquiring the wearing state of the wearable device; and when the wearable device changes from the wearing state to the non-wearing state, adjusting a first diopter of an optical lens group of the wearable device to a second diopter, wherein the second diopter is greater than the first diopter.
- the electronic device can detect the wearing state of the wearable device and, upon detecting that the wearable device is not worn, increase the diopter of the optical lens group of the wearable device to enlarge the spot formed on the screen by ambient light focused through the lens, dispersing the light's energy and thereby preventing the screen from being sunburned.
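- the claimed steps can be sketched as a small state handler; the class name, the wear-state callback, and the diopter values are hypothetical placeholders, since the application does not specify concrete APIs:

```python
# Hypothetical sketch of the claimed control method; the API names and
# diopter values are placeholders, not taken from the application.

class SunProtectionController:
    def __init__(self, first_diopter: float = 2.0, second_diopter: float = 5.0):
        self.worn = True                      # assume the device starts worn
        self.diopter = first_diopter          # first diopter (normal use)
        self.second_diopter = second_diopter  # second diopter (> first)

    def on_wear_state(self, worn_now: bool) -> None:
        """Acquire the wearing state; on a wearing -> non-wearing
        transition, adjust the first diopter to the second diopter."""
        if self.worn and not worn_now:
            self.diopter = self.second_diopter  # enter sun-protection mode
        self.worn = worn_now

ctrl = SunProtectionController()
ctrl.on_wear_state(False)   # device taken off
print(ctrl.diopter)         # 5.0
```

the key design point is that the adjustment is triggered only by the wearing-to-unworn transition, not by the unworn state itself.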
- the electronic device may be the wearable device itself, or a chip in the wearable device. It may also be an external device, such as a computer or a mobile phone connected to the wearable device, or a chip of such a device.
- the embodiments of the present application can implement the sun protection mode without changing the structure of the existing wearable device.
- the automatic sun protection function is realized by automatically detecting the wearing state and adjusting the diopter accordingly.
- an optical lens group may include one or more lenses.
- the lenses are optionally optical lenses such as spherical lenses, aspherical lenses or Fresnel lenses.
- the lens may be plastic or glass, which is not limited in this application.
- the method further includes: acquiring the wearing state of the wearable device; and adjusting the second diopter of the optical lens group of the wearable device to the first diopter when the wearable device changes from the unworn state to the wearing state.
- the electronic device can detect the wearing state of the wearable device, and when the wearable device returns from the unworn state to the wearing state, it can automatically restore the diopter of the optical lens group to its value before the change, so that when the user wears the wearable device again, the sun protection mode is automatically released and the user can see a clear picture displayed on the screen.
- the first diopter may be set by the user during use, or may be factory-set by the wearable device.
- the second diopter is the maximum diopter attainable by the optical lens group.
- the electronic device can thus disperse the light reaching the screen to the greatest extent, preventing the screen from being sunburned.
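- the save-and-restore behavior described above can be sketched as follows; the class and its diopter values (user setting, lens-group maximum) are assumed for illustration, not taken from the application:

```python
# Hypothetical sketch: remember the user's diopter before entering sun
# protection, then restore it when the device is worn again. Values are
# assumed, not from the application.

class DiopterManager:
    def __init__(self, user_diopter: float = 2.0, max_diopter: float = 5.0):
        self.current = user_diopter     # first diopter (user- or factory-set)
        self.max_diopter = max_diopter  # second diopter: lens-group maximum
        self._saved = None

    def enter_sun_protection(self) -> None:
        self._saved = self.current      # remember the value before the change
        self.current = self.max_diopter # disperse ambient light maximally

    def exit_sun_protection(self) -> None:
        if self._saved is not None:
            self.current = self._saved  # user sees a clear picture again
            self._saved = None

m = DiopterManager()
m.enter_sun_protection()
print(m.current)  # 5.0
m.exit_sun_protection()
print(m.current)  # 2.0
```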
- adjusting the first diopter of the optical lens group of the wearable device to the second diopter includes: when the wearable device changes from the wearing state to the non-wearing state, obtaining a first state of a focusing module of the wearable device, wherein the focusing module is used to control the diopter of the optical lens group, and when the focusing module is in the first state, the diopter of the optical lens group is the first diopter; and adjusting the focusing module to a second state, wherein when the focusing module is in the second state, the diopter of the optical lens group is the second diopter.
- the electronic device can adjust the diopter of the optical lens group by controlling the focusing module.
- the first state of the focusing module is the current state described in the following embodiments.
- the second state of the focusing module is the maximum diopter state described in the following embodiments.
- adjusting the first diopter of the optical lens group of the wearable device to the second diopter includes: when the wearable device changes from the wearing state to the non-wearing state and remains in the non-wearing state for a set period of time, adjusting the first diopter of the optical lens group of the wearable device to the second diopter.
- the electronic device confirms that the wearable device has remained in the unworn state for the set period of time before enabling the sun protection mode, which avoids frequently turning the sun protection mode on and off and reduces the power consumption of the device.
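- this hold-period check can be sketched as a simple debounce; the hold duration is an assumed value, since the application only speaks of a "set period of time":

```python
import time

# Hypothetical debounce sketch; the hold period is an assumed value.
UNWORN_HOLD_SECONDS = 0.2

def should_enter_sun_protection(unworn_since, now=None):
    """Engage sun protection only once the device has stayed unworn for
    the whole set period, avoiding rapid on/off toggling."""
    if unworn_since is None:      # device is currently worn
        return False
    now = time.monotonic() if now is None else now
    return now - unworn_since >= UNWORN_HOLD_SECONDS

t0 = time.monotonic()
print(should_enter_sun_protection(t0, now=t0 + 0.05))  # False: too soon
print(should_enter_sun_protection(t0, now=t0 + 0.30))  # True: held long enough
```

a monotonic clock is used here so the comparison is unaffected by wall-clock adjustments.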
- an embodiment of the present application provides an electronic device.
- the device includes: one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory and, when executed by the one or more processors, cause the electronic device to perform the following steps: acquiring the wearing state of the wearable device; and when the wearable device changes from the wearing state to the non-wearing state, adjusting the first diopter of the optical lens group of the wearable device to the second diopter, wherein the second diopter is greater than the first diopter.
- when the computer program is executed by the one or more processors, the electronic device is further caused to perform the following steps: acquiring the wearing state of the wearable device; and when the wearable device changes from the unworn state to the wearing state, adjusting the second diopter of the optical lens group to the first diopter.
- the second diopter is the maximum diopter attainable by the optical lens group.
- when the computer program is executed by the one or more processors, the electronic device is caused to perform the following steps: when the wearable device changes from the wearing state to the non-wearing state, obtaining the first state of the focusing module of the wearable device, wherein the focusing module is used to control the diopter of the optical lens group, and when the focusing module is in the first state, the diopter of the optical lens group is the first diopter; and adjusting the focusing module to the second state, wherein when the focusing module is in the second state, the diopter of the optical lens group is the second diopter.
- when the computer program is executed by the one or more processors, the electronic device is caused to perform the following step: when the wearable device changes from the wearing state to the non-wearing state and remains in the non-wearing state for a set period of time, adjusting the first diopter of the optical lens group of the wearable device to the second diopter.
- the second aspect and any implementation manner of the second aspect correspond to the first aspect and any implementation manner of the first aspect, respectively.
- for the technical effects corresponding to the second aspect and any implementation manner of the second aspect, reference may be made to the technical effects corresponding to the first aspect and any implementation manner of the first aspect, which will not be repeated here.
- embodiments of the present application provide a computer-readable medium for storing a computer program, where the computer program includes instructions for executing the method in the first aspect or any possible implementation manner of the first aspect.
- an embodiment of the present application provides a computer program, where the computer program includes instructions for executing the method in the first aspect or any possible implementation manner of the first aspect.
- an embodiment of the present application provides a chip, where the chip includes a processing circuit and a transceiver pin.
- the transceiver pins and the processing circuit communicate with each other through an internal connection path, and the processing circuit executes the method in the first aspect or any possible implementation manner of the first aspect to control the receive pin to receive signals and to control the send pin to send signals.
- an embodiment of the present application provides a control system for a wearable device, where the system includes the electronic device and the wearable device involved in the second aspect.
- FIG. 1 is a schematic structural diagram of an exemplary wearable device
- FIG. 2 is a schematic structural diagram of an exemplary wearable device
- FIG. 3 is a schematic diagram of a hardware structure of an exemplary electronic device
- FIG. 4 is a schematic diagram of the software structure of an exemplary electronic device
- FIG. 5 is a schematic flowchart of a control method for a wearable device provided by an embodiment of the present application
- FIG. 6 is a schematic structural diagram of an exemplary wearable device
- FIG. 7 is a schematic diagram of the position transformation of an exemplary focusing module
- FIG. 8 is a schematic diagram of the position transformation of an exemplary focusing module
- FIG. 9 is a schematic structural diagram of an apparatus provided by an embodiment of the present application.
- first and second in the description and claims of the embodiments of the present application are used to distinguish different objects, rather than to describe a specific order of the objects.
- the first target object, the second target object, etc. are used to distinguish different target objects, rather than to describe a specific order of the target objects.
- words such as “exemplary” or “for example” are used to represent examples, instances, or illustrations. Any embodiment or design described in the embodiments of the present application as “exemplary” or “for example” should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as “exemplary” or “for example” is intended to present the related concepts in a specific manner.
- multiple processing units refers to two or more processing units; multiple systems refers to two or more systems.
- FIG. 3 shows a schematic structural diagram of the electronic device 100 .
- the electronic device 100 shown in FIG. 3 is only an example of an electronic device; the electronic device 100 may have more or fewer components than those shown in the figure, may combine two or more components, or may have a different component configuration.
- the various components shown in Figure 3 may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
- the electronic device 100 shown in FIG. 3 may be the VR glasses in FIG. 1, or may be handle 1 and handle 2 in FIG. 1, which is not limited in this application.
- the electronic device 100 may include: a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headphone jack 170D, an electromagnetic module 180, a 6-axis IMU module 181, a key 190, an indicator 191, a camera 192, a display screen 193, a subscriber identification module (SIM) card interface 194, and the like.
- the processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
- the controller may be the nerve center and command center of the electronic device 100 .
- the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
- a memory may also be provided in the processor 110 for storing instructions and data.
- the memory in processor 110 is a cache memory. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and thereby improves the efficiency of the system.
- the processor 110 may include one or more interfaces.
- the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
- the processor 110, the external memory interface 120, and the internal memory 121 may also be provided in an external device.
- wearable devices can be connected to computers, tablets, mobile phones and other devices through cables.
- the processor 110, the external memory interface 120, and the internal memory 121 may be processors and memories in an external device. That is to say, the processor of the external device may implement the steps executed by the processor described in the embodiments of the present application.
- the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
- the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
- the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the electronic device 100 .
- the electronic device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
- the charging management module 140 is used to receive charging input from the charger.
- the charger may be a wireless charger or a wired charger.
- the charging management module 140 may receive charging input from the wired charger through the USB interface 130 .
- the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100 . While the charging management module 140 charges the battery 142 , it can also supply power to the electronic device through the power management module 141 .
- the power management module 141 is used for connecting the battery 142 , the charging management module 140 and the processor 110 .
- the power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the internal memory 121, the external memory, the display screen 193, the camera 192, the wireless communication module 160, and the like.
- the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle times, battery health status (leakage, impedance).
- the power management module 141 may also be provided in the processor 110 .
- the power management module 141 and the charging management module 140 may also be provided in the same device.
- the wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
- Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
- the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
- the mobile communication module 150 may provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 100 .
- the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
- the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
- the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
- at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
- at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
- the modem processor may include a modulator and a demodulator.
- the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
- the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
- the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
- the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 193.
- the modem processor may be a separate device.
- the modem processor may be independent of the processor 110, and may be provided in the same device as the mobile communication module 150 or other functional modules.
- the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 100, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
- the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
- the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
- the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
- the antenna 1 of the electronic device 100 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
- the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
- the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
- the electronic device 100 implements a display function through the GPU, the display screen 193, the application processor, and the like.
- the GPU is a microprocessor for image processing and is connected to the display screen 193 and the application processor.
- the GPU is used to perform mathematical and geometric calculations for graphics rendering.
- Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
- the display screen 193 is used to display images, videos, and the like.
- the display screen 193 includes a display panel and an optical lens group.
- the display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like.
- the electronic device 100 may include one or N display screens 193 , where N is a positive integer greater than one.
- the optical lens group includes single or multiple optical lenses such as spherical lenses, aspherical lenses or Fresnel lenses.
- the electronic device 100 can realize the shooting function through the ISP, the camera 192, the video codec, the GPU, the display screen 193, and the application processor.
- the ISP is used to process the data fed back by the camera 192 .
- when the shutter is opened, light is transmitted through the lens to the camera's photosensitive element, where the optical signal is converted into an electrical signal; the photosensitive element then transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye.
- ISP can also perform algorithm optimization on image noise, brightness, and skin tone. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
- the ISP may be provided in the camera 192 .
- Camera 192 is used to capture still images or video.
- the object is projected through the lens to generate an optical image onto the photosensitive element.
- the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
- the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
- the ISP outputs the digital image signal to the DSP for processing.
- DSP converts digital image signals into standard RGB, YUV and other formats of image signals.
- the electronic device 100 may include 1 or N cameras 192 , where N is a positive integer greater than 1.
- a digital signal processor is used to process digital signals; in addition to digital image signals, it can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency-point energy, and so on.
- Video codecs are used to compress or decompress digital video.
- the electronic device 100 may support one or more video codecs.
- the electronic device 100 can play or record videos of various encoding formats, such as: Moving Picture Experts Group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4 and so on.
- the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100 .
- the external memory card communicates with the processor 110 through the external memory interface 120 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
- Internal memory 121 may be used to store computer executable program code, which includes instructions.
- the processor 110 executes various functional applications and data processing of the electronic device 100 by executing the instructions stored in the internal memory 121 .
- the internal memory 121 may include a storage program area and a storage data area.
- the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
- the storage data area may store data (such as audio data, phone book, etc.) created during the use of the electronic device 100 and the like.
- the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
- the electronic device 100 may implement audio functions, such as music playback and recording, through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like.
- the focusing module 180 includes a motor (including a motor driver and a motor body), a gear, and an adjustment link.
- the adjustment link is connected to one or more lenses in the optical lens group.
- the processor controls the motor driver to output a digital control signal; the motor rotates after receiving the signal and, driven through the gear, the link moves the lens (adjusting the lens spacing) to realize adjustment of the diopter of the optical lens group.
- the diopter described in the embodiments of the present application refers to the degree to which light is deflected from its propagation direction when it passes from one medium into another medium with a different optical density.
- the adjustment of the diopter can be achieved by controlling a certain lens or the spacing between certain lenses.
- the sensor 181 optionally includes, but is not limited to, an acceleration sensor, a distance sensor, a proximity light sensor, an ambient light sensor, and the like.
- the proximity light sensor may include, for example, light emitting diodes (LEDs) and light detectors, such as photodiodes.
- the light emitting diodes may be infrared light emitting diodes.
- the electronic device 100 emits infrared light to the outside through light emitting diodes.
- Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
- the electronic device 100 can use the proximity light sensor to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
- the proximity light sensor can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
- the proximity light sensor may also be used to detect whether the user is wearing a wearable device, such as a VR helmet.
- the keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
- the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 .
- the indicator 191 can be an indicator light, which can be used to indicate a charging state, a change in power, or a message, a missed call, a notification, and the like.
- the SIM card interface 194 is used to connect a SIM card.
- the SIM card can be brought into contact with or separated from the electronic device 100 by being inserted into or pulled out of the SIM card interface 194 .
- the electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
- the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
- the embodiments of the present application take the Android system with a layered architecture as an example to illustrate the software structure of the electronic device 100.
- the electronic device 100 may also use the Windows system or other systems, which is not limited in this application.
- FIG. 4 is a block diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
- the layered architecture of the electronic device 100 divides the software into several layers, and each layer has a clear role and division of labor. Layers communicate with each other through software interfaces.
- the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, an Android runtime (Android runtime) and a system library, and a kernel layer.
- the application layer can include a series of application packages.
- the application package may include applications such as Bluetooth, games, music, calendar, and WLAN.
- the application framework layer provides an application programming interface (application programming interface, API) and a programming framework for applications in the application layer.
- the application framework layer includes some predefined functions.
- the application framework layer may include window managers, content providers, view systems, telephony managers, resource managers, notification managers, and the like.
- a window manager is used to manage window programs.
- the window manager can get the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, etc.
- Content providers are used to store and retrieve data and make these data accessible to applications.
- the data may include video, images, audio, calls made and received, browsing history and bookmarks, phone book, etc.
- the view system includes visual controls, such as controls for displaying text, controls for displaying pictures, and so on. View systems can be used to build applications.
- a display interface can consist of one or more views.
- the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
- the phone manager is used to provide the communication function of the electronic device 100 .
- for example, management of call status (including connecting, hanging up, etc.).
- the resource manager provides various resources for the application, such as localization strings, icons, pictures, layout files, video files, etc.
- the notification manager enables applications to display notification information in the status bar, which can be used to convey notification-type messages, and can disappear automatically after a brief pause without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
- the notification manager can also display notifications in the status bar at the top of the system in the form of graphs or scroll bar text, such as notifications of applications running in the background, and notifications on the screen in the form of dialog windows. For example, text information is prompted in the status bar, a prompt sound is issued, the electronic device vibrates, and the indicator light flashes.
- Android Runtime includes core libraries and a virtual machine. Android runtime is responsible for scheduling and management of the Android system.
- the core library consists of two parts: one part is the performance functions that the Java language needs to call, and the other part is the core library of Android.
- the application layer and the application framework layer run in virtual machines.
- the virtual machine executes the java files of the application layer and the application framework layer as binary files.
- the virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, safety and exception management, and garbage collection.
- a system library can include multiple functional modules. For example: surface manager (surface manager), media library (Media Libraries), 3D graphics processing library (eg: OpenGL ES), 2D graphics engine (eg: SGL), etc.
- the Surface Manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
- the media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files.
- the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
- the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, compositing, and layer processing.
- 2D graphics engine is a drawing engine for 2D drawing.
- the kernel layer is the layer between hardware and software.
- the kernel layer contains at least display driver, camera driver, audio driver, Bluetooth driver and Wi-Fi driver.
- the components or modules included in the application layer, application framework layer, system library and runtime layer, and kernel layer shown in FIG. 4 do not constitute a specific limitation on the electronic device 100 .
- the electronic device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
- FIG. 5 is a schematic flowchart of a method for controlling a wearable device according to an embodiment of the present application. Please refer to Figure 5, which includes:
- a sensor detects a wearing state of the wearable device.
- the wearable device may include a proximity light sensor.
- the proximity light sensor optionally includes a transmitting unit, a receiving unit, and a calculation unit; the transmitting unit is used to transmit a detection signal (for example, the transmitting unit in an infrared proximity light sensor can transmit detection infrared rays), and the receiving unit is used to receive the signal reflected from the target object.
- the calculation unit calculates the time difference between the moment the signal is transmitted by the transmitting unit and the moment it is received by the receiving unit, and the distance between the sensor and the target object is then calculated according to the propagation speed of the signal.
- the proximity light sensor can detect whether the user wears the wearable device based on the above principles.
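The time-of-flight principle described above can be sketched in a few lines. The function names, the propagation speed constant, and the 5 cm wearing threshold below are illustrative assumptions, not values taken from the patent:

```python
# Hypothetical sketch of the proximity sensor's distance computation:
# the signal travels to the target and back, so
# distance = propagation_speed * time_difference / 2.
SPEED_OF_LIGHT_M_S = 299_792_458.0  # approximate propagation speed of infrared light


def distance_from_time_of_flight(t_emit_s: float, t_receive_s: float,
                                 speed_m_s: float = SPEED_OF_LIGHT_M_S) -> float:
    """Distance between the sensor and the target from round-trip timing."""
    round_trip_s = t_receive_s - t_emit_s
    return speed_m_s * round_trip_s / 2.0


def is_worn(distance_m: float, threshold_m: float = 0.05) -> bool:
    """Treat the device as worn when the reflecting object is close enough
    (the 5 cm threshold is an assumed value)."""
    return distance_m <= threshold_m
```

An ultrasonic or laser variant would use the same computation with a different propagation speed.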
- an infrared proximity light sensor is used as an example for description. In other embodiments, the wearing condition of the wearable device can also be detected by other sensors, such as an ultrasonic proximity sensor or a laser proximity sensor, which is not limited in this application.
- the proximity light sensor may detect periodically, and based on the detection signal received in the current period, it is determined that the wearable device is worn by the user.
- the proximity light sensor may also detect the wearing condition in real time, and this application only takes periodic detection as an example for description.
- the proximity light sensor may send a user wearing indication to the processor, which is used to indicate that the user is wearing the wearable device.
- the proximity light sensor can also check whether the wearable device was in the worn state in the previous cycle. If the wearable device was already worn in the previous cycle, the proximity light sensor does not need to send any indication. If the wearable device was not worn in the previous cycle, the proximity light sensor sends a user wearing indication to the processor. That is to say, the proximity light sensor can send the user wearing indication to the processor only when the wearable device changes from the unworn state to the worn state, thereby reducing the number of interactions between the sensor and the processor.
- the proximity light sensor determines that the wearable device is not worn by the user based on the detection signal received in the current period.
- the proximity light sensor may send the user non-wearing indication information to the processor, which is used to indicate that the user is not wearing the wearable device, that is, S102 is performed.
- the proximity light sensor can also check whether the wearable device was in the worn state in the previous cycle. If the wearable device was not worn in the previous cycle, the proximity light sensor does not need to send any indication; if the wearable device was worn in the previous cycle, the proximity light sensor sends a user non-wearing indication to the processor. That is to say, the proximity light sensor can send the user non-wearing indication to the processor only when the wearable device changes from the worn state to the unworn state, which reduces the number of interactions between the sensor and the processor.
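The edge-triggered reporting described above — send an indication to the processor only when the worn/unworn state differs from the previous cycle — can be sketched as follows. The class and message names are illustrative assumptions, not from the patent:

```python
class WearStateNotifier:
    """Periodic polling with edge detection: an indication is sent to the
    processor only when the wearing state differs from the previous cycle,
    reducing sensor-to-processor interactions."""

    def __init__(self) -> None:
        self.prev_worn = None   # state observed in the previous cycle (None = no cycle yet)
        self.sent = []          # stands in for messages sent to the processor

    def on_period(self, worn: bool) -> None:
        # Send only on a state change (the very first observation also counts
        # as a change, since there is no previous cycle to compare against).
        if worn != self.prev_worn:
            self.sent.append("WEARING" if worn else "NOT_WEARING")
        self.prev_worn = worn
```

Steady-state periods (worn stays worn, or unworn stays unworn) produce no traffic at all.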
- the sensor sends an indication that the user is not wearing the device to the processor.
- the processor records the current state of the focusing module.
- FIG. 6 is a schematic structural diagram of an exemplary wearable device.
- the focusing module optionally includes a motor (not shown in the figure), a gear 601 and an adjusting link 602 .
- the motor in the embodiment of the present application is optionally a stepping motor or a servo motor.
- the motor controls the gear 601 to achieve a specific angular displacement by receiving the digital control signal from the motor driver, wherein the stepper motor is controlled by the number and frequency of pulses of the digital control signal, and the servo motor is controlled by the pulse duration of the digital control signal.
- when the processor records the current state of the focusing module, taking FIG. 6 as an example, the gear 601 is currently located at position a of the adjustment link 602 . In the embodiment of the present application, the position of the motor can be considered to be the position of the gear 601 .
- the motor may also be separated from the gear 601 , that is, the positions of the motor and the gear 601 are different, which is not limited in this application.
- the processor only needs to record the digital control signal corresponding to the position of the motor. When subsequent restoration is required, the processor may control the motor to restore the focusing module to the current state based on the recorded control signal.
- one end of the adjustment link 602 is connected to at least one optical lens in the optical lens group, for example, connected to the optical lens 604 .
- the motor controls the gear 601 to move on the adjusting connecting rod 602 based on the control signal output by the processor, so that the adjusting connecting rod 602 and the gear 601 can move relative to each other.
- the movement of the adjusting link 602 can make the optical lens 604 move, so as to adjust the horizontal distance between the optical lens 604 and the optical lens 603, so as to realize the adjustment of the diopter of the optical lens group.
- the user is currently wearing a wearable device (eg, a VR helmet or VR glasses).
- the proximity light sensor can detect that the user is wearing the wearable device, and send the user wearing instruction to the processor.
- the processor records the current position of the motor in response to the received user wearing instruction.
- the position of the motor is the position corresponding to the gear 601
- the gear 601 is located at the position a of the adjustment link 602 .
- the distance between the optical lens 604 and the optical lens 603 is the distance a.
- the light is focused on the screen 605 through the optical lens group to form a light spot a.
- the position of the motor (or gear) recorded by the processor can also be understood as the relative position between the gear and the adjustment link.
- the absolute position of the gear is unchanged; the rotation of its teeth drives the change in relative position between the gear and the adjustment link.
- the processor controls the focusing module to adjust to the maximum diopter state.
- the processor may pre-configure the maximum diopter state.
- the maximum diopter state may be understood as the state of the focusing module when the diopter of the optical lens group is at the maximum that can be achieved. It can be understood that the processor is pre-configured with a designated position; when the gear rotates so that it reaches the designated position on the adjustment link, the diopter of the optical lens group is at its maximum value, and the state of the focusing module at this time is the maximum diopter state.
- FIG. 7 is an exemplary schematic diagram of position transformation of the focusing module.
- the processor can obtain the current state of the focusing module, that is, the gear 601 is at the position a of the adjusting link 602 .
- the processor can also obtain a pre-configured designated position, such as position b shown in FIG. 7 . That is, the processor is pre-configured with the designated position b, and the gear 601 is expected to rotate so as to change the relative position between the gear 601 and the adjustment link 602 to position b.
- the processor can obtain the displacement between position a and position b based on the current state of the focusing module and the pre-configured designated position.
- the processor can control the rotation of the gear 601 through the control signal, so that the adjustment link moves laterally, so as to change the relative position between the gear 601 and the adjustment link until the position b is reached.
- the processor can record the changed displacement and the corresponding number of pulse signals for subsequent recovery process.
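The record-and-replay idea — count the pulses used to reach the designated position so the move can later be undone — might look like the sketch below. The step size and position values are assumed, not taken from the patent:

```python
class FocusController:
    """Sketch of a stepper-motor focusing module that records the pulse
    count of the move to the max-diopter position and replays it in
    reverse to restore the previous state."""

    STEP_MM = 0.05  # assumed link travel per motor pulse

    def __init__(self, position_mm: float) -> None:
        self.position_mm = position_mm   # current gear position on the link
        self.recorded_pulses = 0

    def _move_to(self, target_mm: float) -> int:
        """Drive the motor to the target position; return the pulse count used."""
        pulses = round((target_mm - self.position_mm) / self.STEP_MM)
        self.position_mm += pulses * self.STEP_MM
        return pulses

    def enter_max_diopter(self, designated_mm: float) -> None:
        # Record the pulses used so the move can be undone later.
        self.recorded_pulses = self._move_to(designated_mm)

    def restore(self) -> None:
        # Drive the motor the recorded number of pulses in the opposite direction.
        self.position_mm -= self.recorded_pulses * self.STEP_MM
        self.recorded_pulses = 0
```

A servo-motor variant would record pulse durations instead of pulse counts, but the save/replay structure is the same.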
- the adjustment link 602 is moved laterally so that the optical lens 604 to which it is attached also moves laterally. Since the position of the optical lens 603 remains unchanged, the distance between the optical lens 604 and the optical lens 603 increases.
- the gear 601 is located at the position b of the adjusting link 602
- the distance between the optical lens 604 and the optical lens 603 is the distance b.
- the distance b is greater than the distance a.
- the diopter of the optical lens group increases.
- the ambient light is refracted to the screen 605 through the optical lens group, and a light spot b is formed on the screen 605 due to the increase of the diopter.
- the spot b is larger than the spot a.
- the relative positions of the optical lenses in the optical lens group are adjusted to change the diopter of the optical lens group. After the diopter of the optical lens group is increased, the ambient light cannot be focused on the screen, so as to achieve the effect of sun protection.
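As a rough illustration of how lens spacing enters the diopter calculation, the combined power of two thin lenses in air follows the standard formula P = P1 + P2 − d·P1·P2. Note that whether increasing the spacing d raises or lowers the combined power depends on the signs of the individual lens powers, so the values below are purely illustrative and not taken from the patent:

```python
def combined_power_diopters(p1: float, p2: float, d_m: float) -> float:
    """Combined power of two thin lenses in air separated by d meters:
    P = P1 + P2 - d * P1 * P2  (all powers in diopters)."""
    return p1 + p2 - d_m * p1 * p2
```

For example, with a converging/diverging pair (P1 = +10 D, P2 = −5 D) the combined power grows as the spacing increases, while for two converging lenses (P1 = P2 = +10 D) it shrinks — the direction of the diopter change is a property of the particular optical lens group, as the manufacturer examples below also note.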
- the method of adjusting the relative positions of the lenses in the optical lens group by adjusting the connecting rods shown in FIG. 7 is only a schematic example.
- the relative positions of the optical lenses in the optical lens group can also be adjusted in other feasible ways, so as to increase the diopter of the optical lens group and achieve the purpose of sun protection.
- the designated positions described in the embodiments of the present application are only illustrative examples. Based on the characteristics of different optical lens groups, the relative positions corresponding to different diopters may differ. For example, for the optical lens group produced by manufacturer A, the diopter of the optical lens group is largest when the gear 601 is located at the leftmost position of the adjustment link 602; that leftmost position is therefore the preset designated position corresponding to the maximum diopter state of the focusing module. For the optical lens group produced by manufacturer B, the diopter is largest when the gear 601 is located at the rightmost position of the adjustment link 602; that rightmost position is the preset designated position corresponding to the maximum diopter state. Therefore, the designated position can be set according to actual needs, which is not limited in this application.
- the processor starts a timer after receiving the non-wearing indication.
- the time duration of the timer can be set according to actual needs, for example, it can be 1 minute, which is not limited in this application.
- if the processor does not receive a user wearing indication before the timer expires, the processor may determine that the device is not in use, and may then execute S103 and subsequent steps, that is, enable the sun protection mode. This avoids frequently turning the sun protection mode on and off when the user stops and resumes using the device within a short time.
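The timer logic described above — start a timer on the non-wearing indication, cancel it if a wearing indication arrives, and enable the sun protection mode only when the timer expires — can be sketched as below. The 60-second timeout follows the example in the text; the class and method names are illustrative:

```python
class NotWornDebouncer:
    """Debounce sketch for the sun protection mode: the adjustment to the
    max-diopter state is deferred by a timer so that briefly taking the
    device off does not toggle the mode."""

    def __init__(self, timeout_s: float = 60.0) -> None:
        self.timeout_s = timeout_s
        self.deadline_s = None          # expiry time of the running timer, if any
        self.sun_protection_on = False

    def on_not_worn(self, now_s: float) -> None:
        self.deadline_s = now_s + self.timeout_s  # start the timer

    def on_worn(self) -> None:
        self.deadline_s = None  # user put the device back on: cancel the timer

    def tick(self, now_s: float) -> None:
        if self.deadline_s is not None and now_s >= self.deadline_s:
            self.sun_protection_on = True  # proceed with S103 and later steps
            self.deadline_s = None
```

In a real device `tick` would be driven by a hardware timer interrupt rather than explicit timestamps.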
- the sensor sends a user wearing indication to the processor.
- the sensor optionally detects the wearing condition of the wearable device in real time or periodically. That is to say, during the execution of the above steps, the sensor continuously detects the wearing condition of the wearable device.
- the sensor may send a user wearing indication to the processor, which is used to indicate that the user is wearing the wearable device.
- the processor controls the focusing module to restore to the previous state.
- the processor may restore the focusing module to the pre-transformation state based on the maximum diopter state of the focusing module and the pre-transformation state.
- FIG. 8 is an exemplary schematic diagram of position transformation of the focusing module, so as to illustrate the recovery process of the focusing module.
- the processor records the number of pulse signals corresponding to the previous movement of the gear 601 from the position a to the position b of the adjustment link.
- the processor can control the motor to drive the gear 601 to rotate based on the number of recorded pulse signals, so that the gear 601 is restored from the position b on the adjustment link to the position a.
- the optical lens 604 moves with the adjustment link, and the distance between the optical lens 604 and the optical lens 603 is restored from the distance b to the distance a, so as to restore the diopter of the optical lens group.
- the ambient light passes through the optical lens group and is focused on the screen 605 to form a light spot a.
- the wearable device control method shown in FIG. 5 is implemented based on the detection of the wearing state by the sensor in the wearable device.
- in some embodiments, the wearable device does not include a sensor that can detect the wearing state (for example, it does not include a proximity light sensor). In that case, the trigger condition of the sun protection mode may be based on the switch state of the wearable device.
- the wearable device may be provided with a switch for turning the wearable device on and off.
- when the processor detects that the switch of the wearable device is turned off, that is, the wearable device changes from the on state to the off state, the processor can perform the above S103 and S104: record the current state of the focusing module and adjust the focusing module to the maximum diopter state.
- when the processor detects that the switch of the wearable device is turned on, that is, the wearable device changes from the off state to the on state, the processor can perform the above S106, that is, control the focusing module to restore to the state before shutdown. That is to say, in this embodiment of the present application, turning the sun protection mode on and off can be triggered by detecting the user's power-on and power-off operations.
- the electronic device includes corresponding hardware and/or software modules for executing each function.
- the present application can be implemented in hardware, or in a combination of hardware and computer software, in conjunction with the algorithm steps of the examples described in the embodiments disclosed herein. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functionality for each particular application, but such implementations should not be considered beyond the scope of this application.
- FIG. 9 shows a schematic block diagram of an apparatus 900 according to an embodiment of the present application.
- the apparatus 900 may include: a processor 901 , a transceiver/transceiver pin 902 , and optionally, a memory 903 .
- bus 904 includes a power bus, a control bus and a status signal bus in addition to a data bus.
- the various buses are referred to as bus 904 in the figures.
- the memory 903 may be used to store the instructions in the foregoing method embodiments.
- the processor 901 can be used to execute the instructions in the memory 903, and control the receive pins to receive signals, and control the transmit pins to transmit signals.
- the apparatus 900 may be the electronic device or the chip of the electronic device in the above method embodiments.
- This embodiment also provides a computer storage medium, where computer instructions are stored in the computer storage medium, and when the computer instructions are executed on the electronic device, the electronic device executes the above-mentioned relevant method steps to implement the methods in the above-mentioned embodiments.
- This embodiment also provides a computer program product, which when the computer program product runs on a computer, causes the computer to execute the above-mentioned relevant steps, so as to implement the method in the above-mentioned embodiment.
- the embodiments of the present application also provide an apparatus, which may specifically be a chip, a component, or a module; the apparatus may include a processor and a memory connected to each other, where the memory is used to store computer-executable instructions. When the apparatus runs, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the methods in the foregoing method embodiments.
- the electronic device, computer storage medium, computer program product, or chip provided in this embodiment are all used to execute the corresponding methods provided above. Therefore, for the beneficial effects that can be achieved, reference can be made to the beneficial effects of the corresponding methods provided above, which will not be repeated here.
- the disclosed apparatus and method may be implemented in other manners.
- the apparatus embodiments described above are only illustrative.
- the division of modules or units is only a logical function division. In actual implementation, there may be other division methods.
- multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
- the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
- Units described as separate components may or may not be physically separated, and components shown as units may be one physical unit or multiple physical units, that is, may be located in one place, or may be distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
- each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
- the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
- the integrated unit if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium.
- a readable storage medium includes several instructions to make a device (which may be a single-chip microcomputer, a chip, etc.) or a processor execute all or part of the steps of the methods in the various embodiments of the present application.
- the aforementioned storage medium includes: a USB flash drive, a removable hard disk, read-only memory (ROM), random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.
- the steps of the method or algorithm described in conjunction with the disclosure of the embodiments of this application may be implemented in hardware, or by a processor executing software instructions.
- Software instructions may be composed of corresponding software modules, and software modules may be stored in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, a hard disk, a removable hard disk, a CD-ROM, or any other form of storage medium known in the art.
- An exemplary storage medium is coupled to the processor, such that the processor can read information from, and write information to, the storage medium.
- the storage medium can also be an integral part of the processor.
- the processor and storage medium may reside in an ASIC.
- Computer-readable media include both computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another.
- a storage medium can be any available medium that can be accessed by a general purpose or special purpose computer.
Claims (12)
- A control method for a wearable device, comprising: acquiring a wearing state of the wearable device; and when the wearable device changes from a worn state to an unworn state, adjusting a first diopter of an optical lens group of the wearable device to a second diopter, wherein the second diopter is greater than the first diopter.
- The method according to claim 1, further comprising: acquiring the wearing state of the wearable device; and when the wearable device changes from the unworn state to the worn state, adjusting the second diopter of the optical lens group of the wearable device to the first diopter.
- The method according to claim 1, wherein the second diopter is a maximum diopter achievable by the optical lens group.
- The method according to claim 1, wherein the adjusting the first diopter of the optical lens group of the wearable device to the second diopter when the wearable device changes from the worn state to the unworn state comprises: when the wearable device changes from the worn state to the unworn state, acquiring a first state of a focusing module of the wearable device, wherein the focusing module is configured to control the first diopter of the optical lens group, and when the focusing module is in the first state, the diopter of the optical lens group is the first diopter; and adjusting the focusing module to a second state, wherein when the focusing module is in the second state, the diopter of the optical lens group is the second diopter.
- The method according to claim 1, wherein the adjusting the first diopter of the optical lens group of the wearable device to the second diopter when the wearable device changes from the worn state to the unworn state comprises: when the wearable device changes from the worn state to the unworn state and remains in the unworn state for a set duration, adjusting the first diopter of the optical lens group of the wearable device to the second diopter.
- An electronic device, comprising: one or more processors; a memory; and one or more computer programs, wherein the one or more computer programs are stored in the memory, and when the computer programs are executed by the one or more processors, the electronic device is caused to perform the following steps: acquiring a wearing state of a wearable device; and when the wearable device changes from a worn state to an unworn state, adjusting a first diopter of an optical lens group of the wearable device to a second diopter, wherein the second diopter is greater than the first diopter.
- The device according to claim 6, wherein when the computer programs are executed by the one or more processors, the electronic device is caused to perform the following steps: acquiring the wearing state of the wearable device; and when the wearable device changes from the unworn state to the worn state, adjusting the second diopter of the optical lens group of the wearable device to the first diopter.
- The device according to claim 6, wherein the second diopter is a maximum diopter achievable by the optical lens group.
- The device according to claim 6, wherein when the computer programs are executed by the one or more processors, the electronic device is caused to perform the following steps: when the wearable device changes from the worn state to the unworn state, acquiring a first state of a focusing module of the wearable device, wherein the focusing module is configured to control the first diopter of the optical lens group, and when the focusing module is in the first state, the diopter of the optical lens group is the first diopter; and adjusting the focusing module to a second state, wherein when the focusing module is in the second state, the diopter of the optical lens group is the second diopter.
- The device according to claim 6, wherein when the computer programs are executed by the one or more processors, the electronic device is caused to perform the following step: when the wearable device changes from the worn state to the unworn state and remains in the unworn state for a set duration, adjusting the first diopter of the optical lens group of the wearable device to the second diopter.
- A computer-readable storage medium, comprising a computer program which, when run on a computer, causes the computer to perform the method according to any one of claims 1 to 5.
- A chip, comprising one or more interface circuits and one or more processors, wherein the interface circuits are configured to receive signals from a memory of an electronic device and send the signals to the processors, the signals comprising computer instructions stored in the memory; and when the processors execute the computer instructions, the electronic device is caused to perform the method according to any one of claims 1 to 5.
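The claimed control flow (acquire the wearing state; when the device changes from worn to unworn and stays unworn for a set duration, raise the optical lens group's diopter to a larger protective value such as its achievable maximum; restore the in-use diopter once worn again) can be sketched as a small state machine. The sketch below is illustrative only: the names `DiopterController` and `on_wear_state`, and all numeric values, are assumptions that do not appear in the claims.

```python
from enum import Enum, auto

class WearState(Enum):
    WORN = auto()
    NOT_WORN = auto()

class DiopterController:
    """Illustrative sketch of the claimed diopter-control logic:
    when the wearable leaves the worn state and remains off for a set
    duration (claim 5), the lens group's diopter is raised from the
    in-use first diopter to a larger second diopter, e.g. the maximum
    (claim 3); when worn again, the first diopter is restored (claim 2)."""

    def __init__(self, first_diopter: float, max_diopter: float,
                 hold_seconds: float = 5.0):
        assert max_diopter > first_diopter  # claim 1: second > first
        self.first_diopter = first_diopter    # diopter used while worn
        self.second_diopter = max_diopter     # protective (larger) diopter
        self.hold_seconds = hold_seconds      # required unworn duration
        self.current_diopter = first_diopter
        self._not_worn_since = None           # timestamp when taken off

    def on_wear_state(self, state: WearState, now: float) -> float:
        """Feed a wear-state sample with its timestamp; returns the
        diopter the focusing module should apply."""
        if state == WearState.NOT_WORN:
            if self._not_worn_since is None:
                self._not_worn_since = now
            elif now - self._not_worn_since >= self.hold_seconds:
                # held unworn long enough: switch to protective diopter
                self.current_diopter = self.second_diopter
        else:
            # worn again: cancel the timer and restore the in-use diopter
            self._not_worn_since = None
            self.current_diopter = self.first_diopter
        return self.current_diopter
```

One design point the hold timer captures: briefly putting the headset down and picking it back up within `hold_seconds` never triggers the adjustment, so the user does not wait for refocusing on every short interruption.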
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/553,540 US20240231038A9 (en) | 2021-04-08 | 2022-04-02 | Control Method for Wearable Device, and Electronic Device |
EP22784003.0A EP4300164A4 (en) | 2021-04-08 | 2022-04-02 | METHOD FOR CONTROLLING WEARABLE DEVICE AND ELECTRONIC DEVICE |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110379909.2A CN115202042A (zh) | 2021-04-08 | 2021-04-08 | Control method for wearable device, and electronic device |
CN202110379909.2 | 2021-04-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022213937A1 (zh) | 2022-10-13 |
Family
ID=83545979
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/085118 WO2022213937A1 (zh) | Control method for wearable device, and electronic device | 2021-04-08 | 2022-04-02 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240231038A9 (zh) |
EP (1) | EP4300164A4 (zh) |
CN (1) | CN115202042A (zh) |
WO (1) | WO2022213937A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115826250A (zh) * | 2023-02-14 | 2023-03-21 | 南昌龙旗智能科技有限公司 | Lens module adjustment method and apparatus for VR device |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4766684A (en) * | 1987-04-10 | 1988-08-30 | Wah Lo Allen K | Lenticular screen for outdoor display |
CN208283654U (zh) * | 2018-05-30 | 2018-12-25 | 潍坊歌尔电子有限公司 | Head-mounted display device |
CN208999665U (zh) * | 2018-11-14 | 2019-06-18 | 深圳创维新世界科技有限公司 | Light control apparatus and virtual reality head-mounted display device |
CN110727114A (zh) * | 2019-10-31 | 2020-01-24 | 歌尔股份有限公司 | Head-mounted display device |
CN211786375U (zh) * | 2020-04-02 | 2020-10-27 | 成都忆光年文化传播有限公司 | Near-eye display device |
CN212483983U (zh) * | 2020-04-27 | 2021-02-05 | 歌尔光学科技有限公司 | Display module and virtual reality device |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180239143A1 (en) * | 2017-02-20 | 2018-08-23 | Google Llc | Protecting a display from external light sources |
US10444800B2 (en) * | 2017-03-15 | 2019-10-15 | Scott Sullivan | System for protecting headset components from sunlight |
US10859832B1 (en) * | 2018-05-29 | 2020-12-08 | Facebook Technologies, Llc | Mitigating light exposure to elements of a focus adjusting head mounted display |
KR20200032467A (ko) | 2018-09-18 | 2020-03-26 | 삼성전자주식회사 | Electronic device including optical member capable of adjusting transmittance of external light, and operating method thereof |
CN209281076U (zh) * | 2018-11-05 | 2019-08-20 | 青岛小鸟看看科技有限公司 | Virtual reality device |
CN109298531A (zh) * | 2018-11-14 | 2019-02-01 | 深圳创维新世界科技有限公司 | Light control apparatus, virtual reality head-mounted display device, and light control method |
CN111487773B (zh) * | 2020-05-13 | 2022-08-19 | 歌尔科技有限公司 | Head-mounted device adjustment method, head-mounted device, and computer-readable storage medium |
- 2021
  - 2021-04-08: CN CN202110379909.2A patent/CN115202042A/zh active Pending
- 2022
  - 2022-04-02: US US18/553,540 patent/US20240231038A9/en active Pending
  - 2022-04-02: EP EP22784003.0A patent/EP4300164A4/en active Pending
  - 2022-04-02: WO PCT/CN2022/085118 patent/WO2022213937A1/zh active Application Filing
Non-Patent Citations (1)
Title |
---|
See also references of EP4300164A4 |
Also Published As
Publication number | Publication date |
---|---|
CN115202042A (zh) | 2022-10-18 |
EP4300164A4 (en) | 2024-10-02 |
EP4300164A1 (en) | 2024-01-03 |
US20240134148A1 (en) | 2024-04-25 |
US20240231038A9 (en) | 2024-07-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11831977B2 (en) | Photographing and processing method and electronic device | |
US11595566B2 (en) | Camera switching method for terminal, and terminal | |
CN109582141B (zh) | Method for controlling display screen according to eyeball focus, and head-mounted electronic device | |
US11800221B2 (en) | Time-lapse shooting method and device | |
KR102524498B1 (ko) | Electronic device including dual camera, and method for controlling dual camera | |
US20220030105A1 (en) | Screenshot Generating Method, Control Method, and Electronic Device | |
JP2022522453A (ja) | Recording frame rate control method and related apparatus | |
WO2020093833A1 (zh) | Control method for proximity light sensor, and electronic device | |
WO2022100685A1 (zh) | Drawing command processing method and related device | |
WO2022037726A1 (zh) | Split-screen display method and electronic device | |
WO2022001258A1 (zh) | Multi-screen display method and apparatus, terminal device, and storage medium | |
WO2022170854A1 (zh) | Video call method and related device | |
WO2022213937A1 (zh) | Control method for wearable device, and electronic device | |
WO2022262291A1 (zh) | Image data invoking method and system for application, electronic device, and storage medium | |
CN117995137B (zh) | Method for adjusting color temperature of display screen, electronic device, and related medium | |
CN116069223B (zh) | Anti-shake method, anti-shake apparatus, and wearable device | |
CN116069222B (zh) | Method and apparatus for identifying focus view, and wearable device | |
US20240111475A1 (en) | Screen mirroring method, apparatus, device, and storage medium | |
CN116204059B (zh) | Frame rate adjustment method and apparatus for eye tracking | |
WO2022206659A1 (zh) | Screen projection method and related apparatus | |
WO2020029213A1 (zh) | Method for answering and hanging up a call when SRVCC handover occurs during the call | |
CN110543305A (zh) | Method and apparatus for replacing EasyUI components | |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22784003; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2022784003; Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 18553540; Country of ref document: US |
ENP | Entry into the national phase | Ref document number: 2022784003; Country of ref document: EP; Effective date: 20230928 |
NENP | Non-entry into the national phase | Ref country code: DE |