WO2023040562A1 - Information display method, near-eye display device and electronic device - Google Patents

Information display method, near-eye display device and electronic device

Info

Publication number
WO2023040562A1
WO2023040562A1 · PCT/CN2022/113110 · CN2022113110W
Authority
WO
WIPO (PCT)
Prior art keywords
image
information
display device
eye display
area
Prior art date
Application number
PCT/CN2022/113110
Other languages
English (en)
French (fr)
Inventor
林鼎豪
陈碧莹
张宇
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2023040562A1 publication Critical patent/WO2023040562A1/zh

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Definitions

  • the present application relates to the technical field of near-eye display devices, and in particular to an information display method, near-eye display devices and electronic devices.
  • Near-eye display devices such as smart glasses, as a combination of the latest IT technology and the functions of traditional glasses, are becoming more and more popular among consumers due to their advantages of portability, ease of use, and rich functions.
  • Embodiments of the present application provide an information display method, a near-eye display device, and an electronic device, which can improve the display effect of the near-eye display device.
  • An embodiment of the present application provides an information display method, which is applied to a near-eye display device, and the information display method includes:
  • a first image is displayed, the first image being part of a source image; in response to an image display update instruction, posture information of the near-eye display device is acquired; and a second image is displayed according to the posture information, the second image is part of the source image, and the second image is different from the first image.
  • the present application also provides an information display method, which is applied to an electronic device storing a source image, and the method includes:
  • posture information of a near-eye display device is acquired, and a second image is determined according to the posture information, the second image is a part of the source image, and the second image is different from a first image;
  • the second image is sent to a near-eye display device.
  • the present application also provides a near-eye display device, the near-eye display device is used for performing the above-mentioned information display method.
  • the present application also provides a near-eye display device, the near-eye display device comprising:
  • display means for displaying a first image, the first image being part of the source image
  • a touch module for responding to an image display update command
  • an attitude sensor configured to acquire attitude information of the near-eye display device
  • the display device is further configured to display a second image according to the posture information, the second image is a part of the source image, and the second image is different from the first image.
  • the present application also provides an electronic device, which is configured to execute the above information display method.
  • FIG. 1 is a schematic diagram of a first structure of a near-eye display device provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a second structure of a near-eye display device provided by an embodiment of the present application.
  • FIG. 3 is a schematic flowchart of a first information display method provided by an embodiment of the present application.
  • FIG. 4 is a first schematic diagram of a source image provided by an embodiment of the present application.
  • FIG. 5 is a second schematic diagram of a source image provided by an embodiment of the present application.
  • FIG. 6 is a third schematic diagram of a source image provided by an embodiment of the present application.
  • FIG. 7 is a first view of a rotating scene of a near-eye display device provided by an embodiment of the present application.
  • Fig. 8 is a second view of a rotating scene of the near-eye display device provided by the embodiment of the present application.
  • FIG. 9 is a schematic diagram of a first display of a near-eye display device provided by an embodiment of the present application.
  • FIG. 10 is a second display schematic diagram of the near-eye display device provided by an embodiment of the present application.
  • Fig. 11 is a third display schematic diagram of the near-eye display device provided by the embodiment of the present application.
  • FIG. 12 is a second schematic flowchart of the information display method provided by the embodiment of the present application.
  • FIG. 13 is a schematic diagram of a first application scenario of a near-eye display device provided by an embodiment of the present application.
  • FIG. 14 is a schematic diagram of a second application scenario of a near-eye display device provided by an embodiment of the present application.
  • FIG. 15 is a schematic diagram of a third application scenario of a near-eye display device provided by an embodiment of the present application.
  • FIG. 16 is a schematic diagram of a fourth application scenario of a near-eye display device provided by an embodiment of the present application.
  • FIG. 17 is a schematic diagram of a fifth application scenario of the near-eye display device provided by the embodiment of the present application.
  • FIG. 18 is a schematic flowchart of a third information display method provided by an embodiment of the present application.
  • FIG. 19 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • An embodiment of the present application provides an information display method, which is applied to a near-eye display device, and the method includes:
  • a first image is displayed, the first image being part of a source image; in response to an image display update instruction, posture information of the near-eye display device is acquired; and a second image is displayed according to the posture information, the second image is part of the source image, and the second image is different from the first image.
  • an area of the second image in the source image has an intersection with an area of the first image in the source image.
  • the second image includes a first sub-image area and a second sub-image area, the second sub-image area of the second image is set adjacent to the first sub-image area, and the first sub-image area is located in the intersection area of the second image and the first image in the source image.
  • the second sub-image area of the second image is set adjacent to an edge side of the first image.
  • the near-eye display device includes a visible area
  • the displaying the second image according to the posture information includes:
  • a target area is determined from the source image according to the posture information, and a second image corresponding to the target area is displayed in the visible area.
  • the sizes of the first image and the second image displayed in the visible area are the same.
  • the second image includes auxiliary display information
  • the auxiliary display information includes one or more of parameter information of the near-eye display device and/or parameter information of an electronic device connected to the near-eye display device.
  • the parameter information includes one or more combinations of power parameters, network parameters, time parameters, audio playback parameters, display parameters, and notification message parameters.
  • after the first image is displayed, the information display method further includes: in response to an image display update instruction, acquiring the posture information of the near-eye display device, and displaying the second image according to the posture information.
  • the step of acquiring the posture information of the near-eye display device, and displaying the second image according to the posture information specifically includes:
  • a target area is determined from the source image according to the posture information, and a second image corresponding to the target area is displayed in the visible area.
  • the information display method further includes:
  • in response to an image reset instruction, the near-eye display device switches from displaying the second image to displaying the first image.
  • the trigger signal of the image display update instruction includes at least one of a touch signal, a voice signal, an image signal, and an action signal.
  • the near-eye display device includes a wearing component, the wearing component is provided with a touch module, and the touch module is used to receive the touch signal; when the touch module receives the touch signal of the image display update instruction, the near-eye display device responds to the image display update instruction and acquires the posture information of the near-eye display device.
  • the embodiment of the present application also provides an information display method, which is applied to an electronic device storing a source image, and the method includes:
  • posture information of a near-eye display device is acquired, and a second image is determined according to the posture information, the second image is a part of the source image, and the second image is different from a first image;
  • the second image is sent to a near-eye display device.
  • an area of the second image in the source image has an intersection with an area of the first image in the source image.
  • the second image includes a first sub-image area and a second sub-image area, the second sub-image area of the second image is set adjacent to the first sub-image area, and the first sub-image area is located in the intersection area of the second image and the first image in the source image.
  • the second sub-image area of the second image is set adjacent to an edge side of the first image.
  • the determining the second image according to the posture information includes: determining a target area from the source image according to the posture information, and determining the second image corresponding to the target area.
  • Sending the second image to the near-eye display device includes:
  • the near-eye display device includes a visible area, and the second image corresponding to the target area is sent to the near-eye display device, so that the near-eye display device displays the second image corresponding to the target area in the visible area. image.
  • the sizes of the first image and the second image displayed in the visible area are the same.
  • the second image includes auxiliary display information
  • the auxiliary display information includes one or more of parameter information of the electronic device and/or parameter information of the near-eye display device connected to the electronic device.
  • the parameter information includes one or more combinations of power parameters, network parameters, time parameters, audio playback parameters, display parameters, and notification message parameters.
  • Acquiring the posture information of the near-eye display device, determining a second image according to the posture information, and sending the second image to the near-eye display device includes:
  • after the second image is sent to the near-eye display device, the information display method further includes:
  • in response to an image reset instruction, the first image is sent to the near-eye display device.
  • An embodiment of the present application further provides a near-eye display device, and the near-eye display device is configured to execute the information display method as described above.
  • the embodiment of the present application also provides a near-eye display device, and the near-eye display device includes:
  • display means for displaying a first image, the first image being part of the source image
  • a touch module for responding to an image display update command
  • an attitude sensor configured to acquire attitude information of the near-eye display device
  • the display device is further configured to display a second image according to the posture information, the second image is a part of the source image, and the second image is different from the first image.
  • the near-eye display device further includes:
  • the display device includes a visible area, the visible area is set on the wearing component, and the visible area is used to display the second image; the trigger module includes a touch button arranged on the wearing component for receiving a trigger signal of the image display update instruction.
  • An embodiment of the present application further provides an electronic device, the electronic device is configured to execute the information display method as described above.
  • the near-eye display device can be a smart near-eye display device, such as smart glasses, smart headgear, smart helmet, and other smart near-eye display devices.
  • the smart glasses can be AR (Augmented Reality) glasses or VR (Virtual Reality) glasses, etc.
  • near-eye display devices usually include some electronic devices such as power supplies, display devices, wearing components, and sensors.
  • the near-eye display device can realize a preset function according to a user's operation, such as displaying a picture through a display device.
  • FIG. 1 is a schematic diagram of a first structure of a near-eye display device provided in an embodiment of the present application.
  • the near-eye display device 10 may include a wearing assembly 11 and a display device 12. The display device 12 may include a display 121 and a projector 124. The display 121 may be a lens, and the lens may take the form of a waveguide lens with a grating structure, a plano lens, a sunglasses lens, a prescription lens, a prescription sunglasses lens or another form; prescription lenses or prescription sunglasses lenses refer to lenses fitted according to a prescription (or optometry report).
  • the projector 124 can make the display 121 display image information through different display modes, such as fiber scanning display (FSD), digital light processing (DLP) or laser beam scanning (LBS); the projector 124 can be fixedly installed on the wearing assembly 11, or detachably connected to the wearing assembly 11.
  • the display 121 and the projector 124 can also be integrated, and the integrated display device 12 can be fixedly installed on the wearing component 11 or detachably connected to the wearing component 11; the user can directly view the image information through the integrated display device 12 without lenses.
  • the wearing assembly 11 can be used as the frame structure of the near-eye display device 10.
  • the wearing assembly 11 can also include a spectacle frame 111 and temples 112.
  • the spectacle frame 111 can be provided with the display 121 (lens) as described above to display image information.
  • alternatively, the lens may not be provided, or only ordinary lenses may be provided, and the image information is displayed through the integrated display device; the user wears the display device on the head through the temples 112 and directly views the image information displayed on the display device rather than through the lenses.
  • the temples 112 are respectively connected to the opposite sides of the frame 111.
  • the user can wear the near-eye display device 10 on the user's head through the temples 112.
  • the near-eye display device 10 can also be worn on the user's head through other wearing components, such as elastic, magnetic or adhesive connecting straps or connecting buckles.
  • the near-eye display device includes a visual area, please refer to FIG. 2 , which is a second structural schematic diagram of the near-eye display device provided by the embodiment of the present application.
  • the visible area 122 is the area in which the user's eyes can see the displayed image information, and it is related to the field of view (Field of View, FOV) of the near-eye display device.
  • the FOV of a display device of a near-eye display device such as AR glasses may be 20°, 40°, 60°, 80°, or the like.
  • the size of the FOV of the display device is related to the structure of the display and the projector.
  • the FOV of the display device is related to the structure of the waveguide lens.
  • the viewing area of the display device 12 for displaying image information is limited and cannot present all of the image information of the source image at the same time.
  • this application provides an information display method that can improve the display effect of the visible area of near-eye display devices.
  • FIG. 3 is a schematic flowchart of the first information display method provided by an embodiment of the present application.
  • the near-eye display device can store a source image, and the source image can include the image information to be displayed by the display device.
  • the source image can also be generated in a server or other electronic device connected to the near-eye display device and sent to the near-eye display device through wired or wireless data transmission.
  • Other electronic devices may be electronic devices such as smart phones, tablet computers, PDAs (Personal Digital Assistant), smart watches, and smart bracelets.
  • the first image may be the image information that the user currently needs to view.
  • the first image may include application information, communication information, audio information, and video information of the near-eye display device or an electronic device connected to the near-eye display device.
  • the source image can also include other information, such as auxiliary display information that the user does not need to view in real time.
  • the auxiliary display information can be parameter information of the near-eye display device and/or parameter information of electronic devices connected to the near-eye display device, such as power parameters, network parameters, time parameters, audio playback parameters, display parameters, etc.
  • the position of the auxiliary display information in the source image may be around the position of the first image, such as being set on one side of the edge of the first image, or set around the edge of the first image.
  • Figure 4 is the first schematic diagram of the source image provided by the embodiment of the present application
  • Figure 5 is the second schematic diagram of the source image provided by the embodiment of the present application
  • Figure 6 is the third schematic diagram of the source image provided by the embodiment of the present application.
  • the first image includes application information related to the fitness application.
  • the auxiliary display information includes power parameters and time parameters
  • the auxiliary display information includes power parameters, time parameters and network parameters.
  • the auxiliary display information includes power parameters, network parameters, time parameters, audio playback parameters, display parameters and notification message parameters, where the audio playback parameters can include volume parameters and the playback parameters of the current audio application.
  • display parameters may include display brightness parameters.
  • the first image and auxiliary display information in the illustration are only exemplary; the first image can be set according to actual needs as the image information that the user currently needs to view, and the auxiliary display information can be set according to actual needs as information that does not need to be viewed in real time.
  • the image display update instruction may be generated according to a trigger signal
  • the trigger signal may be a trigger signal triggered by a user through a trigger module
  • the trigger signal may include at least one of a touch signal, a voice signal, an image signal and an action signal.
  • the trigger module can be a touch sensor for acquiring a touch signal, and the touch signal can be a pressing signal and/or a sliding signal.
  • the trigger module can also be an audio collection sensor, such as a microphone, for collecting voice signals, and the voice signal can be an audio signal collected through the microphone that meets preset conditions.
  • the trigger module can also be an image acquisition sensor, such as a camera, for collecting image signals, and the image signal is an image signal collected by the camera that meets preset conditions.
  • the trigger module can also be a motion collection sensor, such as a posture sensor, and the action signal can be a motion signal collected by the posture sensor that satisfies a preset condition. It can be understood that other types of trigger modules and corresponding trigger signals can also be set according to actual needs. If the above trigger signal is received, an image display update instruction is generated according to the trigger signal, and the posture information of the near-eye display device is acquired.
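  • as an illustration of the above, the following sketch shows how a trigger signal of any of these kinds could be checked against a preset condition before an image display update instruction is generated; the class names and threshold values are assumptions made for the example and are not taken from the embodiments.

```python
# Hypothetical sketch: turning a trigger signal into an image display update
# instruction once its preset condition is satisfied. Names and thresholds
# are illustrative only.
from dataclasses import dataclass
from enum import Enum, auto


class TriggerKind(Enum):
    TOUCH = auto()   # pressing / sliding signal from a touch sensor
    VOICE = auto()   # audio signal collected by a microphone
    IMAGE = auto()   # image signal collected by a camera
    MOTION = auto()  # action signal collected by the posture sensor


@dataclass
class TriggerSignal:
    kind: TriggerKind
    value: float  # e.g. press pressure, recognition confidence, motion magnitude


# Assumed per-signal "preset conditions".
PRESET_THRESHOLDS = {
    TriggerKind.TOUCH: 0.1,
    TriggerKind.VOICE: 0.8,
    TriggerKind.IMAGE: 0.8,
    TriggerKind.MOTION: 0.5,
}


def update_instruction_generated(signal: TriggerSignal) -> bool:
    """Return True when the trigger signal meets its preset condition, i.e.
    an image display update instruction is generated and the device should
    go on to read the posture sensor."""
    return signal.value >= PRESET_THRESHOLDS[signal.kind]


if __name__ == "__main__":
    press = TriggerSignal(TriggerKind.TOUCH, value=0.6)
    if update_instruction_generated(press):
        print("image display update instruction generated; acquiring posture information")
```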
  • the near-eye display device may include an attitude sensor, and the attitude sensor may collect attitude information of the near-eye display device.
  • Fig. 8 is a second view of a rotating scene of the near-eye display device provided by the embodiment of the present application.
  • the attitude sensor 13 is used to detect the attitude information of the near-eye display device 10 .
  • the attitude sensor 13 may include a gyroscope, an electronic compass, an acceleration sensor and/or a Hall sensor.
  • the attitude sensor 13 can realize 3-degree-of-freedom detection (3DOF) or 6-degree-of-freedom detection (6DOF) of the near-eye display device.
  • taking the case where the near-eye display device realizes 3DOF as an example, the near-eye display device can detect, through the attitude sensor 13, the attitude information of rotation about the first degree of freedom, the second degree of freedom and the third degree of freedom.
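  • purely as an illustration of 3DOF posture detection, the sketch below integrates hypothetical gyroscope angular rates over a short window to obtain the rotation of the device about its three degrees of freedom; the sample structure, field names and data are assumptions for the example.

```python
# Minimal sketch of 3DOF posture tracking: accumulate (hypothetical) gyroscope
# angular rates into yaw/pitch/roll rotation deltas. Units and names are
# illustrative assumptions.
from dataclasses import dataclass
from typing import Iterable


@dataclass
class GyroSample:
    dt: float          # seconds since the previous sample
    yaw_rate: float    # deg/s, rotation about the vertical axis (turning left/right)
    pitch_rate: float  # deg/s, rotation about the left-right axis (looking up/down)
    roll_rate: float   # deg/s, rotation about the front-back axis (tilting)


def integrate_rotation(samples: Iterable[GyroSample]) -> tuple[float, float, float]:
    """Accumulate angular rates into yaw/pitch/roll deltas in degrees."""
    yaw = pitch = roll = 0.0
    for s in samples:
        yaw += s.yaw_rate * s.dt
        pitch += s.pitch_rate * s.dt
        roll += s.roll_rate * s.dt
    return yaw, pitch, roll


if __name__ == "__main__":
    window = [GyroSample(0.01, 30.0, -5.0, 0.0)] * 50  # 0.5 s of turning the head
    print(integrate_rotation(window))  # approximately (15.0, -2.5, 0.0)
```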
  • the second image may include the above-mentioned auxiliary display information.
  • FIG. 9 is a first display schematic diagram of the near-eye display device provided by the embodiment of the present application
  • FIG. 10 is a second display schematic diagram of the near-eye display device provided by the embodiment of the present application.
  • the second image is part of the source image and is different from the first image; for example, as shown in Figure 10, the position of the second image in the source image is not the same as the position of the first image in the source image, and the region of the second image in the source image intersects the region of the first image in the source image.
  • the first image includes fitness application information
  • the second image includes auxiliary display information.
  • in an actual application scenario, when the user needs to view the power parameters in the auxiliary display information, the corresponding trigger signal can be triggered through the trigger module.
  • the attitude sensor collects the attitude information of the near-eye display device, and displays the power parameters that the user needs to view according to the attitude information.
  • the second image may include a part of the application information of the first image and the auxiliary display information, wherein the intersection of the area of the first image in the source image and the area of the second image in the source image is the partial area corresponding to the application information.
  • the second image includes a first sub-image area and a second sub-image area
  • the second sub-image area of the second image is disposed adjacent to the first sub-image area
  • the first sub-image area is located within the intersection region of the second image and the first image in the source image.
  • the first sub-image area may be an area including part of the application information
  • the second sub-image area may be an area including auxiliary display information
  • the first sub-image area is located within the intersection area of the second image and the first image in the source image.
  • the first sub-image area may completely overlap with the intersection area, or partially overlap with the intersection area.
  • the second sub-image area of the second image is set adjacent to one edge side of the first image; for example, as shown in FIG. 10, the second sub-image area of the second image is set adjacent to a single side of the first image.
  • the second sub-image area of the second image can also be set adjacent to two adjacent sides of the first image, as shown in Figure 11.
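  • the relationship between the first image, the second image and the two sub-image areas can be illustrated with simple rectangle arithmetic; the coordinates and sizes below are assumptions chosen only to make the overlap visible.

```python
# Sketch with made-up coordinates: the second image is a crop of the source
# image that partly overlaps the first image. The overlapping part is the
# "first sub-image area"; the newly revealed strip adjacent to the first
# image's edge is the "second sub-image area" carrying auxiliary information.
from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class Rect:
    x: int
    y: int
    w: int
    h: int

    def intersect(self, other: "Rect") -> Optional["Rect"]:
        x1, y1 = max(self.x, other.x), max(self.y, other.y)
        x2 = min(self.x + self.w, other.x + other.w)
        y2 = min(self.y + self.h, other.y + other.h)
        if x2 <= x1 or y2 <= y1:
            return None
        return Rect(x1, y1, x2 - x1, y2 - y1)


# Assumed layout: a 1280x720 source image and a 640x360 visible area.
first_image = Rect(320, 180, 640, 360)    # crop shown before the head turn
second_image = Rect(320, 100, 640, 360)   # crop shown after looking up slightly

first_sub_area = second_image.intersect(first_image)
print("first sub-image area (still-visible application info):", first_sub_area)
# The remaining 80-pixel band of the second image above the first image would
# be the second sub-image area, adjacent to the first image's top edge.
```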
  • before displaying the second image according to the posture information, the near-eye display device may not display the first image; for example, when the near-eye display device is in a low energy consumption mode or a low power state, the visible area is in a black-screen state.
  • in this case, the posture information of the near-eye display device is collected by the posture sensor and the second image is displayed according to the posture information, which can also satisfy the user's need to read parameters when the near-eye display device is in a low energy consumption mode or a low power mode and increases the readability of the auxiliary display information.
  • the information display method provided by the embodiment of the present application displays, on the near-eye display device in advance, the first image that is a part of the source image, obtains the posture information of the near-eye display device after responding to the image display update instruction, and displays, according to the posture information, the second image that is also a part of the source image.
  • the second image may include some information that does not need to be viewed in real time, and is displayed according to the posture information of the near-eye display device when viewing is required.
  • the display content is highly flexible, and the display effect of the near-eye display device is improved.
  • FIG. 12 is a second schematic flowchart of the information display method provided by the embodiment of the present application.
  • the information display method includes:
  • the near-eye display device can store a source image, and the source image can include the image information to be displayed by the display device.
  • the source image can also be generated in a server or other electronic device connected to the near-eye display device and sent to the near-eye display device through wired or wireless data transmission.
  • Other electronic devices may be electronic devices such as smart phones, tablet computers, PDAs (Personal Digital Assistant), smart watches, and smart bracelets.
  • the first image may be the image information that the user currently needs to view.
  • the first image may include application information, communication information, audio information, and video information of the near-eye display device or an electronic device connected to the near-eye display device.
  • the source image can also include other information, such as auxiliary display information that the user does not need to view in real time.
  • the auxiliary display information can be parameter information of the near-eye display device and/or parameter information of electronic devices connected to the near-eye display device, such as power parameters, network parameters, time parameters, audio playback parameters, display parameters, etc.
  • the position of the auxiliary display information in the source image may be around the position of the first image, such as being set on one side of the edge of the first image, or set around the edge of the first image.
  • Figure 4 is the first schematic diagram of the source image provided by the embodiment of the present application
  • Figure 5 is the second schematic diagram of the source image provided by the embodiment of the present application
  • Figure 6 is the third schematic diagram of the source image provided by the embodiment of the present application.
  • the first image includes application information related to the fitness application.
  • the auxiliary display information includes power parameters and time parameters
  • the auxiliary display information includes power parameters, time parameters and network parameters.
  • the auxiliary display information includes power parameters, network parameters, time parameters, audio playback parameters, display parameters and notification message parameters, where the audio playback parameters can include volume parameters and the playback parameters of the current audio application.
  • display parameters may include display brightness parameters.
  • the first image and auxiliary display information in the illustration are only exemplary; the first image can be set according to actual needs as the image information that the user currently needs to view, and the auxiliary display information can be set according to actual needs as information that does not need to be viewed in real time.
  • the source image may include information about fitness applications and auxiliary display information.
  • the auxiliary display information may include battery parameters, time parameters, and network parameters adjacent to the top of the fitness application.
  • the parameters that the user does not need to view in real time are hidden, and only the fitness application information that the user needs to view in real time is displayed in the visible area before the trigger signal of the image display update command is triggered.
  • the auxiliary display information may also be image information related to the display image of the first image, for example, the first image includes application information, and the auxiliary display information may include information related to the application information.
  • the first image information includes the main information of the fitness application
  • the auxiliary display information may be information related to the fitness application.
  • the first image includes the icon of the fitness application, current fitness items, calories burned, and heart rate information.
  • the auxiliary display information may include the fitness duration and the specific information of the current fitness item (for example, when the current fitness item is running, the specific information of the current fitness item is the number of kilometers run and the number of steps, etc.), and it can be generated according to the content of the first image.
  • the auxiliary display information may also include image information related to external image signals collected by the current near-eye display device, and the near-eye display device may also include a camera, which is used to collect external image information to realize interactive functions.
  • the first image generated after the interactive function can be the image information collected by the camera, and the auxiliary display information can be the recognized information obtained based on the content recognition of the first image.
  • for example, the user turns on the camera of the near-eye display device to collect external image information in real time; the image information is recognized by graphic and text recognition, face recognition, object recognition or scene recognition, etc. to obtain the recognized information, and the recognized information is set around the first image as auxiliary display information.
  • when the user needs to view the recognized information, it can be viewed by triggering the trigger signal of the image display update instruction.
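  • the sketch below illustrates how a camera frame could serve as the first image while placeholder recognition results are laid out around it as auxiliary display information in a larger source image; the recognizer, layout and labels are assumptions, and any real text/face/object/scene recognition pipeline would replace the placeholder.

```python
# Hypothetical sketch: build a source image layout from a camera frame (the
# first image) and recognition results placed around it as auxiliary display
# information. The recognize() function is a stand-in, not a real recognizer.
from dataclasses import dataclass
from typing import List


@dataclass
class SourceImageLayout:
    first_image: str              # stands in for the camera frame pixels
    auxiliary_top: List[str]      # recognized labels placed above the frame
    auxiliary_bottom: List[str]   # recognized labels placed below the frame


def recognize(frame: str) -> List[str]:
    # Placeholder for graphic/text, face, object or scene recognition.
    return ["text: 'Coffee'", "scene: street"]


def build_source_image(frame: str) -> SourceImageLayout:
    labels = recognize(frame)
    half = len(labels) // 2
    return SourceImageLayout(first_image=frame,
                             auxiliary_top=labels[:half],
                             auxiliary_bottom=labels[half:])


print(build_source_image("camera_frame_0001"))
```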
  • the first image and auxiliary display information in the illustration are only exemplary; the first image can be set according to actual needs as the image information that the user currently needs to view, and the auxiliary display information can be set according to actual needs.
  • the image display update instruction may be generated according to a trigger signal
  • the trigger signal may be a trigger signal triggered by a user through a trigger module
  • the trigger signal may include at least one of a touch signal, a voice signal, an image signal and an action signal.
  • the trigger module can be a touch sensor for acquiring a touch signal, and the touch signal can be a pressing signal and/or a sliding signal.
  • the trigger module can also be an audio collection sensor, such as a microphone, for collecting voice signals, and the voice signal can be an audio signal collected through the microphone that meets preset conditions.
  • the trigger module can also be an image acquisition sensor, such as a camera, for collecting image signals, and the image signal is an image signal collected by the camera that meets preset conditions.
  • the trigger module can also be a motion collection sensor, such as a posture sensor, and the action signal can be a motion signal collected by the posture sensor that satisfies a preset condition. It can be understood that other types of trigger modules and corresponding trigger signals can also be set according to actual needs. If the above trigger signal is received, an image display update instruction is generated according to the trigger signal, and the posture information of the near-eye display device is acquired.
  • the trigger module can be a touch button 14 arranged on the wearing component 11.
  • the touch button includes a touch sensor.
  • the touch button 14 is used to receive a touch signal triggered by the user.
  • the user can trigger a corresponding trigger signal by pressing, touching, approaching and/or sliding on the touch button.
  • the near-eye display device may include an attitude sensor.
  • the attitude sensor collects attitude information of the near-eye display device. Please refer to FIG. 1 , FIG. 7 and FIG. 8 .
  • FIG. 8 is a second view of a rotating scene of a near-eye display device provided by an embodiment of the present application.
  • the attitude sensor 13 is used to detect the attitude information of the near-eye display device 10 .
  • the attitude sensor 13 may include a gyroscope, an electronic compass, an acceleration sensor and/or a Hall sensor.
  • the attitude sensor 13 can realize 3-degree-of-freedom detection (3DOF) or 6-degree-of-freedom detection (6DOF) of the near-eye display device.
  • taking the case where the near-eye display device realizes 3DOF as an example, the near-eye display device can detect, through the attitude sensor 13, the attitude information of rotation about the first degree of freedom, the second degree of freedom and the third degree of freedom.
  • the attitude information collected by the attitude sensor within the preset time period can be converted to obtain the position change information of the near-eye display device.
  • the position change information can be the position change information of the near-eye display device in space.
  • the visible area changes with the position of the near-eye display device.
  • the position change information can include the position information of the near-eye display device and the position information of the visible area of the near-eye display device, where the position of the visible area is position information in space; when the position of the near-eye display device in space changes, the change is converted into a change of the position of the visible area.
  • the target area is determined from the source image.
  • the target area may include a portion of the first image and auxiliary display information.
  • the above-mentioned projector can project the second image corresponding to the target area to the visible area for display, and the user can view the information of the auxiliary image in the visible area by changing the position of the near-eye display device.
  • the target area can be determined by:
  • the coordinate information of the entire image information of the first image on the source image is obtained, where the coordinate information can be the coordinate information of a target frame generated on the source image according to the size of the visible area; the position change amount of the near-eye display device is calculated according to the posture information, the corresponding position change amount of the visible area is obtained according to the position change amount of the near-eye display device, the coordinate information of the target area is calculated according to the position change amount of the visible area and the coordinate information of the target frame, and the image of the target area is determined according to the coordinate information of the target area.
  • the target area includes a part of the first image in the area of the source image and at least a part of the auxiliary display information in the area of the source image.
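  • a minimal sketch of this target-area computation is given below: a target frame with the size of the visible area starts at the coordinates of the first image in the source image, the posture change is mapped to a displacement of the visible area, and the shifted, clamped frame becomes the target area whose content is the second image; the mapping gain and the image sizes are assumptions for illustration.

```python
# Sketch of the target-frame shift described above. The gain constant, source
# size and sign conventions are assumptions; a real device would calibrate the
# mapping between posture change and visible-area displacement.
from dataclasses import dataclass


@dataclass
class Frame:
    x: int
    y: int
    w: int
    h: int


SOURCE_W, SOURCE_H = 1280, 720      # assumed source-image size
PIXELS_PER_DEGREE = 8.0             # assumed gain: displacement grows with rotation


def target_area(first_image_frame: Frame, yaw_deg: float, pitch_deg: float) -> Frame:
    """Shift the target frame by the visible-area displacement derived from
    the posture change, clamped to the bounds of the source image."""
    dx = int(yaw_deg * PIXELS_PER_DEGREE)     # turn right -> reveal content to the right
    dy = int(-pitch_deg * PIXELS_PER_DEGREE)  # look up -> reveal content above
    x = min(max(first_image_frame.x + dx, 0), SOURCE_W - first_image_frame.w)
    y = min(max(first_image_frame.y + dy, 0), SOURCE_H - first_image_frame.h)
    return Frame(x, y, first_image_frame.w, first_image_frame.h)


if __name__ == "__main__":
    first = Frame(320, 180, 640, 360)
    print(target_area(first, yaw_deg=0.0, pitch_deg=10.0))  # Frame(x=320, y=100, w=640, h=360)
```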
  • the second image belongs to a part of the source image, and the second image is different from the first image.
  • the second image may include the aforementioned auxiliary display information.
  • the area of the second image in the source image overlaps with the area of the first image in the source image.
  • before responding to the image display update instruction, as shown in Figure 9, what the user observes from the near-eye display device is the first image; after responding to the image display update instruction, what the user observes from the near-eye display device is the second image.
  • the second image is part of the source image and is different from the first image; for example, as shown in Figure 10, the position of the second image in the source image is not the same as the position of the first image in the source image, and the region of the second image in the source image intersects the region of the first image in the source image.
  • the first image includes fitness application information
  • the second image includes auxiliary display information. In actual application scenarios, when the user needs to view the power parameters in the auxiliary display information, the corresponding trigger signal can be triggered through the trigger module.
  • the attitude sensor collects the attitude information of the near-eye display device, and displays the battery parameters that the user needs to view according to the attitude information.
  • the second image may include a part of the application information of the first image and the auxiliary display information.
  • the intersection of the area of the first image in the source image and the area of the second image in the source image is a partial area corresponding to the application information.
  • the second image includes a first sub-image area and a second sub-image area
  • the second sub-image area of the second image is disposed adjacent to the first sub-image area
  • the first sub-image area is located within the intersection region of the second image and the first image in the source image.
  • the first sub-image area may be an area including part of the application information
  • the second sub-image area may be an area including auxiliary display information
  • the first sub-image area is located within the intersection area of the second image and the first image in the source image.
  • the first sub-image area may completely overlap with the intersection area, or partially overlap with the intersection area.
  • the second sub-image area of the second image is set adjacent to one edge side of the first image; for example, as shown in FIG. 10, the second sub-image area of the second image is set adjacent to a single side of the first image.
  • the second sub-image area of the second image can also be set adjacent to two adjacent sides of the first image, as shown in Figure 11.
  • the first image and the second image displayed in the viewable area have the same size.
  • for example, the resolution, brightness, transparency and other image parameters of the first image and the second image displayed in the visible area are the same; for another example, the area size of the first image on the source image is the same as the area size of the second image on the source image; for another example, the display positions and/or display sizes of the first image and the second image in the visible area are the same.
  • the second image can also be adjusted according to the user's preference in the display mode in the viewable area, that is, the sizes of the first image and the second image displayed in the viewable area can also be different.
  • Figure 13 is a schematic diagram of the first application scenario of the near-eye display device provided by the embodiment of the application
  • Figure 14 is a schematic diagram of a second application scenario of the near-eye display device provided by the embodiment of the present application.
  • FIG. 15 is a schematic diagram of a third application scenario of a near-eye display device provided by an embodiment of the present application
  • FIG. 16 is a schematic diagram of a fourth application scenario of a near-eye display device provided by an embodiment of the present application
  • Fig. 17 is a schematic diagram of a fifth application scenario of the near-eye display device provided by the embodiment of the present application.
  • the near-eye display device displays a first image, which may include the currently viewed application information, etc., and the first image is a part of the source image.
  • the image can also include auxiliary display information.
  • the auxiliary display information includes battery parameters, time parameters and network parameters adjacent to the top of the first image, and music playback parameters adjacent to the bottom of the first image.
  • the parameters that the user does not need to view in real time are hidden.
  • the user can see all the information of the first image in the visible area 122.
  • the first image can include application information that the user needs to view in real time, such as fitness-application-related information.
  • when the user triggers the relevant trigger signal through the trigger module, the near-eye display device responds to the image display update instruction corresponding to the trigger signal, where the user can trigger the corresponding trigger signal through the trigger module shown in Figure 7.
  • the trigger module can be the touch button 14 arranged on the wearing component 11; the touch button includes a touch sensor, the touch button 14 is used to receive the touch signal triggered by the user, and the user can trigger a pressing signal by pressing the touch button with a finger, so as to trigger the trigger signal corresponding to the image display update instruction.
  • the near-eye display device determines the target area from the source image according to the corresponding posture information.
  • the target area includes the volume parameter of the auxiliary display information and part of the fitness application information of the first image.
  • the image information of the volume parameter and the partial information of the fitness application corresponding to the target area is displayed in the visible area; that is, the volume parameter of the auxiliary display information is displayed in the visible area, and part of the application information related to the fitness application originally displayed in the visible area is hidden.
  • by rotating the near-eye display device, the user can see the volume parameter that pre-exists in the source image but was previously invisible, where the volume parameter can be the volume parameter of the audio file currently played by the near-eye display device or another electronic device connected to the near-eye display device, or the volume parameter when the near-eye display device or another electronic device connected to the near-eye display device is in a call state.
  • the user can also turn the head to the right, as shown in Figure 15. After the user turns the head to the right, the near-eye display device determines the target area from the source image according to the corresponding posture information; the target area includes the brightness parameter of the auxiliary display information and the partial information of the fitness application in the first image. The brightness parameter and the image information of the partial information of the fitness application corresponding to the target area are displayed in the visible area, part of the application information related to the fitness application originally displayed in the visible area is hidden, and the user can view the pre-existing but previously invisible brightness parameter by turning the near-eye display device.
  • the brightness parameter may be a brightness parameter of a display screen in the visible area of the near-eye display device.
  • the user can turn the head upwards, as shown in Figure 16.
  • the near-eye display device determines the target area from the source image according to the corresponding posture information.
  • the target area includes the power parameter, time parameter and network parameter of the auxiliary display information and the partial information of the fitness application in the first image; the power parameter, time parameter and network parameter corresponding to the target area are displayed in the visible area together with the image information of the partial information of the fitness application in the first image, part of the application information related to the fitness application originally displayed in the visible area is hidden, and the user can view the pre-existing but previously invisible power parameter, time parameter and network parameter by turning the near-eye display device.
  • the power parameter may be the power parameter of the near-eye display device
  • the network information may be the network information connected to the near-eye display device, such as signal strength and network type.
  • the near-eye display device determines the target area from the source image according to the corresponding posture information, and the target area includes the audio playback parameters and the notification message of the auxiliary display information and the partial information of the fitness application in the first image.
  • the audio playback parameters, the notification message and the image information of the partial information of the fitness application corresponding to the target area are displayed in the visible area, part of the application information related to the fitness application originally displayed in the visible area is hidden, and the user can view the pre-existing but previously invisible audio playback parameters and notification messages by turning the near-eye display device.
  • the notification information may be an application notification message received by the near-eye display device, such as a notification message of an instant messaging application, a sports application, or a shopping application.
  • the content of the pre-existing auxiliary display information viewed by the user from the visible area differs depending on the rotation amount of the user's head, and the amount of auxiliary display information revealed is positively correlated with the rotation amount.
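  • following the scenarios in Figures 14 to 17, the dispatch below illustrates which auxiliary information could enter the visible area for a given head rotation; only the right (brightness) and upward (power, time and network) directions are stated in the embodiments, so the remaining direction assignments are assumptions.

```python
# Illustrative mapping from the dominant rotation direction to the auxiliary
# panel revealed in the visible area. Direction assignments other than
# "right -> brightness" and "up -> power/time/network" are assumptions.
def revealed_panel(yaw_deg: float, pitch_deg: float) -> str:
    if abs(pitch_deg) >= abs(yaw_deg):
        if pitch_deg > 0:
            return "power, time and network parameters"                   # looking up (Fig. 16)
        return "audio playback parameters and notification messages"      # assumed: looking down
    if yaw_deg > 0:
        return "display brightness parameter"                              # turning right (Fig. 15)
    return "volume parameter"                                              # assumed: turning left (Fig. 14)


print(revealed_panel(yaw_deg=12.0, pitch_deg=2.0))   # display brightness parameter
print(revealed_panel(yaw_deg=1.0, pitch_deg=15.0))   # power, time and network parameters
```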
  • the exit signal can be triggered to generate an image reset instruction.
  • after the near-eye display device responds to the image reset instruction, the near-eye display device switches from displaying the second image to displaying the first image.
  • for example, the power parameter, time parameter, date parameter, network parameter, volume parameter, audio playback parameter, notification message or brightness parameter currently displayed in the visible area is moved out and hidden, and all the information of the first image is displayed in the visible area to restore the original state.
  • the exit signal can be the exit signal triggered after the touch button is reset.
  • when the user needs to view the auxiliary image, the user only needs to press the touch button with a finger and then turn the head to view the auxiliary display information that needs to be viewed; after releasing the finger, the display returns to the initial state, so the user can quickly view the needed information, which improves the efficiency of viewing auxiliary display information.
  • the exit signal can also be other signals.
  • the trigger signal is a touch signal triggered by double-clicking a touch button
  • the exit signal can be an exit signal triggered by single-clicking a touch button.
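  • the press-and-hold interaction described above can be sketched as a small controller: holding the touch button corresponds to the image display update instruction, and releasing it corresponds to the exit signal and image reset instruction; the class and method names are illustrative.

```python
# Sketch of the press-and-hold interaction: while the touch button is held the
# device follows head motion and shows the second image; releasing the finger
# acts as the exit signal and restores the first image.
class AuxiliaryViewController:
    def __init__(self) -> None:
        self.showing_auxiliary = False

    def on_button_down(self) -> None:
        # image display update instruction: start reading the posture sensor
        self.showing_auxiliary = True
        print("showing second image; auxiliary information follows head motion")

    def on_button_up(self) -> None:
        # exit signal -> image reset instruction
        if self.showing_auxiliary:
            self.showing_auxiliary = False
            print("switching back to the first image")


ctrl = AuxiliaryViewController()
ctrl.on_button_down()   # finger presses the touch button on the temple
ctrl.on_button_up()     # finger is released; the display returns to its initial state
```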
  • the information display method provided by the embodiment of the present application pre-generates auxiliary image information that does not need to be viewed by the user in real time around the main image.
  • the visible area only displays the main image that the user needs to view.
  • the auxiliary image is not displayed in front of the user's eyes in real time, which avoids interfering with the user's vision.
  • the display content of the main image in the visible area can also be increased to improve the display effect.
  • when the user wants to view the hidden auxiliary image, the user only needs to touch the temple and then turn the head to see the hidden auxiliary image in the visible area; this not only provides the confirming sense of a physical touch at the operational level, but the action of moving the head to reveal hidden content also carries a corresponding sense of space and vision, which helps reduce the user's cognitive load and the dizziness that may occur when reading near the eyes.
  • the auxiliary image viewing function can be exited, so that the near-eye display device returns to the state of initially displaying the main image, and the efficiency of viewing the hidden auxiliary image can be improved.
  • the near-eye display device may further include a processor and a memory, where the processor is electrically connected to the memory.
  • the processor is the control center of the near-eye display device; it connects the various parts of the entire near-eye display device through various interfaces and lines, and executes the various functions of the near-eye display device and processes data by running or loading computer programs stored in the memory and calling data stored in the memory.
  • the memory can be used to store software programs and modules, and the processor executes various functional applications and data processing by running the computer programs and modules stored in the memory.
  • the memory can mainly include a program storage area and a data storage area, wherein the program storage area can store an operating system and computer programs required by at least one function (such as a sound playback function, an image playback function, etc.), and the data storage area can store data created by the use of the device, etc.
  • the memory may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
  • the memory may also include a memory controller to provide processor access to the memory.
  • the processor in the near-eye display device loads the instructions corresponding to the processes of one or more computer programs into the memory, and runs the computer programs stored in the memory, so as to realize various functions as follows:
  • a first image is displayed, the first image being part of a source image; in response to an image display update instruction, posture information of the near-eye display device is acquired; and a second image is displayed according to the posture information, the second image is part of the source image, and the second image is different from the first image.
  • the region of the second image in the source image intersects the region of the first image in the source image.
  • the second image includes a first sub-image area and a second sub-image area, the second sub-image area of the second image is disposed adjacent to the first sub-image area, and the first sub-image area located within the intersection area of the second image and the first image in the source image.
  • the second sub-image area of the second image is disposed adjacent to a side of an edge of the first image.
  • the near-eye display device includes a visual area, and when the second image is displayed according to the posture information, the processor is configured to perform:
  • a target area is determined from the source image according to the posture information, and a second image corresponding to the target area is displayed in the visible area.
  • the first image and the second image displayed in the viewable area have the same size.
  • the second image includes auxiliary display information
  • the auxiliary display information includes one or more of parameter information of the near-eye display device and/or parameter information of an electronic device connected to the near-eye display device.
  • the parameter information includes one or more combinations of power parameters, network parameters, time parameters, audio playback parameters, display parameters, and notification message parameters.
  • the processor is further configured to:
  • the processor is further configured to execute:
  • a target area is determined from the source image according to the posture information, and a second image corresponding to the target area is displayed in the visible area.
  • the processor is further configured to perform:
  • the near-eye display device switches the displayed second image to display the first image.
  • the trigger signal of the image display update instruction includes at least one of a touch signal, a voice signal, an image signal, and an action signal.
  • the near-eye display device includes a wearing component, the wearing component is provided with a touch module, and the touch module is used to receive the touch signal; when the touch module receives the touch signal of the image display update instruction, the near-eye display device acquires the posture information of the near-eye display device in response to the image display update instruction.
  • the near-eye display device 10 may also include a camera, a microphone, a speaker, LED lights and other devices for function expansion (such as playing songs through the speaker, taking pictures through the camera, or displaying in the visible area interaction data related to the picture currently taken by the camera), which increases the functionality and playability of the near-eye display device 10.
  • the near-eye display device can perform data interaction with other electronic devices (electronic devices such as mobile phones, tablet computers, smart watches, and smart cars), and the near-eye display device 10 realizes corresponding functions according to instructions of other electronic devices.
  • FIG. 18 is a schematic flowchart of a third information display method provided by an embodiment of the present application.
  • FIG. 19 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the embodiment of the present application also provides an information display method, which is applied to an electronic device; the electronic device can be a smart phone, a tablet computer, a PDA (Personal Digital Assistant), a smart watch, a smart bracelet or another electronic device. As shown in Figure 19, the electronic device can be a smart phone, and the information display method includes:
  • the electronic device can store the source image, and the source image can include the image information to be displayed by the near-eye display device.
  • the source image can also be generated in the server and sent to the electronic device through wired or wireless data transmission.
  • the first image may be the image information that the user currently needs to view.
  • the first image may include application information, communication information, audio information, and video information of the near-eye display device or an electronic device connected to the near-eye display device.
  • in addition to the first image, the source image can also include other information, such as auxiliary display information that the user does not need to view in real time.
  • the auxiliary display information can be parameter information of the near-eye display device and/or parameter information of electronic devices connected to the near-eye display device, such as power parameters, network parameters, time parameters, audio playback parameters, display parameters, etc.
  • the position of the auxiliary display information in the source image may be around the position of the first image, such as being set on one side of the edge of the first image, or set around the edge of the first image.
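As a rough illustration of such a layout, the sketch below composes a source image with the first image at the centre and hypothetical auxiliary panels parked along its edges. It is only a sketch: the patent does not prescribe an imaging library, panel sizes, or the names used here (Pillow and the 120-pixel margin are assumptions for illustration).

```python
# Minimal layout sketch (assumption: Pillow is available; panel sizes and the
# margin are illustrative, not taken from the patent).
from PIL import Image

def compose_source_image(first_img: Image.Image, aux_panels: dict, margin: int = 120) -> Image.Image:
    """Place the first image at the centre of a larger source image and park the
    auxiliary display information (battery, time, volume, brightness, ...) around its edges."""
    w, h = first_img.size
    canvas = Image.new("RGBA", (w + 2 * margin, h + 2 * margin), (0, 0, 0, 0))
    canvas.paste(first_img, (margin, margin))      # the region initially shown as the first image

    anchors = {
        "top":    (margin, 0),                     # e.g. power / time / network parameters
        "bottom": (margin, margin + h),            # e.g. audio playback / notification parameters
        "left":   (0, margin),                     # e.g. volume parameter
        "right":  (margin + w, margin),            # e.g. brightness parameter
    }
    for side, panel in aux_panels.items():
        canvas.paste(panel, anchors[side])         # hidden until the user turns toward that side
    return canvas
```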
  • Figure 4 is the first schematic diagram of the source image provided by the embodiment of the present application
  • Figure 5 is the second schematic diagram of the source image provided by the embodiment of the present application
  • Figure 6 is the third schematic diagram of the source image provided by the embodiment of the present application.
  • the first image includes application information related to the fitness application.
  • in the example of Figure 4, the auxiliary display information includes power parameters and time parameters; in the example of Figure 5, the auxiliary display information includes power parameters, time parameters and network parameters.
  • in the example of Figure 6, the auxiliary display information includes power parameters, network parameters, time parameters, audio playback parameters, display parameters, and notification message parameters, where the audio playback parameters can include a volume parameter and a playback status parameter of the current audio application.
  • display parameters may include display brightness parameters.
  • the first image and the auxiliary display information in the illustrations are only exemplary.
  • the first image can also be other image information that the user currently needs to view, set according to actual needs, and the auxiliary display information can also be other information that does not need to be viewed in real time, set according to actual needs.
  • the image display update instruction may be generated according to a trigger signal
  • the trigger signal may be a trigger signal triggered by a user through a trigger module
  • the trigger signal may include at least one of a touch signal, a voice signal, an image signal and an action signal.
  • the touch module can be a touch sensor, and the touch sensor is used to obtain a touch signal, and the touch signal can be a pressing signal and/or a sliding signal.
  • the touch module can also be an audio collection sensor, such as a microphone, for collecting voice signals, and the voice signal can be an audio signal that meets preset conditions collected through the microphone.
  • the touch module can also use an image acquisition sensor, such as a camera, to collect image signals, and the image signals are image signals that meet preset conditions and are collected by the camera.
  • the touch module may be a motion collection sensor, such as a posture sensor, and the motion signal may be a motion signal collected by the posture sensor that satisfies a preset condition. It can be understood that other types of touch modules and corresponding trigger signals can also be set according to actual needs. If such a trigger signal is received, an image display update instruction is generated according to the trigger signal; the electronic device receives the image display update instruction and, in response to it, obtains the attitude information of the near-eye display device, as sketched below.
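A minimal dispatch sketch under assumed names (`TriggerSignal`, `meets_preset_condition` and `controller.handle_image_display_update` are all hypothetical; the patent only requires that a collected signal satisfying a preset condition generates the image display update instruction):

```python
from dataclasses import dataclass
from enum import Enum, auto

class SignalKind(Enum):
    TOUCH = auto()
    VOICE = auto()
    IMAGE = auto()
    MOTION = auto()

@dataclass
class TriggerSignal:
    kind: SignalKind
    payload: dict            # raw press/slide event, audio clip, camera frame, IMU burst, ...

def meets_preset_condition(sig: TriggerSignal) -> bool:
    # Placeholder predicates: e.g. a press on the temple touch key or a matched wake phrase.
    if sig.kind is SignalKind.TOUCH:
        return bool(sig.payload.get("pressed"))
    if sig.kind is SignalKind.VOICE:
        return bool(sig.payload.get("matched_phrase"))
    # IMAGE / MOTION conditions (gesture, nod, ...) would be checked the same way.
    return False

def on_trigger(sig: TriggerSignal, controller) -> None:
    """Generate the image display update instruction when a qualifying signal arrives."""
    if meets_preset_condition(sig):
        controller.handle_image_display_update()   # next step: acquire the posture information
```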
  • the near-eye display device may include an attitude sensor, and the attitude sensor may collect attitude information of the near-eye display device.
  • Fig. 8 is a second view of a rotating scene of the near-eye display device provided by the embodiment of the present application.
  • the attitude sensor 13 is used to detect the attitude information of the near-eye display device 10 .
  • the attitude sensor 13 may include a gyroscope, an electronic compass, an acceleration sensor and/or a Hall sensor.
  • the attitude sensor 13 can realize 3-degree-of-freedom (3DOF) or 6-degree-of-freedom (6DOF) detection of the near-eye display device.
  • taking a near-eye display device that realizes 3DOF as an example, the near-eye display device can detect, through the attitude sensor 13, the attitude information of rotation about the first degree of freedom, the second degree of freedom and the third degree of freedom. After the posture sensor collects the posture information of the near-eye display device, the posture information is sent to the electronic device, so that the electronic device obtains the posture information of the near-eye display device; a sketch of turning such readings into an orientation change follows below.
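One plausible way to turn the 3DOF readings into an orientation change is to integrate gyroscope samples over the preset time period; the sensor type, units and sampling scheme below are assumptions, not something the patent specifies:

```python
def rotation_change(gyro_samples, dt):
    """Integrate angular-velocity samples (rad/s about the three rotational degrees
    of freedom) collected during the preset time period into a total rotation change."""
    yaw = pitch = roll = 0.0
    for wx, wy, wz in gyro_samples:    # one (wx, wy, wz) tuple per sample, spaced dt seconds apart
        pitch += wx * dt
        yaw   += wy * dt
        roll  += wz * dt
    return yaw, pitch, roll
```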
  • the electronic device determines the second image according to the acquired posture information of the near-eye display device, where the second image is a part of the source image and is different from the first image, and sends the second image to the near-eye display device, so that the near-eye display device displays the second image.
  • FIG. 9 is a first display schematic diagram of the near-eye display device provided by the embodiment of the present application
  • FIG. 10 is a second display schematic diagram of the near-eye display device provided by the embodiment of the present application.
  • the second image is part of the source image and is different from the first image; for example, as shown in Figure 10, the position of the second image in the source image is not the same as the position of the first image in the source image, and the region of the second image in the source image intersects the region of the first image in the source image.
  • the first image includes fitness application information
  • the second image includes auxiliary display information.
  • in a practical application scenario, when the user needs to view the power parameter in the auxiliary display information, the corresponding trigger signal can be triggered through the trigger module; the attitude sensor then collects the attitude information of the near-eye display device, and the power parameter that the user needs to view is displayed according to the attitude information.
  • at this time, the second image may include a part of the application information of the first image together with the auxiliary display information, where the intersection of the area of the first image in the source image and the area of the second image in the source image is the partial area corresponding to that application information.
  • the second image includes a first sub-image area and a second sub-image area
  • the second sub-image area of the second image is disposed adjacent to the first sub-image area
  • the first sub-image area is located within the intersection area of the second image and the first image in the source image.
  • the first sub-image area may be an area including part of the application information
  • the second sub-image area may be an area including auxiliary display information
  • the first sub-image area is located within the intersection area of the second image and the first image in the source image.
  • the first sub-image area may completely overlap with the intersection area, or partially overlap with the intersection area.
  • the second sub-image area of the second image is set adjacent to one side of the edge of the first image; as shown in FIG. 10, the second sub-image area of the second image is set adjacent to one side of the first image.
  • in some other embodiments, the second sub-image area of the second image is set adjacent to two adjacent sides of the first image, as shown in Figure 11; a sketch of computing the intersection region and the two sub-image areas follows below.
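The sketch below computes the intersection region and the two sub-image areas for a purely horizontal shift of the viewing window; the `Rect` type and the single-edge assumption are illustrative only (a vertical or two-sided shift would be handled analogously):

```python
from typing import NamedTuple, Optional, Tuple

class Rect(NamedTuple):
    x: int
    y: int
    w: int
    h: int

def intersect(a: Rect, b: Rect) -> Optional[Rect]:
    x0, y0 = max(a.x, b.x), max(a.y, b.y)
    x1, y1 = min(a.x + a.w, b.x + b.w), min(a.y + a.h, b.y + b.h)
    return Rect(x0, y0, x1 - x0, y1 - y0) if x1 > x0 and y1 > y0 else None

def split_second_image(first: Rect, second: Rect) -> Tuple[Optional[Rect], Rect]:
    """First sub-image area = part of the second image inside its intersection with the
    first image; second sub-image area = the adjacent strip carrying the auxiliary info."""
    overlap = intersect(first, second)
    if overlap is None:
        return None, second                                  # no shared content at all
    if second.x + second.w > first.x + first.w:              # window shifted to the right
        edge = first.x + first.w
        aux = Rect(edge, second.y, second.x + second.w - edge, second.h)
    else:                                                    # window shifted to the left
        aux = Rect(second.x, second.y, first.x - second.x, second.h)
    return overlap, aux
```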
  • before displaying the second image according to the posture information, the near-eye display device may not display the first image; for example, when the near-eye display device is in a low energy consumption mode or a low power state, the visible area is in a black-screen state.
  • in that case, the posture information of the near-eye display device is collected by the posture sensor and the second image is displayed according to the posture information, which can still satisfy the user's need to read the parameters of the near-eye display device while it is in a low energy consumption mode or a low power mode, and increases the readability of the auxiliary display information.
  • the information display method provided by the embodiment of the present application displays, in advance via the electronic device, the first image that is part of the source image on the near-eye display device, obtains the posture information of the near-eye display device after responding to the image display update instruction, and displays the second image that is part of the source image according to the posture information.
  • the second image may include some information that does not need to be viewed in real time and is displayed according to the posture information of the near-eye display device when viewing is required, so the display content is highly flexible and the display effect of the near-eye display device is improved; the device-side flow is sketched below.
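The device-side flow can be summarized as the following sketch. All names (`NearEyeDisplayController`, `imu.integrate`, `display.show`, the pixels-per-radian scale) are assumptions for illustration; the patent only fixes the sequence: show the first image, and on an update instruction map the collected posture information to a target area on the source image and show the corresponding second image.

```python
import numpy as np

PIXELS_PER_RADIAN = 600.0   # assumed scale: how far the visible area slides per radian of rotation

def pose_to_pixel_shift(yaw: float, pitch: float):
    return int(yaw * PIXELS_PER_RADIAN), int(-pitch * PIXELS_PER_RADIAN)

def shift_and_clamp(rect, dx, dy, src_w, src_h):
    x, y, w, h = rect
    x = max(0, min(src_w - w, x + dx))   # keep the target area inside the source image
    y = max(0, min(src_h - h, y + dy))
    return (x, y, w, h)

def crop(src: np.ndarray, rect) -> np.ndarray:
    x, y, w, h = rect
    return src[y:y + h, x:x + w]

class NearEyeDisplayController:
    def __init__(self, source_image: np.ndarray, first_rect, display, imu):
        self.src = source_image
        self.first_rect = first_rect                    # coordinates of the first image on the source image
        self.display, self.imu = display, imu
        self.display.show(crop(self.src, first_rect))   # initially show the first image

    def on_image_display_update(self, preset_seconds: float = 0.5):
        yaw, pitch, _ = self.imu.integrate(preset_seconds)       # posture info over the preset period
        dx, dy = pose_to_pixel_shift(yaw, pitch)                 # position change of the visible area
        h, w = self.src.shape[:2]
        target = shift_and_clamp(self.first_rect, dx, dy, w, h)  # target area on the source image
        self.display.show(crop(self.src, target))                # second image, same size as the first

    def on_image_display_reset(self):
        self.display.show(crop(self.src, self.first_rect))       # switch back to the first image
```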
  • determining the second image according to the pose information may include: determining a target area from the source image according to the pose information. Sending the second image to the near-eye display device includes: the near-eye display device includes a visible area, and the second image corresponding to the target area is sent to the near-eye display device, so that the near-eye display device displays the second image corresponding to the target area in the visible area.
  • the first image and the second image displayed in the viewable area have the same size.
  • the second image includes auxiliary display information
  • the auxiliary display information includes one or more of parameter information of the electronic device and/or parameter information of a near-eye display device connected to the electronic device.
  • the parameter information includes one or more combinations of power parameters, network parameters, time parameters, audio playback parameters, display parameters, and notification message parameters.
  • after sending the first image to the near-eye display device, the method may further include: acquiring coordinate information of the first image displayed in the visible area on the source image. Acquiring the posture information of the near-eye display device, determining a second image according to the posture information, and sending the second image to the near-eye display device then includes: acquiring the posture information of the near-eye display device within a preset time period; obtaining position change information of the near-eye display device according to the posture information within the preset time period; determining a target area from the source image according to the position change information and the coordinate information; and sending the second image corresponding to the target area to the near-eye display device.
  • the manner in which the electronic device determines the target area is similar to the manner in which the near-eye display device determines the target area, and details are not repeated here; a phone-side sketch follows below.
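A phone-side sketch of the same computation, reusing the `pose_to_pixel_shift`, `shift_and_clamp` and `crop` helpers from the device-side sketch above; `glasses.read_posture` and `glasses.send_frame` stand in for whatever wired or wireless link actually carries the data and are not names from the patent:

```python
def handle_update_on_phone(glasses, source_image, first_rect, preset_seconds: float = 0.5):
    """Electronic-device side: receive posture information from the near-eye display device,
    determine the target area, and send back only the cropped second image."""
    yaw, pitch, _ = glasses.read_posture(preset_seconds)     # posture info streamed from the glasses
    dx, dy = pose_to_pixel_shift(yaw, pitch)                 # same mapping as on the device side
    h, w = source_image.shape[:2]
    target = shift_and_clamp(first_rect, dx, dy, w, h)
    glasses.send_frame(crop(source_image, target))           # second image, sized like the first
```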
  • after sending the second image to the near-eye display device, the information display method further includes: in response to a received image display reset instruction, sending the first image to the near-eye display device.
  • the manner in which the electronic device responds to the image display reset instruction is similar to that of the near-eye display device in response to the image reset instruction, and will not be repeated here.
  • in the embodiment of the present application, the processor in the electronic device loads the instructions corresponding to the processes of one or more computer programs into the memory according to the following steps, and the processor runs the computer programs stored in the memory to realize various functions, as follows:
  • sending a first image to the near-eye display device, the first image being a part of the source image;
  • in response to an image display update instruction, acquiring the posture information of the near-eye display device;
  • determining a second image according to the posture information, where the second image is a part of the source image and the second image is different from the first image;
  • sending the second image to the near-eye display device.
  • the region of the second image in the source image intersects the region of the first image in the source image.
  • the second image includes a first sub-image area and a second sub-image area, the second sub-image area of the second image is disposed adjacent to the first sub-image area, and the first sub-image area is located within the intersection area of the second image and the first image in the source image.
  • the second sub-image area of the second image is disposed adjacent to one side of the edge of the first image.
  • when the second image is determined according to the posture information, the processor is further configured to perform: determining a target area from the source image according to the posture information.
  • sending the second image to the near-eye display device includes:
  • the near-eye display device includes a visible area, and the second image corresponding to the target area is sent to the near-eye display device, so that the near-eye display device displays the second image corresponding to the target area in the visible area.
  • the first image and the second image displayed in the viewable area have the same size.
  • the second image includes auxiliary display information
  • the auxiliary display information includes one or more of parameter information of the electronic device and/or parameter information of a near-eye display device connected to the electronic device.
  • the parameter information includes one or more combinations of power parameters, network parameters, time parameters, audio playback parameters, display parameters, and notification message parameters.
  • after sending the first image to the near-eye display device, the processor is further configured to: acquire coordinate information of the first image displayed in the visible area on the source image.
  • when acquiring the posture information of the near-eye display device, determining the second image according to the posture information, and sending the second image to the near-eye display device, the processor is further configured to execute: acquiring the posture information of the near-eye display device within a preset time period; obtaining position change information of the near-eye display device according to the posture information within the preset time period; determining a target area from the source image according to the position change information and the coordinate information; and sending the second image corresponding to the target area to the near-eye display device.
  • the processor is further configured to: in response to a received image display reset instruction, send the first image to the near-eye display device for display.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Computer Hardware Design (AREA)
  • Controls And Circuits For Display Device (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An information display method, a near-eye display device, and an electronic device. The method includes: displaying a first image, the first image being a part of a source image; in response to an image display update instruction, acquiring posture information of the near-eye display device; and displaying a second image according to the posture information, the second image being a part of the source image and different from the first image. The second image may include information that does not need to be viewed in real time and is displayed only when viewing is required, so the display content is highly flexible.

Description

信息显示方法、近眼显示设备以及电子设备
本申请要求于2021年09月16日提交中国专利局,申请号为202111088751.X发明名称为“信息显示方法、近眼显示设备以及电子设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及近眼显示设备技术领域,特别涉及一种信息显示方法、近眼显示设备以及电子设备。
背景技术
近眼显示设备例如智能眼镜,作为一种将最新的IT技术与传统眼镜的功能相结合,以其便于携带、易于使用、功能丰富等优点越来越受消费者的欢迎。
发明内容
本申请实施例提供一种信息显示方法、近眼显示设备以及电子设备,可以提高近眼显示设备的显示效果。
本申请实施例提供一种信息显示方法,应用于近眼显示设备,所述信息显示方法包括:
显示第一图像,所述第一图像是源图像的一部分;
响应于图像显示更新指令,获取所述近眼显示设备的姿态信息;
根据所述姿态信息显示第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同。
本申请还提供一种信息显示方法,应用于电子设备,存储有源图像,所述方法包括:
将第一图像发送至近眼显示设备,所述第一图像属于源图像的一部分;
响应于图像显示更新指令,获取所述近眼显示设备的姿态信息;
根据所述姿态信息确定出第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同;
将所述第二图像发送至近眼显示设备。
本申请还提供一种近眼显示设备,所述近眼显示设备用于执行如上所述的信息显示方法。
本申请还提供一种近眼显示设备,所述近眼显示设备包括:
显示装置,用于显示第一图像,所述第一图像是源图像的一部分;
触控模组,用于响应于图像显示更新指令;
姿态传感器,用于获取所述近眼显示设备的姿态信息;
其中,显示装置还用于根据所述姿态信息显示第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同。
本申请还提供一种电子设备,所述电子设备用于执行如上所述的信息显示方法。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍。显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1为本申请实施例提供的近眼显示设备的第一结构示意图。
图2为本申请实施例提供的近眼显示设备的第二结构示意图。
图3为本申请实施例提供的信息显示方法的第一流程示意图。
图4为本申请实施例提供的源图像的第一示意图。
图5为本申请实施例提供的源图像的第二示意图。
图6为本申请实施例提供的源图像的第三示意图。
图7为本申请实施例提供的近眼显示设备的转动场景第一视图。
图8为本申请实施例提供的近眼显示设备的转动场景第二视图。
图9为本申请实施例提供的近眼显示设备的第一显示示意图。
图10为本申请实施例提供的近眼显示设备的第二显示示意图。
图11为本申请实施例提供的近眼显示设备的第三显示示意图
图12为本申请实施例提供的信息显示方法的第二流程示意图。
图13为本申请实施例提供的近眼显示设备的第一应用场景示意图。
图14为本申请实施例提供的近眼显示设备的第二应用场景示意图。
图15为本申请实施例提供的近眼显示设备的第三应用场景示意图。
图16为本申请实施例提供的近眼显示设备的第四应用场景示意图。
图17为本申请实施例提供的近眼显示设备的第五应用场景示意图。
图18为本申请实施例提供的信息显示方法的第三流程示意图。
图19为本申请实施例提供的电子设备的结构示意图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述。显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
本申请实施例提供一种信息显示方法,应用于近眼显示设备,所述方法包括:
显示第一图像,所述第一图像是源图像的一部分;
响应于图像显示更新指令,获取所述近眼显示设备的姿态信息;
根据所述姿态信息显示第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同。
本申请的一种可选实施例中,所述第二图像在所述源图像中的区域与所述第一图像在所述源图像中的区域有交集。
本申请的一种可选实施例中,所述第二图像包括第一子图像区域和第二子图像区域,所述第二图像的第二子图像区域邻近第一子图像区域设置,所述第一子图像区域位于所述第二图像与所述第一图像在所述源图像中的交集区域内。
本申请的一种可选实施例中,所述第二图像的第二子图像区域邻近第一图像边缘一侧设置。
本申请的一种可选实施例中,所述近眼显示设备包括可视区,所述根据所述姿态信息显示第二图像包括:
根据所述姿态信息从所述源图像中确定出目标区域;
在所述可视区显示所述目标区域对应的第二图像。
本申请的一种可选实施例中,在所述可视区显示的第一图像和第二图像的尺寸大小相同。
本申请的一种可选实施例中,所述第二图像包括辅助显示信息,所述辅助显示信息包括所述近眼显示设备的参数信息和/或所述近眼显示设备连接的电子设备的参数信息中的一种或多种。
本申请的一种可选实施例中,所述参数信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数中的一种或者多种组合。
本申请的一种可选实施例中,在显示第一图像之后,还包括:
获取所述可视区显示的第一图像在所述源图像上的坐标信息;
获取所述近眼显示设备的姿态信息,根据所述姿态信息显示所述第二图像步骤具体包括:
获取预设时间段内近眼显示设备的姿态信息;
根据所述预设时间段内近眼显示设备的姿态信息得到近眼显示设备的位置变化信息;
根据所述位置变化信息以及所述坐标信息从源图像中确定出目标区域;
在所述可视区显示所述目标区域对应的第二图像。
本申请的一种可选实施例中,在根据所述姿态信息显示所述第二图像之后,所述信息显示方法还包括:
响应于接收到的图像显示复位指令,所述近眼显示设备将显示的第二图像切换为显示第一图像。
本申请的一种可选实施例中,所述图像显示更新指令的触发信号包括触控信号、语音信号、图像信号以及动作信号中的至少一种。
本申请的一种可选实施例中,所述近眼显示设备包括佩戴组件,所述佩戴组件设置有触控模组,所述触控模组用于接收所述触控信号,所述触控模组接收到图像显示更新指令的触控信号,则所述近眼显示设备响应于所述图像显示更新指令,获取所述近眼显示设备的姿态信息。
本申请实施例还提供一种信息显示方法,应用于电子设备,存储有源图像,所述方法包括:
将第一图像发送至近眼显示设备,所述第一图像属于源图像的一部分;
响应于图像显示更新指令,获取所述近眼显示设备的姿态信息;
根据所述姿态信息确定出第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同;
将所述第二图像发送至近眼显示设备。
本申请的一种可选实施例中,所述第二图像在所述源图像中的区域与所述第一图像在所述源图像中的区域有交集。
本申请的一种可选实施例中,所述第二图像包括第一子图像区域和第二子图像区域,所述第二图像的第二子图像区域邻近第一子图像区域设置,所述第一子图像区域位于所述第二图像与所述第一图像在所述源图像中的交集区域内。
本申请的一种可选实施例中,所述第二图像的第二子图像区域邻近第一图像边缘一侧设置。
本申请的一种可选实施例中,所述根据所述姿态信息确定出第二图像包括:
根据所述姿态信息从所述源图像中确定出目标区域;
将所述第二图像发送至近眼显示设备包括:
所述近眼显示设备包括可视区,将所述目标区域对应的第二图像发送至所述近眼显示设备,以使所述近眼显示设备在所述可视区显示所述目标区域对应的第二图像。
本申请的一种可选实施例中,在所述可视区显示的第一图像和第二图像的尺寸大小相同。
本申请的一种可选实施例中,所述第二图像包括辅助显示信息,所述辅助显示信息包括电子设备的参数信息和/或所述电子设备连接的近眼显示设备的参数信息中的一种或多种。
本申请的一种可选实施例中,所述参数信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数中的一种或者多种组合。
本申请的一种可选实施例中,在将第一图像发送至近眼显示设备之后,还包括:
获取所述可视区显示的第一图像在所述源图像上的坐标信息;
获取所述近眼显示设备的姿态信息,根据所述姿态信息确定出第二图像,将所述第二图像发送至近眼显示设备包括:
获取预设时间段内近眼显示设备的姿态信息;
根据所述预设时间段内近眼显示设备的姿态信息得到近眼显示设备的位置变化信息;
根据所述位置变化信息以及所述坐标信息从所述源图像中确定出目标区域;
将所述目标区域对应的第二图像发送至所述近眼显示设备。
本申请的一种可选实施例中,在将所述第二图像发送至所述近眼显示设备之后,所述信息显示方法还包括:
响应于接收到的图像显示复位指令,将所述第一图像发送至所述近眼显示设备。
本申请实施例还提供一种近眼显示设备,所述近眼显示设备用于执行如上所述的信息显示方法。
本申请实施例还提供一种近眼显示设备,所述近眼显示设备包括:
显示装置,用于显示第一图像,所述第一图像是源图像的一部分;
触控模组,用于响应于图像显示更新指令;
姿态传感器,用于获取所述近眼显示设备的姿态信息;
其中,显示装置还用于根据所述姿态信息显示第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同。
本申请的一种可选实施例中,近眼显示设备还包括:
佩戴组件,用于近眼显示设备的佩戴;
其中,显示装置包括可视区,所述可视区设置于所述佩戴组件,所述可视区用于显示所述第二图像,触发模组包括触控按键,设置于所述佩戴组件,用于接收所述图像显示更新指令的触发信号。
本申请实施例还提供一种电子设备,所述电子设备用于执行如上所述的信息显示方法。
本申请实施例提供一种近眼显示设备,近眼显示设备可以为智能近眼显示设备,诸如智能眼镜、智能头套、智能头盔等智能近眼显示设备,智能眼镜可以为AR(Augmented Reality)眼镜或VR(Virtual Reality)眼镜等,近眼显示设备通常包括一些电子器件诸如电源、显示装置、佩戴组件以及传感器等。近眼显示设备可以根据用户的操作实现预设功能,预设功能比如为通过显示装置显示画面等。
以近眼显示设备为AR智能眼镜示例,如图1所示,图1为本申请实施例提供的近眼显示设备的第一结构示意图。
近眼显示设备10可以包括佩戴组件11以及显示装置12,其中,显示装置12可以包括显示器121和投影机124,显示器121可以为镜片,镜片可以是具有光栅结构的波导镜片,也可以为平光眼镜镜片、太阳镜镜片、处方眼镜镜片、处方太阳镜镜片等其他形式,处方眼镜镜片或处方太阳镜镜片指的是根据处方单(或者说验光单)所配备的眼镜镜片或太阳镜镜片。投影机124可以通过不同的显示方式使显示器121显示图像信息,例如光纤扫描方式(Fiber Scanning Display,FSD)、数字光处理方式(Digital Light Processing,DLP)、激光扫描方式(Laser Beam Scanning,LBS)等,投影机124可以固定设置于佩戴组件11,也可以与佩戴组件11可拆卸连接,在一些实施例中,显示器121和投影机124可以集成为一体,集成一体的显示装置12可以固定设置于佩戴组件11,也可以与佩戴组件11可拆卸连接,用户可直接通过集成一体的显示装置12直接观看到图像信息而无需镜片。
佩戴组件11可以作为近眼显示设备10的框架结构,佩戴组件11还可以包括镜框111和镜腿112,镜框111可以设置有如上所述的显示器121(镜片),以显示图像信息,在一些实施例中,也可以不设置镜片,或者只是设置普通的镜片,通过集成一体的显示装置显示图像信息,用户可以通过镜腿112将显示装置佩戴至用户头部,并直接观看到显示装置所显示的图像信息而无需通过镜片显示。
镜腿112分别连接在镜框111的相对两侧,用户可以通过镜腿112将近眼显示设备10佩戴到用户的头部,在一些实施例中,也可以通过其他佩戴组件将近眼显示设备佩戴到用户的头部,例如具有弹力、磁力或粘接力的连接带或连接扣。
近眼显示设备包括可视区,请参阅图2,图2为本申请实施例提供的近眼显示设备的第二结构示意图。其中,可视区122为用户眼镜可看到显示图像信息的区域,与近眼显示设备的视场角(Field of view,FOV)相关,近眼显示设备的显示装置的FOV越大,用户可以看到的显示范围就越大。目前近眼显示设备如AR眼镜的显示装置的FOV可以为20°、40°、60°、80°等。显示装置的FOV的大小与显示器和投影机的结构有关,如显示器为波导镜片时显示装置的FOV与波导镜片的结构有关,目前,显示装置12用于显示图像信息的可视区面积有限,无法满足用户对于近眼显示设备查看信息的需求,基于此,本申请提供一种信息显示方法,可以提高近眼显示设备可视区的显示效果,请继续参阅图3,图3为本申请实施例提供的信息显示方法的第一流程示意图。应用于近眼显示设备,信息显示方法包括:
201,显示第一图像,所述第一图像是源图像的一部分。
近眼显示设备可以存储有源图像,源图像可以包括显示装置待显示的图像信息,在一些实施例中,源图像也可以在服务器或与近眼显示设备连接的其他电子设备内生成,通过有线或无线传输的数据传输方式发送至近眼显示设备。其他电子设备可以是智能手机、平板电脑、掌上电脑(PDA,Personal Digital Assistant)、智能手表、智能手环等电子设备。
第一图像可以为用户当前需要查看的图像信息,例如,第一图像可以包括近眼显示设备或与近眼显示设备连接的电子设备的应用信息、通讯信息、音频信息以及视频信息等,源图像除了第一图像外还可以包括其他信息,诸如用户无需实时查看的辅助显示信息,辅助显示信息可以是近眼显示设备的参数信息和/或与近眼显示设备连接的电子设备的参数信息,诸如电量参数、网络参数、时间参数、音频播放参数以及显示参数等。辅助显示信息在源图像的位置可以在第一图像的位置周围,如设置在第一图像边缘的一侧,或者围绕第一图像边缘设置。
如图4至图6所示,图4为本申请实施例提供的源图像的第一示意图,图5为本申请实施例提供的源图像的第二示意图,图6为本申请实施例提供的源图像的第三示意图。图4至图6示例中,第一图像包括健身应用相关的应用信息,不同的是,图4示例中,辅助显示信息包括电量参数以及时间参数,图5示例中,辅助显示信息包括电量参数、时间参数以及网络参数,图6示例中,辅助显示信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数,其中,音频播放参数可以包括音量参数以及前当前音频应用播放状态参数,显示参数可以包括显示亮度参数。可以理解的是,图示中的第一图像和辅助显示信息只是示例性,第一图像还可以为根据实际需求设置的用户当前需要查看的图像信息,辅助显示信息还可以为根据实际需求设置用户无需实时查看的信息。
202,响应于图像显示更新指令,获取近眼显示设备的姿态信息。
其中,图像显示更新指令可以为根据触发信号生成的,其中,触发信号可以为用户通过触发模组所触发的触发信号,触发信号可以包括触控信号、语音信号、图像信号和动作信号中的至少一种,其中,触控模组可以为触控传感器,触控传感器用于获取触控信号,触控信号可以为按压信号和/或滑动信号。触控模块还可以为音频采集传感器,例如麦克风,用于采集语音信号,语音信号可以为通过麦克风采集到的满足预设条件的音频信号。触控模组还可以图像采集传感器,例如摄像头,用于采集图像信号,图像信号为通过摄像头采集到的满足预设条件的图像信号。触控模组可以为动作采集传感器,例如姿态传感器,动作信号可以为通过姿态传感器采集到的满足预设条件的动作信号。可以理解的是,还可以根据实际需求设置其他类型的触控模组以及相应的触发信号。若接收上述触发信号,则根据该触发信号生成图像显示更新指令,获取近眼显示设备的姿态信息。
近眼显示设备可以包括姿态传感器,姿态传感器可以采集近眼显示设备的姿态信息,请参阅图1、图7以及图8,图7为本申请实施例提供的近眼显示设备的转动场景第一视图。图8为本申请实施例提供的近眼显示设备的转动场景第二视图。姿态传感器13用于检测近眼显示设备10的姿态信息。其中,姿态传感器13可以包括陀螺仪、电子罗盘、加速度传感器和/或霍尔传感器。姿态传感器13可以实现近眼显示设备的3个自由度检测(3degreeoffreedom,3DOF)或6个自由度检测(6degreeoffreedom,6DOF),以近眼显示设备可实现3DOF示例,近眼显示设备可以通过姿态传感器13检测第一自由度 转动的姿态信息、第二自由度转动的姿态信息以及第三自由度转动的姿态信息。
203,根据姿态信息显示第二图像,第二图像属于源图像的一部分,第二图像与第一图像不同。第二图像可以包括上述辅助显示信息。
在一些实施例中,第二图像在源图像中的区域与第一图像在源图像中的区域有交集。请参阅图9和图10,图9为本申请实施例提供的近眼显示设备的第一显示示意图,图10为本申请实施例提供的近眼显示设备的第二显示示意图。在未响应图像显示更新指令之前,如图9所示,用户可以从近眼显示设备观察到的是第一图像,在响应于图像显示更新指令后,用户可以从近眼显示设备观察到的是第二图像,第二图像属于源图像的一部分,第二图像与第一图像不同,例如,如图10所示,第二图像在源图像中的位置与第一图像在源图像中的位置并不相同,第二图像在源图像中的区域与第一图像在源图像中的区域有交集。第一图像包括了健身应用的信息,第二图像包括了辅助显示信息,在实际应用场景中,当用户需要查看辅助显示信息中的电量参数时,可以通过触发模组触发相应的触发信号,通过姿态传感器采集近眼显示设备的姿态信息,根据姿态信息将用户需要查看的电量参数显示,此时,第二图像可以包括第一图像的一部分的应用信息和辅助显示信息。其中,第一图像在源图像的区域和第二图像在源图像的区域的交集部分为应用信息对应的部分区域。
在一些实施例中,第二图像包括第一子图像区域和第二子图像区域,第二图像的第二子图像区域邻近第一子图像区域设置,第一子图像区域位于第二图像与第一图像在源图像中的交集区域内。如图9和图10所示,第一子图像区域可以为包括部分应用信息的区域,第二子图像区域可以为包括辅助显示信息的区域,第一子图像区域位于第二图像与第一图像在源图像中的交集区域内,当然,第一子图像区域可以该交集区域完全重合,也可以与该交集区域部分重合。
在一些实施例中,第二图像的第二子图像区域邻近第一图像边缘一侧设置,如图10所示,第二图像的第二子图像区域与第一图像的一条侧边相邻设置,当然,在其他一些实施例中,第二图像的第二子图像区域与第一图像相邻的两条侧边相邻设置。如图11所示。
在一些实施例中,在根据姿态信息显示第二图像信息之前,近眼显示设备可以不显示第一图像,例如,当近眼显示设备处于低能耗模式或低电量状态时,可视区为黑屏状态,当用户通过触发信号生成图像显示更新指令时,通过姿态传感器采集近眼显示设备的姿态信息,根据姿态信息将第二图像显示,在处于低能耗模式或低电量模式时还可以满足用户对于近眼显示设备参数的读取,增加辅助显示信息的可读性。
本申请实施例提供的信息显示方法通过预先在近眼显示设备显示属于源图像一部分的第一图像,响应图像显示更新指令后获取近眼显示设备的姿态信息,根据姿态信息将属于源图像一部分的第二图像显示,第二图像可以包括一些无需实时查看的信息,在需要查看时根据近眼显示设备的姿态信息显示,显示内容的灵活度高,提高了近眼显示设备的显示效果。
请继续参阅图12,图12为本申请实施例提供的信息显示方法的第二流程示意图,信息同步方法包括:
301,显示第一图像,第一图像是源图像的一部分。
近眼显示设备可以存储有源图像,源图像可以包括显示装置待显示的图像信息,在一些实施例中,源图像也可以在服务器或与近眼显示设备连接的其他电子设备内生成,通过有线或无线传输的数据传输方式发送至近眼显示设备。其他电子设备可以是智能手机、平板电脑、掌上电脑(PDA,Personal Digital Assistant)、智能手表、智能手环等电子设备。
第一图像可以为用户当前需要查看的图像信息,例如,第一图像可以包括近眼显示设备或与近眼显示设备连接的电子设备的应用信息、通讯信息、音频信息以及视频信息等,源图像除了第一图像外还可以包括其他信息,诸如用户无需实时查看的辅助显示信息,辅助显示信息可以是近眼显示设备的参数信息和/或与近眼显示设备连接的电子设备的参数信息,诸如电量参数、网络参数、时间参数、音频播放参数以及显示参数等。辅助显示信息在源图像的位置可以在第一图像的位置周围,如设置在第一图像边缘的一侧,或者围绕第一图像边缘设置。
如图4至图6所示,图4为本申请实施例提供的源图像的第一示意图,图5为本申请实施例提供的源图像的第二示意图,图6为本申请实施例提供的源图像的第三示意图。图4至图6示例中,第一图像包括健身应用相关的应用信息,不同的是,图4示例中,辅助显示信息包括电量参数以及时间参数,图5示例中,辅助显示信息包括电量参数、时间参数以及网络参数,图6示例中,辅助显示信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数,其中,音频播放参数可以包括音量参数以及前当前音频应用播放状态参数,显示参数可以包括显示亮度参数。可以理解的是,图示中的第一图像和辅助显示信息只是示例性,第一图像还可以为根据实际需求设置的用户当前需要查看的图像信息,辅助显示信息还可以为根据实际需求设置用户无需实时查看的信息。
以源图像如图6示例中的源图像示例,源图像可以包括健身应用的相关信息以及辅助显示信息,辅助显示信息可以包括与健身应用顶部相邻的电量参数、时间参数以及网络参数,与健身应用底部相邻的音乐播放参数以及通知消息参数,与健身应用左侧相邻的音量参数,与健身应用右侧相邻的亮度参数。将用户无需实时查看的参数隐藏,在图像显示更新指令的触发信号没有被触发之前只在可视区中显示用户需要实时查看的健身应用信息。
在一些实施例中,辅助显示信息还可以为与第一图像显示图像相关的图像信息,例如,第一图像包括应用信息,辅助显示信息可以包括与该应用信息相关的信息。例如,第一图像信息包括健身应用的主要信息,辅助显示信息可以为与健身应用相关的信息,示例性的,第一图像包括健身应用的图标、当前健身项目、消耗卡路里以及心率信息等,辅助显示信息可以包括健身时长和当前健身项目的具体信息(当前健身项目为跑步,当前健身项目的具体信息为跑步公里数和步数等),可以理解的是,辅助显示信息可以根据主图像的具体内容生成。
在一些实施例中,辅助显示信息还可以包括当前近眼显示设备采集到的外界图像信号相关的图像信息,近眼显示设备还可以包括摄像头,摄像头用于采集外界图像信息以实现交互功能,用户开启了交互功能后生成的第一图像可以为摄像头采集到的图像信息,辅助显示信息可以为基于第一图像内容识别得到的识别后的信息,示例性的,用户开启近眼显示设备的摄像头实时采集外界的图像信息,对图像信息进行图文识别、人脸识别、物体识别或场景识别等识别方式,得到识别后的信息,将识别后的信息作为辅助显示信息设置于第一图像的周围,当用户需要查看时,可以通过触发图像显示指令的触发信号查看。
可以理解的是,图示中第一图像和辅助显示信息只是示例性,第一图像可以为根据实际需求设置的用户当前需要查看的图像信息,辅助显示信息可以为根据实际需求设置用户无需实时查看的信息。
302,响应于图像显示更新指令,根据姿态信息从源图像中确定出目标区域。
其中,图像显示更新指令可以为根据触发信号生成的,其中,触发信号可以为用户通过触发模组所触发的触发信号,触发信号可以包括触控信号、语音信号、图像信号和动作信号中的至少一种,其中,触控模组可以为触控传感器,触控传感器用于获取触控信号,触控信号可以为按压信号和/或滑动信号。触控模块还可以为音频采集传感器,例如麦克风,用于采集语音信号,语音信号可以为通过麦克风采集到的满足预设条件的音频信号。触控模组还可以图像采集传感器,例如摄像头,用于采集图像信号,图像信号为通过摄像头采集到的满足预设条件的图像信号。触控模组可以为动作采集传感器,例如姿态传感器,动作信号可以为通过姿态传感器采集到的满足预设条件的动作信号。可以理解的是,还可以根据实际需求设置其他类型的触控模组以及相应的触发信号。若接收上述触发信号,则根据该触发信号生成图像显示更新指令,获取近眼显示设备的姿态信息。
如图7所示,触发模组可以为设置于佩戴组件11的触控按键14,触控按键包括触控传感器,触控按键14用于接收用户触发的触控信号,用户可以通过按压、触碰、靠近和/或滑动该触控按键触发相应的触发信号。
可以理解的是,还可以根据实际需求设置其他类型的触控模组以及相应的触发信号。若接收上述触发信号,则生成图像显示更新指令,以使近眼显示设备向应用图像显示更新指令,获取近眼显示设备的姿态信息。
近眼显示设备可以包括姿态传感器,姿态传感器采集近眼显示设备的姿态信息,请参阅图1、图7以及图8,图7为本申请实施例提供的近眼显示设备的转动场景第一视图。图8为本申请实施例提供的近眼显示设备转动场景第二视图。姿态传感器13用于检测近眼显示设备10的姿态信息。其中,姿态传感器13可以包括陀螺仪、电子罗盘、加速度传感器和/或霍尔传感器。姿态传感器13可以实现近眼显示设备的3个自由度检测(3degreeoffreedom,3DOF)或6个自由度检测(6degreeoffreedom,6DOF),以近眼显示设备可实现3DOF示例,近眼显示设备可以通过姿态传感器13检测第一自由度转动的姿态信息、第二自由度转动的姿态信息以及第三自由度转动的姿态信息。
其中,姿态传感器在预设时间段内采集到的姿态信息经过转换可以得到近眼显示设备的位置变化信息,位置变化信息可以为近眼显示设备在空间中的位置变化信息,由于近眼显示设备的可视区随着近眼显示设备的位置的变化而变化,近眼显示设备的位置信息与近眼显示设备的可视区的位置信息之间存在对应的关系,根据近眼显示设备的位置信息可以得到可视区在空间中的位置信息。当近眼显示设备移动时,可视区在空间中的位置发生变化,根据可视区位置变化信息从源图像中确定出目标区域。
例如,用户佩戴近眼显示设备转动头部时,近眼显示设备在空间中的位置发生变化,转化为可视区位置变化量,根据位置变化量从源图像中确定出目标区域,目标区域可以包括第一图像的一部分和辅助显示信息。可通过上述的投影机将目标区域对应的第二图像投影至可视区显示,用户可通过改变近眼显示设备的位置在可视区查看到辅助图像的信息。
在一些实施例中,可通过以下方式确定出目标区域:
获取可视区显示的第一图像在源图像上的坐标信息;
获取预设时间段内近眼显示设备的姿态信息;
根据预设时间段内近眼显示设备的姿态信息得到近眼显示设备的位置变化信息;
根据位置变化信息以及坐标信息从源图像中确定出目标区域;
例如,可视区当前显示的第一图像全部的图像信息,获取第一图像全部的图像信息在源图像上的坐标信息,坐标信息可以为根据可视区大小在源图像上生成的目标框的坐标信息,根据姿态信息计算得到近眼显示设备的位置变化量,根据近眼显示设备的位置变化量得到对应的可视区的位置变化量,根据可视区的位置变化量以及目标框的坐标信息计算得到目标区域的坐标信息,根据目标区域的坐标信息确定出目标区域的图像,目标区域的包括第一图像在源图像的区域的一部分和辅助显示信息在源图像的区域的至少一部分。
303,在可视区显示目标区域对应的第二图像。
其中,第二图像属于源图像的一部分,第二图像与第一图像不同。第二图像可以包括上述辅助显示信息,在一些实施例中,第二图像在源图像中的区域与第一图像在源图像中的区域有交集。请参阅图9和图10,图9为本申请实施例提供的近眼显示设备的第一显示示意图,图10为本申请实施例提供的近眼显示设备的第二显示示意图。在未响应图像显示更新指令之前,如图9所示,用户可以从近眼显示设备观察到的是第一图像,在响应于图像显示更新指令后,用户可以从近眼显示设备观察到的是第二图像,第二图像属于源图像的一部分,第二图像与第一图像不同,例如,如图10所示,第二图像在源图像中的位置与第一图像在源图像中的位置并不相同,第二图像在源图像中的区域与第一图像在源图像中的区域有交集。第一图像包括了健身应用的信息,第二图像包括了辅助显示信息,在实际应用场景中,当用户需要查看辅助显示信息中的电量参数时,可以通过触发模组触发相应的触发信号,通过姿态传感器采集近眼显示设备的姿态信息,根据姿态信息将用户需要查看的电量参数显示,此时,第二图像可以包括第一图像的一部分的应用信息和辅助显示信息。其中,第一图像在源图像的区域和第二图像在源图像的区域的交集部分为应用信息对应的部分区域。
在一些实施例中,第二图像包括第一子图像区域和第二子图像区域,第二图像的第二子图像区域邻近第一子图像区域设置,第一子图像区域位于第二图像与第一图像在源图像中的交集区域内。如图9和图10所示,第一子图像区域可以为包括部分应用信息的区域,第二子图像区域可以为包括辅助显示信息的区域,第一子图像区域位于第二图像与第一图像在源图像中的交集区域内,当然,第一子图像区域可以该交集区域完全重合,也可以与该交集区域部分重合。
在一些实施例中,第二图像的第二子图像区域邻近第一图像边缘一侧设置,如图10所示,第二图像的第二子图像区域与第一图像的一条侧边相邻设置,当然,在其他一些实施例中,第二图像的第二子图像区域与第一图像相邻的两条侧边相邻设置。如图11所示。
在一些实施例中,在可视区显示的第一图像和第二图像的尺寸大小相同。例如,在可视区显示的第一图像和第二图像的分辨率、亮度、透明度等图像参数相同,又例如,在可视区显示的第一图像在源图像上的区域大小与第二图像在源图像上的区域大小相同,又例如,第一图像和第二图像在可视区显示的位置和/或显示大小相同。当然,为了显示的多元化,第二图像还可以根据用户喜好调整在可视区显示的方式,即在可视区显示的第一图像和第二图像的尺寸大小也可以不同。
在实际应用场景中,请继续参阅图1、图7、图13至图17,图13为本申请实施例提供的近眼显示设备的第一应用场景示意图,图14为本申请实施例提供的近眼显示设备的第二应用场景示意图,图15为本申请实施例提供的近眼显示设备的第三应用场景示意图。图16为本申请实施例提供的近眼显示设备的第四应用场景示意图。图17为本申请实施例提供的 近眼显示设备的第五应用场景示意图。
如图13所示,用户佩戴好近眼显示设备且近眼显示设备10启动后,近眼显示设备显示第一图像,第一图像可以包括当前查看的应用信息等,第一图像为源图像的一部分,源图像除了包括第一图像对应的应用信息外,还可以包括辅助显示信息,辅助显示信息包括与第一图像顶部相邻的电量参数、时间参数以及网络参数,与第一图像底部相邻的音乐播放参数以及通知消息参数,与第一图像左侧相邻的音量参数,与第一图像右侧相邻的亮度参数。将用户无需实时查看的参数隐藏,在图像显示更新指令被响应前,用户可在可视区122显示第一图像的全部信息,第一图像可以包括用户需要实时查看的应用信息,如健身应用相关的信息。
当用户通过触发模组触发相关的触发信号后,近眼显示设备响应于触发信号对应的图像更新指令,其中,用户可通过如图7所示的触发方式触发相应的触发信号,具体的,触发模组可以为设置于佩戴组件11的触控按键14,触控按键包括触控传感器,触控按键14用于接收用户触发的触控信号,用户可以通过手指按压该触控按键触发按压信号,以触发图像显示更新指令对应的触发信号。
用户在触发图像显示更新指令对应的触发信号后,用户可以转动头部,以使近眼显示设备10通过姿态传感器采集姿态信息,用户可以向左转动头部,如图14所示,用户向左转动头部后,近眼显示设备根据相应的姿态信息从源图像中确定出目标区域,目标区域包括辅助显示信息的音量参数以及第一图像健身应用的部分信息,将目标区域对应的音量参数以及健身应用的部分信息的图像信息在可视区显示,将辅助显示信息对应的音量参数在可视区显示,将原先在可视区显示的健身应用相关的应用信息的一部分隐藏,用户可以通过转动近眼显示设备查看到预先存在于源图像的但不可见的音量参数,其中,音量参数可以为近眼显示设备或与近眼显示设备连接的其他电子设备当前播放音频文件的音量参数,也可以为近眼显示设备或与近眼显示设备连接的其他电子设备处于通话状态时的音量参数。
用户可以向右转动头部,如图15所示,用户向右转动头部后,近眼显示设备根据相应的姿态信息从源图像中确定出目标区域,目标区域包括辅助显示信息的亮度参数以及第一图像健身应用的部分信息,将目标区域对应的亮度参数以及第一图像健身应用的部分信息的图像信息在可视区显示,将原先在可视区显示的健身应用相关的应用信息的一部分隐藏,用户可以通过转动近眼显示设备查看到预先存在但不可见的亮度参数。其中,亮度参数可以为近眼显示设备可视区显示画面的亮度参数。
用户可以向上转动头部,如图16所示,用户向上转动头部后,近眼显示设备根据相应的姿态信息从源图像中确定出目标区域,目标区域包括辅助显示信息的电量参数、时间参数以及网络参数以及第一图像健身应用的部分信息,将目标区域对应的电量参数、时间参数以及网络参数以第一图像健身应用的部分信息的图像信息在可视区显示,将原先在可视区显示的健身应用相关的应用信息的一部分隐藏,用户可以通过转动近眼显示设备查看到预先存在但不可见的电量参数、时间参数以及网络参数。其中,电量参数可以为近眼显示设备的电量参数,网络信息可以为近眼显示设备连接的网络信息,如信号强度和网络类型。
用户可以向下转动头部,如图17所示,用户向下转动头部后,近眼显示设备根据相应的姿态信息从源图像中确定出目标区域,目标区域包括辅助图像信息的音频播放参数和通知消息以及第一图像健身应用的部分信息,将目标区域对应的音频播放参数和通知消息以及第一图像健身应用的部分信息的图像信息在可视区显示,将原先在可视区显示的健身应用相关的应用信息的一部分隐藏,用户可以通过转动近眼显示设备查看到预先存在但不可见的音频播放参数和通知消息。其中,通知信息可以为近眼显示设备接收到的应用通知消息,例如即时通讯应用、运动应用或购物应用的通知消息。
需要说明的是,用户头部转动的转动量不同从可视区查看到预先存在的辅助图像信息的内容多少也不同,用户从可视区查看到预先存在的辅助图像信息的内容与用户头部转动的转动量正相关。
在一些实施例中,在用户查看需要查看的消息后,可以触发退出信号,以生成图像复位指令,近眼显示设备响应于图像复位指令后,近眼显示设备将显示的第二图像切换为显示第一图像,例如,将在可视区当前显示的电量参数、时间参数、日期参数、网络参数、音量参数、音频的播放参数、通知消息或亮度参数移动隐藏,在可视区显示第一图像的全部信息,以恢复至初始状态。
当触发信号为长按触控按键触发的触发信号时,退出信号可以为触控按键复位后触发的退出信号,用户需要查看辅图像时,只需要将手指按压触控按键,然后转动头部,则可以查看到需要查看的辅助显示信息,松开手指后,则可以恢复至初始状态,用户可以快速查看到需要查看的信息,提高查看辅助显示信息的效率。当然,退出信号还可以为其他的信号,例如当触发信号为双击触控按键触发的触控信号时,退出信号可以为单击触控按键触发的退出信号。
在本申请实施例的描述中,需要理解的是,术语“上”、“下”、“左”、“右”“顶部”“底部”“左侧”“右侧”等指示的方位或位置关系为基于附图所示的方位或位置关系,仅是为了便于描述本发明和简化描述,而不是指示或暗示所指的显示图像信息具有特定的方位、以特定的方位构造和操作,因此不能理解为对本发明的限制。
本申请实施例提供的信息显示方法,将无需用户实时查看的辅图像信息预先生成在主图像周围,在辅图像功能未开启时,可视区只显示用户需要查看的主图像,这样在用户日常使用近眼显示设备时候辅图像不会实时地显示在用户眼前,对用户的视野产生干扰。并且还可以增大可视区主图的显示内容,提高显示效果。在实际操作上,当用户想查看隐藏的辅图像时,只需要触碰镜腿,然后转动头部,从可视区看到隐藏的辅图像,既在操作层面上有物理触碰的确认感,同时移动头部看隐藏内容的动作也会有空间视觉上的对应感,有利于减轻用户的认知负担以及近眼阅读时可能产生的晕眩感。用户松开手指则可退出辅图像查看功能,使近眼显示设备恢复初始显示主图像的状态,提高可查看隐藏的辅图像的效率。本申请实施例提供的近眼显示设备还可以包括处理器和存储介质,其中,处理器与存储器电性连接。处理器是近眼显示设备的控制中心,利用各种接口和线路连接整个近眼显示设备的各个部分,通过运行或加载存储在存储器内的计算机程序,以及调用存储在存储器内的数据,执行近眼显示设备的各种功能并处理数据。
存储器可用于存储软件程序以及模块,处理器通过运行存储在存储器的计算机程序以及模块,从而执行各种功能应用以及数据处理。存储器可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的计算机程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据电子设备的使用所创建的数据等。
此外,存储器可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。相应地,存储器还可以包括存储器控制器,以提供处理器对存储器的访问。
在本申请实施例中,近眼显示设备中的处理器会按照如下的步骤,将一个或一个以上的计算机程序的进程对应的指令加载到存储器中,并由处理器运行存储在存储器中的计算机程序,从而实现各种功能,如下:
显示第一图像,所述第一图像是源图像的一部分;
响应于图像显示更新指令,获取所述近眼显示设备的姿态信息;
根据所述姿态信息显示第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同。
在一些实施例中,所述第二图像在所述源图像中的区域与所述第一图像在所述源图像中的区域有交集。
在一些实施例中,所述第二图像包括第一子图像区域和第二子图像区域,所述第二图像的第二子图像区域邻近第一子图像区域设置,所述第一子图像区域位于所述第二图像与所述第一图像在所述源图像中的交集区域内。
在一些实施例中,所述第二图像的第二子图像区域邻近第一图像边缘一侧设置。
在一些实施例中,所述近眼显示设备包括可视区,在所述根据所述姿态信息显示第二图像时,处理器用于执行:
根据所述姿态信息从所述源图像中确定出目标区域;
在所述可视区显示所述目标区域对应的第二图像。
在一些实施例中,在所述可视区显示的第一图像和第二图像的尺寸大小相同。
在一些实施例中,所述第二图像包括辅助显示信息,所述辅助显示信息包括所述近眼显示设备的参数信息和/或所述近眼显示设备连接的电子设备的参数信息中的一种或多种。
在一些实施例中,所述参数信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数中的一种或者多种组合。
在一些实施例中,在显示第一图像之后,处理器还用于执行:
获取所述可视区显示的第一图像在所述源图像上的坐标信息;
在获取所述近眼显示设备的姿态信息,根据所述姿态信息显示所述第二图像时,处理器还用于执行:
获取预设时间段内近眼显示设备的姿态信息;
根据所述预设时间段内近眼显示设备的姿态信息得到近眼显示设备的位置变化信息;
根据所述位置变化信息以及所述坐标信息从源图像中确定出目标区域;
在所述可视区显示所述目标区域对应的第二图像。
在一些实施例中,在根据所述姿态信息显示所述第二图像之后,处理器还用于执行:
响应于接收到的图像显示复位指令,所述近眼显示设备将显示的第二图像切换为显示第一图像。
在一些实施例中,所述图像显示更新指令的触发信号包括触控信号、语音信号、图像信号以及动作信号中的至少一种。
在一些实施例中,所述近眼显示设备包括佩戴组件,所述佩戴组件设置有触控模组,所述触控模组用于接收所述触控信号,所述触控模组接收到图像显示更新指令的触控信号,则所述近眼显示设备响应于所述图像显示更新指令,获取所述近眼显示设备的姿态信息。
本申请实施例提供的近眼显示设备10还可以包括:摄像头、麦克风、扬声器、LED灯等器件进行功能拓展(诸如通过扬声器播放歌曲,或者通过摄像头进行画面拍摄等,根据摄像头拍摄画面处理器在可视区显示与当前画面相关的交互数据),增加了近眼显示设备10的功能和可玩性。可以理解的是,近眼显示设备可以和其他电子设备(手机、平板电脑、智能手表、智能汽车等电子设部)进行数据交互,近眼显示设备10根据其他电子设备的指令实现相应的功能。
请继续参阅图18和图19,图18为本申请实施例提供的信息显示方法的第三流程示意图。图19为本申请实施例提供的电子设备的结构示意图。
本申请实施例还提供一种信息显示方法,应用于电子设备,电子设备可以是智能手机、平板电脑、掌上电脑(PDA,Personal Digital Assistant)、智能手表、智能手环等电子设备,如图19所示,电子设备可以为智能手机,信息显示方法包括:
401,将第一图像发送至近眼显示设备,第一图像属于源图像的一部分。
电子设备可以存储有源图像,源图像可以包括近眼显示设备待显示的图像信息,在一些实施例中,源图像也可以在服务器内生成,通过有线或无线传输的数据传输方式发送至电子设备。第一图像可以为用户当前需要查看的图像信息,例如,第一图像可以包括近眼显示设备或与近眼显示设备连接的电子设备的应用信息、通讯信息、音频信息以及视频信息等,源图像除了第一图像外还可以包括其他信息,诸如用户无需实时查看的辅助显示信息,辅助显示信息可以是近眼显示设备的参数信息和/或与近眼显示设备连接的电子设备的参数信息,诸如电量参数、网络参数、时间参数、音频播放参数以及显示参数等。辅助显示信息在源图像的位置可以在第一图像的位置周围,如设置在第一图像边缘的一侧,或者围绕第一图像边缘设置。
如图4至图6所示,图4为本申请实施例提供的源图像的第一示意图,图5为本申请实施例提供的源图像的第二示意图,图6为本申请实施例提供的源图像的第三示意图。图4至图6示例中,第一图像包括健身应用相关的应用信息,不同的是,图4示例中,辅助显示信息包括电量参数以及时间参数,图5示例中,辅助显示信息包括电量参数、时间参数以及网络参数,图6示例中,辅助显示信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数,其中,音频播放参数可以包括音量参数以及前当前音频应用播放状态参数,显示参数可以包括显示亮度参数。可以理解的是,图示中的第一图像和辅助显示信息只是示例性,第一图像还可以为根据实际需求设置的用户当前需要查看的图像信息,辅助显示信息还可以为根据实际需求设置用户无需实时查看的信息。
402,响应于图像显示更新指令,获取近眼显示设备的姿态信息。
其中,图像显示更新指令可以为根据触发信号生成的,其中,触发信号可以为用户通过触发模组所触发的触发信号,触发信号可以包括触控信号、语音信号、图像信号和动作信号中的至少一种,其中,触控模组可以为触控传感器,触控传感器用于获取触控信号,触控信号可以为按压信号和/或滑动信号。触控模块还可以为音频采集传感器,例如麦克风,用于采集语音信号,语音信号可以为通过麦克风采集到的满足预设条件的音频信号。触控模组还可以图像采集传感器,例如摄像头,用于采集图像信号,图像信号为通过摄像头采集到的满足预设条件的图像信号。触控模组可以为动作采集传感器,例如姿态传感器,动作信号可以为通过姿态传感器采集到的满足预设条件的动作信号。可以理解的是,还可以根据实际需 求设置其他类型的触控模组以及相应的触发信号。若接收上述触发信号,则根据该触发信号生成图像显示更新指令,电子设备接收图像显示更新指令,并响应于图像更新指令,电子设备获取近眼显示设备的姿态信息。
近眼显示设备可以包括姿态传感器,姿态传感器可以采集近眼显示设备的姿态信息,请参阅图1、图7以及图8,图7为本申请实施例提供的近眼显示设备的转动场景第一视图。图8为本申请实施例提供的近眼显示设备的转动场景第二视图。姿态传感器13用于检测近眼显示设备10的姿态信息。其中,姿态传感器13可以包括陀螺仪、电子罗盘、加速度传感器和/或霍尔传感器。姿态传感器13可以实现近眼显示设备的3个自由度检测(3degreeoffreedom,3DOF)或6个自由度检测(6degreeoffreedom,6DOF),以近眼显示设备可实现3DOF示例,近眼显示设备可以通过姿态传感器13检测第一自由度转动的姿态信息、第二自由度转动的姿态信息以及第三自由度转动的姿态信息。当姿态传感器采集到近眼显示设备的姿态信息后,发送至电子设备,以使电子设备获取近眼显示设备的姿态信息。
403,根据姿态信息确定出第二图像,第二图像属于源图像的一部分,第二图像与第一图像不同。
404,将第二图像发送至近眼显示设备。
关于步骤403~404:
电子设备根据获取到的近眼显示设备的姿态信息确定出第二图像,第二图像属于源图像的一部分,第二图像和第一图像不同,将第二图像发送至近眼显示设备,使近眼显示设备显示第二图像。
在一些实施例中,第二图像在源图像中的区域与第一图像在源图像中的区域有交集。请参阅图9和图10,图9为本申请实施例提供的近眼显示设备的第一显示示意图,图10为本申请实施例提供的近眼显示设备的第二显示示意图。在未响应图像显示更新指令之前,如图9所示,用户可以从近眼显示设备观察到的是第一图像,在响应于图像显示更新指令后,用户可以从近眼显示设备观察到的是第二图像,第二图像属于源图像的一部分,第二图像与第一图像不同,例如,如图10所示,第二图像在源图像中的位置与第一图像在源图像中的位置并不相同,第二图像在源图像中的区域与第一图像在源图像中的区域有交集。第一图像包括了健身应用的信息,第二图像包括了辅助显示信息,在实际应用场景中,当用户需要查看辅助显示信息中的电量参数时,可以通过触发模组触发相应的触发信号,通过姿态传感器采集近眼显示设备的姿态信息,根据姿态信息将用户需要查看的电量参数显示,此时,第二图像可以包括第一图像的一部分的应用信息和辅助显示信息。其中,第一图像在源图像的区域和第二图像在源图像的区域的交集部分为应用信息对应的部分区域。
在一些实施例中,第二图像包括第一子图像区域和第二子图像区域,第二图像的第二子图像区域邻近第一子图像区域设置,第一子图像区域位于第二图像与第一图像在源图像中的交集区域内。如图9和图10所示,第一子图像区域可以为包括部分应用信息的区域,第二子图像区域可以为包括辅助显示信息的区域,第一子图像区域位于第二图像与第一图像在源图像中的交集区域内,当然,第一子图像区域可以该交集区域完全重合,也可以与该交集区域部分重合。
在一些实施例中,第二图像的第二子图像区域邻近第一图像边缘一侧设置,如图10所示,第二图像的第二子图像区域与第一图像的一条侧边相邻设置,当然,在其他一些实施例中,第二图像的第二子图像区域与第一图像相邻的两条侧边相邻设置。如图11所示。
在一些实施例中,在根据姿态信息显示第二图像信息之前,近眼显示设备可以不显示第一图像,例如,当近眼显示设备处于低能耗模式或低电量状态时,可视区为黑屏状态,当用户通过触发信号生成图像显示更新指令时,通过姿态传感器采集近眼显示设备的姿态信息,根据姿态信息将第二图像显示,在处于低能耗模式或低电量模式时还可以满足用户对于近眼显示设备参数的读取,增加辅助显示信息的可读性。
本申请实施例提供的信息显示方法通过电子设备预先在近眼显示设备显示属于源图像一部分的第一图像,响应图像显示更新指令后获取近眼显示设备的姿态信息,根据姿态信息将属于源图像一部分的第二图像显示,第二图像可以包括一些无需实时查看的信息,在需要查看时根据近眼显示设备的姿态信息显示,显示内容的灵活度高,提高了近眼显示设备的显示效果。
在一些实施例中,所述根据所述姿态信息确定出第二图像可以包括:根据所述姿态信息从所述源图像中确定出目标区域;将所述第二图像发送至近眼显示设备包括:所述近眼显示设备包括可视区,将所述目标区域对应的第二图像发送至所述近眼显示设备,以使所述近眼显示设备在所述可视区显示所述目标区域对应的第二图像。
在一些实施例中,在可视区显示的第一图像和第二图像的尺寸大小相同。
在一些实施例中,第二图像包括辅助显示信息,所述辅助显示信息包括电子设备的参数信息和/或所述电子设备连接的近眼显示设备的参数信息中的一种或多种。
在一些实施例中,所述参数信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数中的一种或者多种组合。
在一些实施例中,在将第一图像发送至近眼显示设备之后,还包括:
获取所述可视区显示的第一图像在所述源图像上的坐标信息;
获取所述近眼显示设备的姿态信息,根据所述姿态信息确定出第二图像,将所述第二图像发送至近眼显示设备包括:
获取预设时间段内近眼显示设备的姿态信息;
根据所述预设时间段内近眼显示设备的姿态信息得到近眼显示设备的位置变化信息;
根据所述位置变化信息以及所述坐标信息从所述源图像中确定出目标区域;
将所述目标区域对应的第二图像发送至所述近眼显示设备。
可以理解的是,电子设备确定出目标区域的方式与近眼设备确定出目标区域的方式类似,在此不再赘述。
在一些实施例中,在将所述第二图像发送至所述近眼显示设备之后,所述信息显示方法还包括:
响应于接收到的图像显示复位指令,将所述第一图像发送至所述近眼显示设备。
电子设备响应于图像显示复位指令与近眼显示设备响应于图像复位指令的方式类似,在此不再赘述。
在本申请实施例中,电子设备中的处理器会按照如下的步骤,将一个或一个以上的计算机程序的进程对应的指令加载到存储器中,并由处理器运行存储在存储器中的计算机程序,从而实现各种功能,如下:
将第一图像发送至近眼显示设备,所述第一图像属于源图像的一部分;
响应于图像显示更新指令,获取所述近眼显示设备的姿态信息;
根据所述姿态信息确定出第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同;
将所述第二图像发送至近眼显示设备。
在一些实施例中,所述第二图像在所述源图像中的区域与所述第一图像在所述源图像中的区域有交集。
在一些实施例中,所述第二图像包括第一子图像区域和第二子图像区域,所述第二图像的第二子图像区域邻近第一子图像区域设置,所述第一子图像区域位于所述第二图像与所述第一图像在所述源图像中的交集区域内。
在一些实施例中,所述第二图像的第二子图像区域邻近第一图像边缘一侧设置。
在一些实施例中,在所述根据所述姿态信息确定出第二图像时,处理器还用于执行:
根据所述姿态信息从所述源图像中确定出目标区域;
将所述第二图像发送至近眼显示设备包括:
所述近眼显示设备包括可视区,将所述目标区域对应的第二图像发送至所述近眼显示设备,以使所述近眼显示设备在所述可视区显示所述目标区域对应的第二图像。
在一些实施例中,在所述可视区显示的第一图像和第二图像的尺寸大小相同。
在一些实施例中,所述第二图像包括辅助显示信息,所述辅助显示信息包括电子设备的参数信息和/或所述电子设备连接的近眼显示设备的参数信息中的一种或多种。
在一些实施例中,所述参数信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数中的一种或者多种组合。
在一些实施例中,在将第一图像发送至近眼显示设备之后,处理器还用于执行:获取所述可视区显示的第一图像在所述源图像上的坐标信息;
在获取所述近眼显示设备的姿态信息,根据所述姿态信息确定出第二图像,将所述第二图像发送至近眼显示设备时,处理器还用于执行:
获取预设时间段内近眼显示设备的姿态信息;
根据所述预设时间段内近眼显示设备的姿态信息得到近眼显示设备的位置变化信息;
根据所述位置变化信息以及所述坐标信息从所述源图像中确定出目标区域;
将所述目标区域对应的第二图像发送至所述近眼显示设备。
在一些实施例中,在将所述第二图像发送至所述近眼显示设备之后,处理器还用于执行:响应于接收到的图像显示复位指令,将所述第一图像发送至所述近眼显示设备。
以上对本申请实施例提供的信息显示方法、近眼显示设备以及电子设备进行了详细介绍。本文中应用了具体个例对本申请的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请。同时,对于本领域的技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本申请的限制。

Claims (26)

  1. 一种信息显示方法,应用于近眼显示设备,其中,所述方法包括:
    显示第一图像,所述第一图像是源图像的一部分;
    响应于图像显示更新指令,获取所述近眼显示设备的姿态信息;
    根据所述姿态信息显示第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同。
  2. 根据权利要求1所述的信息显示方法,其中,所述第二图像在所述源图像中的区域与所述第一图像在所述源图像中的区域有交集。
  3. 根据权利要求2所述的信息显示方法,其中,所述第二图像包括第一子图像区域和第二子图像区域,所述第二图像的第二子图像区域邻近第一子图像区域设置,所述第一子图像区域位于所述第二图像与所述第一图像在所述源图像中的交集区域内。
  4. 根据权利要求3所述的信息显示方法,其中,所述第二图像的第二子图像区域邻近第一图像边缘一侧设置。
  5. 根据权利要求1所述的信息显示方法,其中,所述近眼显示设备包括可视区,所述根据所述姿态信息显示第二图像包括:
    根据所述姿态信息从所述源图像中确定出目标区域;
    在所述可视区显示所述目标区域对应的第二图像。
  6. 根据权利要求5所述的信息显示方法,其中,在所述可视区显示的第一图像和第二图像的尺寸大小相同。
  7. 根据权利要求1所述的信息显示方法,其中,所述第二图像包括辅助显示信息,所述辅助显示信息包括所述近眼显示设备的参数信息和/或所述近眼显示设备连接的电子设备的参数信息中的一种或多种。
  8. 根据权利要求7所述的信息显示方法,其中,所述参数信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数中的一种或者多种组合。
  9. 根据权利要求1所述的信息显示方法,其中,在显示第一图像之后,还包括:
    获取所述可视区显示的第一图像在所述源图像上的坐标信息;
    获取所述近眼显示设备的姿态信息,根据所述姿态信息显示所述第二图像步骤具体包括:
    获取预设时间段内近眼显示设备的姿态信息;
    根据所述预设时间段内近眼显示设备的姿态信息得到近眼显示设备的位置变化信息;
    根据所述位置变化信息以及所述坐标信息从源图像中确定出目标区域;
    在所述可视区显示所述目标区域对应的第二图像。
  10. 根据权利要求1-9任一项所述的信息显示方法,其中,在根据所述姿态信息显示所述第二图像之后,所述信息显示方法还包括:
    响应于接收到的图像显示复位指令,所述近眼显示设备将显示的第二图像切换为显示第一图像。
  11. 根据权利要求1-9任一项所述的信息显示方法,其中,所述图像显示更新指令的触发信号包括触控信号、语音信号、图像信号以及动作信号中的至少一种。
  12. 根据权利要求11所述的信息显示方法,其中,所述近眼显示设备包括佩戴组件,所述佩戴组件设置有触控模组,所述触控模组用于接收所述触控信号,所述触控模组接收到图像显示更新指令的触控信号,则所述近眼显示设备响应于所述图像显示更新指令,获取所述近眼显示设备的姿态信息。
  13. 一种信息显示方法,应用于电子设备,存储有源图像,其中,所述方法包括:
    将第一图像发送至近眼显示设备,所述第一图像属于源图像的一部分;
    响应于图像显示更新指令,获取所述近眼显示设备的姿态信息;
    根据所述姿态信息确定出第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同;
    将所述第二图像发送至近眼显示设备。
  14. 根据权利要求13所述的信息显示方法,其中,所述第二图像在所述源图像中的区域与所述第一图像在所述源图像中的区域有交集。
  15. 根据权利要求14所述的信息显示方法,其中,所述第二图像包括第一子图像区域和第二子图像区域,所述第二图像的第二子图像区域邻近第一子图像区域设置,所述第一子图像区域位于所述第二图像与所述第一图像在所述源图像中的交集区域内。
  16. 根据权利要求15所述的信息显示方法,其中,所述第二图像的第二子图像区域邻近第一图像边缘一侧设置。
  17. 根据权利要求13所述的信息显示方法,其中,所述根据所述姿态信息确定出第二图像包括:
    根据所述姿态信息从所述源图像中确定出目标区域;
    将所述第二图像发送至近眼显示设备包括:
    所述近眼显示设备包括可视区,将所述目标区域对应的第二图像发送至所述近眼显示设备,以使所述近眼显示设备在所述可视区显示所述目标区域对应的第二图像。
  18. 根据权利要求17所述的信息显示方法,其中,在所述可视区显示的第一图像和第二图像的尺寸大小相同。
  19. 根据权利要求13所述的信息显示方法,其中,所述第二图像包括辅助显示信息,所述辅助显示信息包括电子设备的参数信息和/或所述电子设备连接的近眼显示设备的参数信息中的一种或多种。
  20. 根据权利要求19所述的信息显示方法,其中,所述参数信息包括电量参数、网络参数、时间参数、音频播放参数、显示参数以及通知消息参数中的一种或者多种组合。
  21. 根据权利要求13所述的信息显示方法,其中,在将第一图像发送至近眼显示设备之后,还包括:
    获取所述可视区显示的第一图像在所述源图像上的坐标信息;
    获取所述近眼显示设备的姿态信息,根据所述姿态信息确定出第二图像,将所述第二图像发送至近眼显示设备包括:
    获取预设时间段内近眼显示设备的姿态信息;
    根据所述预设时间段内近眼显示设备的姿态信息得到近眼显示设备的位置变化信息;
    根据所述位置变化信息以及所述坐标信息从所述源图像中确定出目标区域;
    将所述目标区域对应的第二图像发送至所述近眼显示设备。
  22. 根据权利要求13-21任一项所述的信息显示方法,其中,在将所述第二图像发送至所述近眼显示设备之后,所述信息显示方法还包括:
    响应于接收到的图像显示复位指令,将所述第一图像发送至所述近眼显示设备。
  23. 一种近眼显示设备,其中,所述近眼显示设备用于执行权利要求1-12任一项所述的信息显示方法。
  24. 一种近眼显示设备,其中,所述近眼显示设备包括:
    显示装置,用于显示第一图像,所述第一图像是源图像的一部分;
    触控模组,用于响应于图像显示更新指令;
    姿态传感器,用于获取所述近眼显示设备的姿态信息;
    其中,显示装置还用于根据所述姿态信息显示第二图像,所述第二图像属于源图像的一部分,所述第二图像与所述第一图像不同。
  25. 根据权利要求24所述的近眼显示设备,其中,近眼显示设备还包括:
    佩戴组件,用于近眼显示设备的佩戴;
    其中,显示装置包括可视区,所述可视区设置于所述佩戴组件,所述可视区用于显示所述第二图像,触发模组包括触控按键,设置于所述佩戴组件,用于接收所述图像显示更新指令的触发信号。
  26. 一种电子设备,其中,所述电子设备用于执行权利要求13-22任一项所述的信息显示方法。
PCT/CN2022/113110 2021-09-16 2022-08-17 信息显示方法、近眼显示设备以及电子设备 WO2023040562A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111088751.X 2021-09-16
CN202111088751.XA CN115826734A (zh) 2021-09-16 2021-09-16 信息显示方法、近眼显示设备以及电子设备

Publications (1)

Publication Number Publication Date
WO2023040562A1 true WO2023040562A1 (zh) 2023-03-23

Family

ID=85515116

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/113110 WO2023040562A1 (zh) 2021-09-16 2022-08-17 信息显示方法、近眼显示设备以及电子设备

Country Status (2)

Country Link
CN (1) CN115826734A (zh)
WO (1) WO2023040562A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090309812A1 (en) * 2008-06-11 2009-12-17 Honeywell International Inc. Method and system for operating a near-to-eye display
CN102591016A (zh) * 2010-12-17 2012-07-18 微软公司 用于扩展现实显示的优化聚焦区
CN107533375A (zh) * 2015-06-29 2018-01-02 埃西勒国际通用光学公司 场景图像分析模块
CN108965656A (zh) * 2017-05-25 2018-12-07 佳能株式会社 显示控制设备、显示控制方法和存储介质
WO2019152619A1 (en) * 2018-02-03 2019-08-08 The Johns Hopkins University Blink-based calibration of an optical see-through head-mounted display
CN112882672A (zh) * 2021-02-26 2021-06-01 京东方科技集团股份有限公司 近眼显示控制方法、装置及近眼显示设备

Also Published As

Publication number Publication date
CN115826734A (zh) 2023-03-21

Similar Documents

Publication Publication Date Title
US9804682B2 (en) Systems and methods for performing multi-touch operations on a head-mountable device
CN110830811B (zh) 直播互动方法及装置、系统、终端、存储介质
CN109618212B (zh) 信息显示方法、装置、终端及存储介质
US9007301B1 (en) User interface
US20150009309A1 (en) Optical Frame for Glasses and the Like with Built-In Camera and Special Actuator Feature
KR20150026336A (ko) 웨어러블 디바이스 및 그 컨텐트 출력 방법
KR20160021284A (ko) 가상 오브젝트 방위 및 시각화
CN112181572A (zh) 互动特效展示方法、装置、终端及存储介质
CN110139116B (zh) 直播间切换方法、装置及存储介质
CN110300274B (zh) 视频文件的录制方法、装置及存储介质
WO2022134632A1 (zh) 作品处理方法及装置
WO2021043121A1 (zh) 一种图像换脸的方法、装置、系统、设备和存储介质
CN112835445B (zh) 虚拟现实场景中的交互方法、装置及系统
CN113938748B (zh) 视频播放方法、装置、终端、存储介质及程序产品
JP2022098268A (ja) 情報処理装置及びプログラム
CN111103975B (zh) 显示方法、电子设备及系统
CN111437600A (zh) 剧情展示方法、装置、设备及存储介质
WO2022057644A1 (zh) 设备交互方法、电子设备及交互系统
WO2017022769A1 (ja) ヘッドマウントディスプレイ、表示制御方法及びプログラム
CN112367533B (zh) 交互业务的处理方法、装置、设备及计算机可读存储介质
WO2022236996A1 (zh) 智能眼镜
CN112023403B (zh) 基于图文信息的对战过程展示方法及装置
CN110891181B (zh) 直播画面显示方法、装置、存储介质及终端
WO2023082980A1 (zh) 一种显示方法与电子设备
WO2023040562A1 (zh) 信息显示方法、近眼显示设备以及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22868938

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE