WO2023231749A9 - Interaction method based on light field display device and related equipment - Google Patents

Interaction method based on light field display device and related equipment

Info

Publication number
WO2023231749A9
WO2023231749A9 (PCT/CN2023/094009)
Authority
WO
WIPO (PCT)
Prior art keywords
control
area
detail
operation panel
display
Prior art date
Application number
PCT/CN2023/094009
Other languages
English (en)
French (fr)
Other versions
WO2023231749A1 (Chinese)
Inventor
范清文
陈丽莉
苗京花
王雪丰
Original Assignee
京东方科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司
Publication of WO2023231749A1
Publication of WO2023231749A9

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text

Definitions

  • the present application relates to the field of light field display technology, and in particular to an interaction method and related equipment based on a light field display device.
  • In three-dimensional display, an object appears different from every viewing angle; therefore, three-dimensional display can present better display effects. In the light field display of the related art, however, the interaction with light field display objects (such as light field three-dimensional models) is limited and cannot meet users' interaction needs.
  • the purpose of this application is to propose an interaction method and related equipment based on a light field display device.
  • the present application provides an interaction method based on a light field display device.
  • the display area of the light field display device includes an object display area and an operation panel area.
  • the object display area includes a first object.
  • the operation panel area includes a first control and a second control respectively corresponding to the first object and a second object, and the method includes: in response to a first sliding operation on the operation panel area, switching the display positions of the first control and the second control in the operation panel area; and in response to the second control being at a target position in the operation panel area, switching the first object in the object display area to be displayed as the second object.
  • Embodiments of the present application also provide an interactive device based on a light field display device, including:
  • a display module configured to: in response to a first sliding operation on the operation panel area, switch the display positions of the first control and the second control in the operation panel area; and in response to the second control being at a target position in the operation panel area, switch the first object in the object display area to be displayed as the second object.
  • An embodiment of the present application also provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor. When the processor executes the program, the method as described above is implemented.
  • Embodiments of the present application also provide a non-transitory computer-readable storage medium.
  • the non-transitory computer-readable storage medium stores computer instructions, and the computer instructions are used to cause the computer to execute the method as described above.
  • Embodiments of the present application also provide a computer program product, which includes computer program instructions.
  • When the computer program instructions are run on a computer, they cause the computer to perform the method described above.
  • As can be seen from the above, the interaction method based on the light field display device provided by the present application switches the display positions of the first control and the second control in the operation panel area in response to the first sliding operation on the operation panel area, and, in response to the second control being at a target position in the operation panel area, switches the first object in the object display area to be displayed as the second object. This enables the user, when viewing the first object (that is, the virtual three-dimensional object), to control the display switching between the first object and the second object by controlling the switching between the first control corresponding to the first object and the second control corresponding to the second object. At the same time, by displaying the second control at the target position, it is convenient for the user to quickly locate the currently displayed object, which provides high interaction efficiency, enriches interaction methods, meets users' diverse needs, and improves the user experience.
  • Figure 1 shows a schematic diagram of the hardware structure of an exemplary light field display device provided by an embodiment of the present application
  • Figure 2A shows a schematic diagram of an exemplary operation panel area disposed on the lower side of the object display area according to the embodiment of the present application
  • Figure 2B shows a schematic diagram of an exemplary operation panel area arranged on the upper side of the object display area according to the embodiment of the present application
  • Figure 2C shows a schematic diagram of an exemplary operation panel area arranged on the left side of the object display area according to the embodiment of the present application
  • Figure 2D shows a schematic diagram of an exemplary operation panel area arranged on the right side of the object display area according to the embodiment of the present application
  • Figure 3 shows a schematic flowchart of an exemplary method provided by the embodiment of the present application
  • Figure 4A shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present application
  • Figure 4B shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present application
  • Figure 4C shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present application.
  • Figure 4D shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present application
  • Figure 4E shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present application.
  • Figure 4F shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present application.
  • Figure 4G shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present application.
  • Figure 4H shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present application.
  • Figure 4I shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present application
  • Figure 5 shows a schematic structural diagram of an exemplary interactive device provided by an embodiment of the present application
  • Figure 6 shows a schematic workflow diagram of an exemplary interactive device provided by an embodiment of the present application.
  • Light field collection technology is relatively mature. It mainly provides 3D digital content. It can be promoted and used once collected. It does not require individual consumers to complete it. It is usually completed by a team. Therefore, the hardware cost, size, and power consumption of light field acquisition systems are highly acceptable in the market. In contrast, light field display is a user-oriented product, and individual users have more demands in terms of cost, size, power consumption, comfort, etc.
  • To this end, embodiments of the present application provide an interaction method for light field display, which controls the display state of an object (such as a virtual three-dimensional object) based on touch gesture events input by the user, thereby realizing interaction between the user and the virtual three-dimensional object and displaying controls corresponding to the displayed object, which improves the user experience.
  • FIG. 1 shows a schematic diagram of an exemplary light field display device 100 provided by an embodiment of the present application.
  • the exemplary light field display device 100 may be an electronic device. As shown in FIG. 1 , the light field display device 100 may include: a processor 102 , a memory 104 , a network interface 106 , a peripheral interface 108 and a bus 110 . Among them, the processor 102, the memory 104, the network interface 106 and the peripheral interface 108 implement communication connections between each other within the device through the bus 110.
  • the processor 102 may be a central processing unit (CPU), an image processor, a neural network processor (NPU), a microcontroller (MCU), a programmable logic device, a digital signal processor (DSP), Application Specific Integrated Circuit (ASIC), or one or more integrated circuits.
  • processor 102 may be used to perform functions related to the techniques described in this disclosure.
  • processor 102 may also include multiple processors integrated into a single logical component. As shown in Figure 1, processor 102 may include multiple processors 102a, 102b, and 102c.
  • The bus 110, such as an internal bus (e.g., a processor-memory bus) or an external bus (e.g., a USB port or a PCI-E bus), may be configured to transmit information between the various components of the electronic device 100 (e.g., the processor 102, the memory 104, the network interface 106, and the peripheral interface 108).
  • Memory 104 may be configured to store data (eg, instructions, computer code, etc.). As shown in FIG. 1 , the data stored in the memory may include program instructions (for example, program instructions for implementing the display method of the present disclosure) and data to be processed. The processor 102 can access program instructions and data stored in memory and execute the program instructions to operate on data to be processed. Memory 104 may include volatile storage or non-volatile storage. In some embodiments, memory 104 may include random access memory (RAM), read only memory (ROM), optical disks, magnetic disks, hard drives, solid state drives (SSD), flash memory, memory sticks, and the like.
  • Network interface 106 may be configured to provide communication with other external devices to electronic device 100 via a network.
  • the network can be any wired or wireless network capable of transmitting and receiving data.
  • the network may be a wired network, a local wireless network (eg, Bluetooth, WiFi, Near Field Communication (NFC), etc.), a cellular network, the Internet, or a combination thereof. It is understood that the type of network is not limited to the specific examples above.
  • network interface 106 may include any combination of any number of network interface controllers (NICs), radio frequency modules, transceivers, modems, routers, gateways, adapters, cellular network chips, and the like.
  • the peripheral interface 108 may be configured to connect the electronic device 100 with one or more peripheral devices to implement information input and output.
  • peripheral devices may include input devices such as keyboards, mice, touch pads, touch screens, microphones, and various sensors, as well as output devices such as display screens, speakers, vibrators, and indicator lights.
  • peripheral devices may include touch devices such as touch screens (such as USB touch screens and touchpads) or touch pads (such as game controllers) to implement touch operations on displayed objects.
  • Peripheral devices may also include display screen devices such as LCD screens.
  • the LCD display is mainly composed of a backlight module and an LCD panel. The LCD panel itself does not emit light and relies on the light source provided by the backlight module to achieve brightness display. The LCD panel can be used in passive display mode and can be applied to high PPI display.
  • The light field display device may be, for example, a thin and light near-eye display device such as a virtual reality (VR), augmented reality (AR), or mixed reality (MR) device, a light field display, a vehicle-mounted display device, and the like.
  • 2A to 2D show schematic diagrams of the display area of the liquid crystal display screen of the exemplary light field display device provided by the embodiment of the present application.
  • 2A is a schematic diagram of the operation panel area 130 being arranged on the lower side of the object display area 120;
  • FIG. 2B is a schematic diagram of the operation panel area 130 being arranged on the upper side of the object display area 120;
  • FIG. 2C is a schematic diagram of the operation panel area 130 being arranged on the left side of the object display area 120.
  • FIG. 2D is a schematic diagram of the operation panel area 130 being disposed on the right side of the object display area 120.
  • the display area of the exemplary light field display device may include two areas with different display effects.
  • it may include an object display area. 120 and operation panel area 130.
  • The object display area 120 is provided with a display module that has a lens or similar structure capable of realizing light field display, so that light field display of objects can be realized; that is, the displayed object can be a virtual three-dimensional object.
  • The operation panel area 130 does not need to be provided with a display module having a lens or similar structure capable of realizing light field display; a display module realizing ordinary 2D display is sufficient, and it can display controls of the virtual three-dimensional object (such as object controls and detail controls).
  • the operation panel area 130 may have different placement positions relative to the object display area 120, which may be determined based on display requirements.
  • the operation panel area 130 may be provided on the lower side of the object display area 120 (eg, FIG. 2A ).
  • the operation panel area 130 may also be provided on the upper side of the object display area 120 (for example, FIG. 2B).
  • the operation panel area 130 may also be provided on the left side of the object display area 120 (for example, FIG. 2C).
  • the operation panel area 130 can also be disposed on the right side of the object display area 120 (for example, FIG. 2D).
  • It should be noted that although the above device architecture only shows the processor 102, the memory 104, the network interface 106, the peripheral interface 108, and the bus 110, in a specific implementation the device architecture may also include other components necessary for normal operation.
  • the above device architecture may also include only the components necessary to implement the solutions of the embodiments of the present disclosure, and does not necessarily include all components shown in the figures.
  • the following describes the interaction method based on the light field display device according to the embodiment of the present application, taking the operation panel area 130 disposed below the object display area 120 as an example.
  • Figure 3 shows a schematic flowchart of an exemplary method provided by an embodiment of the present application.
  • the method 200 may be implemented by the light field display device 100 of FIG. 1 .
  • the method 200 may include the following steps.
  • the light field display device may first display a first light field display interface 301.
  • FIG. 4A shows a schematic diagram of an exemplary light field display interface according to an embodiment of the present disclosure.
  • the display device may enter the first light field display interface 301 in any manner.
  • the first light field display interface 301 is directly entered after the light field display device is turned on or through an application program.
  • in the object display area 120 there may be multiple objects for light field display (such as object 1, object 2, object 3, etc.).
  • After startup, the specific object displayed in the first light field display interface 301 may be one of the objects with the highest number of user votes (for example, selected most often by users) determined based on a user questionnaire during product design (for example, object 1), or the object with the largest sum of the number of user votes determined based on the user questionnaire and the number of views in the user's viewing history (for example, object 1).
  • the first light field display interface 301 may be an interface that displays a first object 311 (for example, object 1 ).
  • As shown in FIG. 4A, in the first light field display interface 301, the operation panel area 130 may display multiple controls corresponding one-to-one to the multiple objects. A control can be an object identifier, such as an object serial number or an object name, and the object corresponding to the object identifier is the object itself.
  • For example, the displayed controls may include a first control (e.g., object 1 control 321) corresponding to the first object 311 (e.g., object 1), and a second control (e.g., object 2 control 322a or object 3 control 322b) corresponding to the second object (e.g., object 2 or object 3).
  • In some embodiments, the first control and the second control may both be located in the first operation area 320 and be at different positions in the first operation area 320. The control corresponding to the first object 311 (for example, object 1) displayed in the object display area 120 (for example, the object 1 control 321) may be at the target position 331.
  • In some embodiments, the first operation area 320 may be a pie-shaped area, and the target position 331 may be, for example, the middle position of the pie-shaped area. In this way, it is convenient for the user to intuitively learn, through the operation panel area 130, the name of the object currently being displayed.
  • In some embodiments, the control at the target position 331 may also be highlighted. In this way, it is convenient for the user to quickly identify the name of the object currently being displayed. The emphasis can include highlighted display or enlarged display, etc. For example, as shown in FIG. 4A, the entire object 1 control 321 is highlighted.
  • Then, when the user needs to switch the first object in the object display area 120, the user can perform a trigger operation on the first operation area 330 of the operation panel area 130 through a touch peripheral device (such as a USB touch screen) to switch the control at the target position 331. The trigger operation may be, for example, a sliding operation of an arc gesture, a sliding operation of a pull-up gesture, or the like.
  • 4B to 4C show schematic diagrams of the second light field display interface 302 and the second light field display interface 303 obtained after the first sliding operation is performed on the operation panel area 130 according to an embodiment of the present disclosure.
  • step S202 in response to the first sliding operation on the operation panel area 130 , the display positions of the first control and the second control in the operation panel area 130 are switched. Specifically, the sliding operation in the corresponding direction can be performed according to the relative positional relationship between the second control (for example, the object 2 control 322a or the object 3 control 322b) and the first control (for example, the object 1 control 321). After switching, the second control (for example, the object 2 control 322a or the object 3 control 322b) can be placed at the position of the first control (for example, the object 1 control 321) (that is, the second control is at the target position 331).
  • In some embodiments, the first sliding operation may be a sliding operation of an arc gesture, for example, a sliding operation in the pie-shaped area of the operation panel from the left side of the target position 331 toward the middle (for example, the middle position 325), that is, a sliding operation from left to right. The first sliding operation may also be a sliding operation from the right side toward the middle of, for example, the pie-shaped area, that is, a sliding operation from right to left.
  • For example, as shown in FIG. 4B, the first sliding operation may be a sliding operation from left to right, so that the second control (for example, the object 2 control 322a) can be switched, within the pie-shaped area, to the target position 331 (for example, the middle position of the pie-shaped area). Or, as shown in FIG. 4C, the first sliding operation may be a sliding operation from right to left, so that the second control (such as the object 3 control 322b) can be switched, within the pie-shaped area, to the target position 331 (such as the middle position of the pie-shaped area).
  • In step S204, in response to the second control (such as the object 2 control 322a or the object 3 control 322b) being at the target position 331 in the operation panel area 130, the first object 311 (for example, object 1) in the object display area 120 is switched to be displayed as the second object. For example, as shown in FIG. 4B, the first object 311 (for example, object 1) in the object display area 120 is switched to be displayed as the second object 312a (for example, object 2); or, as shown in FIG. 4C, it is switched to be displayed as the second object 312b (for example, object 3).
  • In this way, when viewing the first object 311 (that is, the virtual three-dimensional object), the user can control the display switching between the first object 311 and the second object (such as object 2 or object 3) by controlling the switching between the first control (such as the object 1 control 321) corresponding to the first object 311 and the second control (such as the object 2 control 322a or the object 3 control 322b) corresponding to the second object. At the same time, by displaying the second control at the target position 331, it is convenient for the user to quickly locate the currently displayed object, which provides high interaction efficiency, enriches interaction methods, meets users' diverse needs, and improves the user experience.
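  • As a concrete illustration of steps S202 and S204, the following minimal Kotlin sketch models the first operation area as a circular list of object controls whose middle slot plays the role of the target position 331. The names (ObjectControl, OperationPanel, SlideDirection, onFirstSlide) are illustrative assumptions and are not part of the patent or of any particular framework.

```kotlin
// Minimal sketch (assumed names): a circular list of object controls whose
// middle slot is treated as the target position 331.
enum class SlideDirection { LEFT_TO_RIGHT, RIGHT_TO_LEFT }

data class ObjectControl(val objectId: String, var highlighted: Boolean = false)

class OperationPanel(
    private val controls: MutableList<ObjectControl>,
    private val targetIndex: Int
) {
    // Notified when the object shown in the object display area should change.
    var onDisplayedObjectChanged: (String) -> Unit = {}

    /** Step S202: rotate the control positions according to the first sliding operation. */
    fun onFirstSlide(direction: SlideDirection) {
        when (direction) {
            // A left-to-right slide moves the control on the left of the target into the target slot.
            SlideDirection.LEFT_TO_RIGHT -> java.util.Collections.rotate(controls, 1)
            // A right-to-left slide moves the control on the right of the target into the target slot.
            SlideDirection.RIGHT_TO_LEFT -> java.util.Collections.rotate(controls, -1)
        }
        // Highlight the control now at the target position.
        controls.forEachIndexed { i, c -> c.highlighted = (i == targetIndex) }
        // Step S204: the control at the target position determines the displayed object.
        onDisplayedObjectChanged(controls[targetIndex].objectId)
    }
}

fun main() {
    val panel = OperationPanel(
        mutableListOf(ObjectControl("object 2"), ObjectControl("object 1"), ObjectControl("object 3")),
        targetIndex = 1
    )
    panel.onDisplayedObjectChanged = { println("Object display area now shows $it") }
    panel.onFirstSlide(SlideDirection.LEFT_TO_RIGHT) // object 2 moves to the target position
}
```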
  • In some embodiments, in response to the second control being at the target position 331 in the operation panel area 130, the second control may be highlighted. For example, as shown in FIG. 4B, the second control (such as the object 2 control 322a) is highlighted so that the entire control is in a highlighted state, with a display color distinguishing it from the first control (such as the object 1 control 321) and other controls (such as the object 3 control 322b). Or, as shown in FIG. 4C, the second control (such as the object 3 control 322b) is highlighted so that the entire control is in a highlighted state, with a display color distinguishing it from the first control (such as the object 1 control 321) and other controls (such as the object 2 control 322a). In this way, it is convenient for the user to quickly identify the name of the object currently being displayed.
  • Figure 4D shows a schematic diagram of an exemplary third light field display interface according to an embodiment of the present application.
  • the controls displayed in the operation panel area 130 may not only include first controls, but may also include third controls for adjusting the first object or the second object displayed in the object display area 120 .
  • For example, as shown in FIG. 4D, the operation panel area 130 may also include at least one third control for adjusting the display state (such as different sizes and different display angles) of the first object (such as object 1) or the second object (such as object 2 or object 3).
  • the adjustment parameter corresponding to the third control may include at least one of a vertical angle, a horizontal angle, and an object size.
  • the first object (eg object 1) or the second object (eg object 2 or object 3) has a display coefficient corresponding to the adjustment parameter of the third control.
  • In some embodiments, for example as shown in FIGS. 4A to 4C, when the first object (for example, object 1) or the second object (for example, object 2 or object 3) is at its maximum display coefficient, it does not exceed the object display area 120. In this way, incomplete display of the first object or the second object after adjustment can be avoided, improving the user's viewing experience.
  • the third control may include a variety of third controls (eg, three types), and each third control has a corresponding adjustment parameter (eg, vertical angle, horizontal angle, or object size).
  • For example, as shown in FIG. 4D, the third controls may include a size control 341a for adjusting the object size of the first object (e.g., object 1) or the second object (e.g., object 2 or object 3), a vertical rotation control 341b for adjusting the vertical angle of the first object or the second object, and a horizontal rotation control 341c for adjusting the horizontal angle of the first object or the second object. In this way, it is easier for the user to determine the refined display state corresponding to the first object or the second object, to better present the first object or the second object, and to improve the user experience.
  • the third control may have multiple presentation forms.
  • the third control may include a control name corresponding to the adjustment parameter and an adjustment bar corresponding to the adjustment parameter.
  • the size control 341a may include a size control name (object size) and an adjustment bar of the corresponding size parameter.
  • the vertical rotation control 341b may include a control name (vertical rotation) and an adjustment bar of the corresponding vertical angle parameter.
  • the horizontal rotation control 341c may include a control name (horizontal rotation) and an adjustment bar of a corresponding horizontal angle parameter. In this way, by setting the control name corresponding to the adjustment parameter and the adjustment bar corresponding to the adjustment parameter, the user can intuitively feel the change process of the specific adjustment parameter.
  • It should be understood that the specific size of the adjustment bar of each third control (such as the adjustment bar of the size control 341a, the adjustment bar of the vertical rotation control 341b, and the adjustment bar of the horizontal rotation control 341c) corresponds respectively to the specific size and/or angle of the object (the first object or the second object) in the object display area. Moreover, when each adjustment bar is dragged to its maximum value, the actual display size of the corresponding object (the first object or the second object) does not exceed the object display area 120.
  • the operation panel area 130 may have a second operation area 340 , and the third control may be disposed in the second operation area 340 . In this way, it is convenient for users to better operate in different operation areas and to display different operation controls in partitions.
  • If the user wishes to view the first object (such as object 1) or the second object (such as object 2 or object 3) from different angles, or to view it at different sizes, the user can perform a trigger operation on the second operation area 340 of the operation panel area 130 through a touch peripheral device (such as a USB touch screen) to adjust the display state of the first object or the second object. The trigger operation can be, for example, a drag operation.
  • In some embodiments, the method may also include, in response to a drag operation on the third control, adjusting the first object or the second object according to the adjustment parameter corresponding to the third control. For example, as shown in FIG. 4E, the drag operation may be to simultaneously shorten the adjustment bar of the size control 341a, the adjustment bar of the vertical rotation control 341b, and the adjustment bar of the horizontal rotation control 341c. The size and angle of the second object (such as object 2) are adjusted according to these adjustment bars, and the adjusted second object (such as object 2, now smaller and with a certain rotation angle) is displayed.
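  • The mapping from the third controls' adjustment bars to the object's display coefficients can be sketched as follows. This is a hedged illustration only: the normalization of bar values to [0, 1], the 360-degree angle range, and the maxScale bound that keeps the object inside the object display area 120 are assumptions, not values given in the patent.

```kotlin
// Sketch of applying the third controls (341a, 341b, 341c) to a displayed object.
// Assumed: bar positions are normalized to [0, 1]; maxScale is chosen so that the
// object at its maximum display coefficient never exceeds the object display area 120.
data class DisplayState(val scale: Float, val verticalAngleDeg: Float, val horizontalAngleDeg: Float)

class ThirdControls(private val maxScale: Float) {
    var sizeBar = 1.0f       // adjustment bar of the size control 341a
    var verticalBar = 0.0f   // adjustment bar of the vertical rotation control 341b
    var horizontalBar = 0.0f // adjustment bar of the horizontal rotation control 341c

    /** Map the current bar positions to the display coefficients of the first or second object. */
    fun toDisplayState() = DisplayState(
        scale = sizeBar.coerceIn(0f, 1f) * maxScale,            // clamped, so it cannot exceed the area
        verticalAngleDeg = verticalBar.coerceIn(0f, 1f) * 360f,
        horizontalAngleDeg = horizontalBar.coerceIn(0f, 1f) * 360f
    )
}

fun main() {
    val controls = ThirdControls(maxScale = 1.0f)
    // A drag operation that shortens all three adjustment bars, as described for FIG. 4E:
    controls.sizeBar = 0.6f
    controls.verticalBar = 0.25f
    controls.horizontalBar = 0.1f
    println(controls.toDisplayState()) // smaller object with certain vertical and horizontal rotation
}
```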
  • Figure 4F shows a schematic diagram of an exemplary fourth light field display interface according to an embodiment of the present application.
  • In some embodiments, for example as shown in FIG. 4F, the object currently displayed in the object display area (for example, object 2) may have at least one detail, such as a first detail 313 (for example, detail 1), a second detail 314a (for example, detail 2), a second detail 314b (for example, detail 3), or a second detail 314c (for example, detail 4). Correspondingly, the control (for example, the second control) may have a first detail control 323 corresponding to the first detail 313 of the object (for example, object 2), and second detail controls 324a, 324b, or 324c corresponding to the second details. It can be understood that an object not currently displayed in the object display area (such as object 1) may also have at least one detail, such as a first detail and a second detail; correspondingly, the control corresponding to that object (such as the first control) may have a first detail control corresponding to its first detail and a second detail control corresponding to its second detail.
  • If the user wishes to view the details of the first object (such as object 1) or the second object (such as object 2 or object 3), the user can perform a trigger operation on the first operation area 330 of the operation panel area 130 through a touch peripheral device (such as a USB touch screen) to display those details. The trigger operation may be, for example, a sliding operation of a pull-up gesture.
  • In some embodiments, the method may further include, in response to a second sliding operation on the first control or the second control at the target position, displaying, in the operation panel area, the first detail control and the second detail control corresponding to the first object or the second object. For example, as shown in FIG. 4F, the user can slide upward on the second control (such as the object 2 control) located in the middle of the pie-shaped area, so that the first detail control 323 (for example, the control for detail 1) and the second detail controls (for example, the controls for detail 2, detail 3, and detail 4) corresponding to the second object (for example, object 2) are displayed above the second control in a drawer form.
  • In some embodiments, the method may also include arranging the first detail control 323 and the second detail control 324a along a first direction according to their arrangement order. The arrangement order may be set according to the viewing frequencies corresponding to the first detail control 323 and the second detail control 324a, for example, from the highest viewing frequency to the lowest. The first direction may be, for example as shown in FIG. 4F, a direction from left to right. In this case, starting from the middle position 325 above the second control, the first detail control 323 (for example, detail 1) and the second detail controls (for example, the second detail control 324a, the second detail control 324b, or the second detail control 324c) can be displayed in their arrangement order from left to right.
  • In some embodiments, the viewing frequency may be the number of user votes for the first detail and the second details determined by a user questionnaire during product design, or the sum of the number of user votes for the first detail or the second detail determined based on the user questionnaire and the number of times the user has viewed the first detail or the second detail in historical viewing.
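  • The arrangement order described above can be computed, for example, by sorting the detail controls by their viewing frequency. The sketch below assumes the frequency is the questionnaire vote count, optionally plus the historical viewing count; the type and function names are illustrative.

```kotlin
// Sketch (assumed names): order detail controls from highest to lowest viewing frequency.
data class DetailControl(val name: String, val votes: Int, val historicalViews: Int = 0)

fun arrangeDetailControls(details: List<DetailControl>, includeHistory: Boolean): List<DetailControl> =
    details.sortedByDescending { d -> if (includeHistory) d.votes + d.historicalViews else d.votes }

fun main() {
    val ordered = arrangeDetailControls(
        listOf(
            DetailControl("detail 2", votes = 40, historicalViews = 5),
            DetailControl("detail 1", votes = 55, historicalViews = 12),
            DetailControl("detail 3", votes = 20),
            DetailControl("detail 4", votes = 10)
        ),
        includeHistory = true
    )
    // Displayed from left to right starting at the middle position 325; the first one is highlighted.
    ordered.forEachIndexed { i, d -> println("${if (i == 0) "[highlighted] " else ""}${d.name}") }
}
```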
  • the method further includes, in response to determining that the first detail control 323 is arranged in front, highlighting the first detail control 323 .
  • Next, while the second sliding operation on the first control or the second control at the target position is in progress (for example, the touch of the sliding operation is still occurring on the screen), the first detail and the second details are displayed in the first object or in the second object in the object display area. For example, as shown in FIG. 4F, the first detail 313 corresponding to the first detail control 323 and the second detail 314a corresponding to the second detail control 324a are displayed in the second object (for example, object 2), for example in a perspective view of the second object.
  • Then, in response to the second sliding operation ending (for example, the touch of the sliding operation has left the screen), the first detail of the first object or the second object is displayed in the object display area. For example, the display in the object display area is switched from showing the first detail 313 and the second details 314a to 314c in the second object in FIG. 4F to that shown in FIG. 4G, where only the first detail 313 (for example, detail 1) corresponding to the first detail control 323 of the second object (such as object 2) is displayed.
  • If the user wishes to view the details of the first object (e.g., object 1) or the second object (e.g., object 2 or object 3) from different angles, or to view those details at different sizes, the user can perform a trigger operation on the second operation area 340 of the operation panel area 130 through a touch peripheral device (such as a USB touch screen) to adjust the display state of the details of the first object or the second object. The trigger operation may be, for example, a drag operation.
  • In some embodiments, the method further includes, in response to the drag operation on the third control, adjusting the first detail of the first object or the second object according to the adjustment parameter corresponding to the third control. For example, as shown in FIG. 4H, the drag operation may be to simultaneously shorten the adjustment bar of the size control 341a and the adjustment bar of the vertical rotation control 341b. The size and display angle of the first detail 313 (for example, detail 1) are adjusted according to these adjustment bars, and the adjusted first detail 313 (for example, detail 1, now smaller and with a certain vertical rotation angle) is displayed.
  • If the user wishes to switch the first detail 313 in the object display area, the user can perform a trigger operation on the first operation area 330 of the operation panel area 130 through a touch peripheral device (such as a USB touch screen) to display the second detail of the first object (such as object 1) or the second object (such as object 2 or object 3). The trigger operation may be, for example, a sliding operation.
  • In some embodiments, the method further includes, in response to a third sliding operation on the operation panel area, highlighting the second detail control, and displaying the second detail of the first object or the second object in the object display area.
  • As shown in FIG. 4I, the third sliding operation can be a sliding operation from right to left above the pie-shaped area in FIG. 4H. In this way, the second detail control 324a (for example, the detail 2 control) can be moved, above the pie-shaped area (for example, above the second control), to the middle position 325 above the second control, and the object display area switches from displaying the first detail 313 (for example, detail 1 in FIG. 4H) of the second object (for example, object 2) to displaying the second detail 314a (for example, detail 2 in FIG. 4I) of the second object.
  • Then, when the user wishes to switch the second detail 314a back to the first detail 313, the third sliding operation may be a sliding operation from left to right. For example, in the pie-shaped operation area in FIG. 4I (for example, above the second control), the first detail control 323 (for example, as shown in FIG. 4I) can be slid back to the middle position 325 above the second control (for example, as shown in FIG. 4H), and the object display area switches from displaying the second detail 314a (for example, detail 2 in FIG. 4I) of the second object (for example, object 2) to displaying the first detail 313 (for example, detail 1 in FIG. 4H) of the second object.
  • If the user wishes to switch the object display area from showing a detail of the first object or the second object back to showing the first object or the second object itself, the user can perform a trigger operation on the first operation area of the operation panel area through a touch peripheral device (such as a USB touch screen) to hide the first detail control and the second detail control in the operation panel area. The trigger operation may be, for example, a fourth sliding operation.
  • In some embodiments, the method further includes, in response to the fourth sliding operation on the operation panel area, hiding the first detail control and the second detail control in the operation panel area. For example, a downward (that is, pull-down) arc gesture can be performed on the detail controls above the second control (for example, the object 2 control) at the middle position 325 in FIG. 4I, so that the first detail control 323 and all the second detail controls corresponding to the second object (for example, object 2) displayed above the second control are hidden, and the object display area switches from displaying detail 2 to displaying object 2 (for example, as shown in FIG. 4E).
  • the method in the embodiment of the present application can be executed by a single device, such as a computer or server.
  • the method of this embodiment can also be applied in a distributed scenario, and is completed by multiple devices cooperating with each other.
  • one of the multiple devices may execute only one or more steps of the method of the embodiments of the present application, and the multiple devices interact with each other to complete the described method.
  • FIG. 5 shows a schematic structural diagram of an exemplary device 400 provided by an embodiment of the present disclosure.
  • the interactive device 400 may include:
  • the display module 402 is configured to: in response to the first sliding operation on the operation panel area, switch the display positions of the first control and the second control in the operation panel area; and in response to the second control being at a target position in the operation panel area, switch the first object in the object display area to be displayed as the second object.
  • the interaction device in the embodiment of the present application may also include a gesture operation module 404 and an arithmetic processing module 406.
  • the specific workflow diagram can be shown in Figure 6.
  • The gesture operation module 404 is mainly responsible for obtaining the user's gesture information as system input, recording the user's gesture information, and feeding it back to the arithmetic processing module 406.
  • For example, the light field display device (electronic device) can detect the starting position of the user's touch on a peripheral device such as a touch screen (for example, a USB touch screen or a touchpad) or a touch pad (for example, a game controller), provide a GestureDetector interface, and recognize the gestures input by the user through this interface, for example, completing the recognition of different gestures through the onTouchEvent(event) method.
  • The arithmetic processing module 406 is mainly responsible for receiving the user gesture information transmitted by the gesture operation module 404, including the gesture type and the gesture starting posture, and controlling the interface display content of the display module 402 by processing this information. For example, the gesture recognition result on the touchpad is mapped to the object interaction relationship in the system to control the object, and the processing result is returned to the display module, which performs the corresponding display object switching based on the processing result. At the same time, the arithmetic processing module also updates the control arrangement direction (that is, the UI layout) of the operation panel area according to the gesture recognition result and returns the processing result to the display module, which switches the display of the operation panel area accordingly.
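  • A minimal Android-style sketch of the gesture operation module is shown below, using the GestureDetector / onTouchEvent(event) mechanism mentioned above. The gesture-type strings and the GestureOperationModule class are illustrative assumptions; exact listener signatures vary slightly between Android SDK versions.

```kotlin
import android.content.Context
import android.view.GestureDetector
import android.view.MotionEvent
import kotlin.math.abs

// Sketch: recognize simple sliding gestures and forward the gesture type plus the
// starting position to the arithmetic processing module (here, an injected callback).
class GestureOperationModule(
    context: Context,
    private val onGesture: (type: String, startX: Float, startY: Float) -> Unit
) {
    private val detector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onFling(e1: MotionEvent?, e2: MotionEvent, vx: Float, vy: Float): Boolean {
            val start = e1 ?: return false
            val dx = e2.x - start.x
            val dy = e2.y - start.y
            val type = when {
                abs(dx) > abs(dy) && dx > 0 -> "slide_left_to_right" // e.g. first sliding operation
                abs(dx) > abs(dy)           -> "slide_right_to_left"
                dy < 0                      -> "pull_up"             // e.g. second sliding operation
                else                        -> "pull_down"           // e.g. fourth sliding operation
            }
            onGesture(type, start.x, start.y) // gesture type and starting posture
            return true
        }
    })

    /** Forward touch events from the touch screen or touchpad peripheral to the detector. */
    fun onTouchEvent(event: MotionEvent): Boolean = detector.onTouchEvent(event)
}
```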
  • the display module 402 is further configured to highlight the second control in response to the second control being at a target position in the operation panel area.
  • the operation panel area includes a first operation area, the first control and the second control are located in the first operation area, and the first sliding operation is for the first operation area. sliding operation; the operation panel area includes a second operation area, and the second operation area includes at least one third control for adjusting the first object or the second object displayed in the object display area.
  • the display module 402 is further configured to: in response to the drag operation for the third control, adjust the first object or the second object according to the adjustment parameter corresponding to the third control.
  • the adjustment parameter includes at least one of a vertical angle, a horizontal angle, and an object size.
  • The display module 402 is further configured to: in response to a second sliding operation on the first control or the second control at the target position, display, in the operation panel area, the first detail control and the second detail control corresponding to the first object or the second object.
  • In some embodiments, when displaying, in the operation panel area, the first detail control and the second detail control corresponding to the first object or the second object, the display module 402 is further configured to:
  • arrange the first detail control and the second detail control along the first direction according to the arrangement order of the first detail control and the second detail control.
  • the display module 402 is further configured to set the arrangement order according to the viewing frequency corresponding to the first detail control and the second detail control.
  • the display module 402 is further configured to: in response to determining that the first detail control is arranged in front, highlight the first detail control;
  • First details of the first object or the second object are displayed in the object display area.
  • the display module 402 is further configured to: highlight the second detail control in response to the third sliding operation on the operation panel area;
  • Second details of the first object or the second object are displayed in the object display area.
  • the display module 402 is further configured to: in response to the fourth sliding operation on the operation panel area, hide the first detail control and the second detail control in the operation panel area.
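  • The three modules above can be wired together roughly as follows. This is only an illustrative sketch of the data flow in FIG. 6; the event and command types are assumptions, and the rotation of the object list stands in for the full control-layout update.

```kotlin
// Sketch (assumed types): gesture events flow from the gesture operation module to the
// arithmetic processing module, which returns display decisions to the display module.
data class GestureEvent(val type: String, val startX: Float, val startY: Float)
data class DisplayCommand(val displayedObjectId: String, val panelLayout: List<String>)

interface DisplayModule { fun render(command: DisplayCommand) }

class ArithmeticProcessingModule(private val display: DisplayModule) {
    private val objects = mutableListOf("object 1", "object 2", "object 3")

    fun process(event: GestureEvent) {
        when (event.type) {
            "slide_left_to_right" -> java.util.Collections.rotate(objects, 1)
            "slide_right_to_left" -> java.util.Collections.rotate(objects, -1)
        }
        // The first entry stands in for the target position; the whole list is the updated UI layout.
        display.render(DisplayCommand(objects.first(), objects.toList()))
    }
}

fun main() {
    val display = object : DisplayModule {
        override fun render(command: DisplayCommand) =
            println("Show ${command.displayedObjectId}; panel order ${command.panelLayout}")
    }
    ArithmeticProcessingModule(display).process(GestureEvent("slide_left_to_right", 120f, 900f))
}
```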
  • the devices of the above embodiments are used to implement the corresponding interaction method 200 based on the light field display device in any of the foregoing embodiments, and have the beneficial effects of the corresponding method embodiments, which will not be described again here.
  • The present application also provides a non-transitory computer-readable storage medium storing computer instructions, where the computer instructions are used to cause a computer to execute the interaction method 200 based on the light field display device as described in any of the above embodiments.
  • the computer-readable media in this embodiment include permanent and non-permanent, removable and non-removable media, and information storage can be implemented by any method or technology.
  • Information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD), magnetic tape cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device.
  • The computer instructions stored in the storage medium of the above embodiments are used to cause the computer to execute the interaction method 200 based on the light field display device as described in any of the above embodiments, and have the beneficial effects of the corresponding method embodiments, which will not be repeated here.
  • this application also provides a computer program product.
  • When the computer program product is run on a computer, the computer is caused to execute the interaction method based on the light field display device as described in any of the above embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interaction method based on a light field display device and related equipment. By switching the display positions of the first control and the second control in the operation panel area in response to a first sliding operation on the operation panel area, and switching the first object in the object display area to be displayed as the second object in response to the second control being at a target position in the operation panel area, the user, when viewing the first object (that is, the virtual three-dimensional object), can control the display switching between the first object and the second object by controlling the switching between the first control corresponding to the first object and the second control corresponding to the second object; at the same time, by displaying the second control at the target position, the user can quickly locate the currently displayed object, which provides high interaction efficiency, enriches interaction methods, meets users' diverse needs, and improves the user experience.

Description

Interaction method based on light field display device and related equipment
Cross-Reference to Related Application
The present disclosure is based on the Chinese patent application with application number 202210629333.5, filed on May 31, 2022, and entitled "Interaction method based on light field display device and related equipment", the entire content of which is incorporated herein by reference.
Technical Field
The present application relates to the field of light field display technology, and in particular to an interaction method based on a light field display device and related equipment.
Background
In three-dimensional display, the displayed pictures of an object at various angles are all different; therefore, three-dimensional display can present better display effects. In the light field display of the related art, the interaction with light field display objects (for example, light field three-dimensional models) is limited and cannot meet users' interaction needs.
Summary
In view of this, the purpose of the present application is to propose an interaction method based on a light field display device and related equipment.
Based on the above purpose, the present application provides an interaction method based on a light field display device. The display area of the light field display device includes an object display area and an operation panel area, the object display area includes a first object, and the operation panel area includes a first control and a second control respectively corresponding to the first object and a second object. The method includes:
in response to a first sliding operation on the operation panel area, switching the display positions of the first control and the second control in the operation panel area; and
in response to the second control being at a target position in the operation panel area, switching the first object in the object display area to be displayed as the second object.
An embodiment of the present application also provides an interaction apparatus based on a light field display device, including:
a display module configured to: in response to a first sliding operation on the operation panel area, switch the display positions of the first control and the second control in the operation panel area; and
in response to the second control being at a target position in the operation panel area, switch the first object in the object display area to be displayed as the second object.
An embodiment of the present application also provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the method described above when executing the program.
An embodiment of the present application also provides a non-transitory computer-readable storage medium storing computer instructions, where the computer instructions are used to cause a computer to execute the method described above.
An embodiment of the present application also provides a computer program product, including computer program instructions, which, when run on a computer, cause the computer to perform the method described in any one of the preceding items.
As can be seen from the above embodiments, the interaction method based on a light field display device provided by the present application switches the display positions of the first control and the second control in the operation panel area in response to a first sliding operation on the operation panel area, and switches the first object in the object display area to be displayed as the second object in response to the second control being at a target position in the operation panel area. This enables the user, when viewing the first object (that is, the virtual three-dimensional object), to control the display switching between the first object and the second object by controlling the switching between the first control corresponding to the first object and the second control corresponding to the second object; at the same time, by displaying the second control at the target position, the user can quickly locate the currently displayed object, which provides high interaction efficiency, enriches interaction methods, meets users' diverse needs, and improves the user experience.
Brief Description of the Drawings
In order to more clearly illustrate the technical solutions in the present application or the related art, the drawings required in the description of the embodiments or the related art are briefly introduced below. Obviously, the drawings in the following description are only embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from these drawings without creative effort.
FIG. 1 is a schematic diagram of the hardware structure of an exemplary light field display device provided by an embodiment of the present application;
FIG. 2A is a schematic diagram of an exemplary operation panel area arranged on the lower side of the object display area according to an embodiment of the present application;
FIG. 2B is a schematic diagram of an exemplary operation panel area arranged on the upper side of the object display area according to an embodiment of the present application;
FIG. 2C is a schematic diagram of an exemplary operation panel area arranged on the left side of the object display area according to an embodiment of the present application;
FIG. 2D is a schematic diagram of an exemplary operation panel area arranged on the right side of the object display area according to an embodiment of the present application;
FIG. 3 is a schematic flowchart of an exemplary method provided by an embodiment of the present application;
FIG. 4A is a schematic diagram of an exemplary light field display interface according to an embodiment of the present application;
FIG. 4B is a schematic diagram of an exemplary light field display interface according to an embodiment of the present application;
FIG. 4C is a schematic diagram of an exemplary light field display interface according to an embodiment of the present application;
FIG. 4D is a schematic diagram of an exemplary light field display interface according to an embodiment of the present application;
FIG. 4E is a schematic diagram of an exemplary light field display interface according to an embodiment of the present application;
FIG. 4F is a schematic diagram of an exemplary light field display interface according to an embodiment of the present application;
FIG. 4G is a schematic diagram of an exemplary light field display interface according to an embodiment of the present application;
FIG. 4H is a schematic diagram of an exemplary light field display interface according to an embodiment of the present application;
FIG. 4I is a schematic diagram of an exemplary light field display interface according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of an exemplary interaction apparatus provided by an embodiment of the present application;
FIG. 6 is a schematic workflow diagram of an exemplary interaction apparatus provided by an embodiment of the present application.
Detailed Description
In order to make the objectives, technical solutions, and advantages of the present application clearer, the present application is further described in detail below with reference to specific embodiments and the accompanying drawings.
It should be noted that, unless otherwise defined, the technical or scientific terms used in the embodiments of the present application shall have the ordinary meanings understood by persons of ordinary skill in the art to which the present application belongs. "First", "second", and similar words used in the embodiments of the present application do not indicate any order, quantity, or importance, but are only used to distinguish different components. Words such as "include" or "comprise" mean that the element or item preceding the word covers the elements or items listed after the word and their equivalents, without excluding other elements or items. Words such as "connected" or "coupled" are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right", and the like are only used to indicate relative positional relationships; when the absolute position of the described object changes, the relative positional relationship may also change accordingly.
Research on light field technology mainly falls into two aspects: light field collection and light field display. Light field collection technology is relatively mature. It mainly provides 3D digital content; once collected, the content can be widely used, and collection is not required to be completed by individual consumers but is usually completed by a team. Therefore, the market has a high degree of acceptance of the hardware cost, size, and power consumption of light field collection systems. In contrast, light field display is a user-oriented product, and individual users have more demands in terms of cost, size, power consumption, comfort, and so on. At present, for light field display products, the interaction between the user and the virtual three-dimensional object (for example, a virtual three-dimensional light field model) cannot display the specific gesture operation in real time, and the light field display interface does not allow the user to obtain clear text information, resulting in a poor user experience that is difficult to meet users' needs.
On this basis, embodiments of the present application provide an interaction method for light field display, which controls the display state of an object (for example, a virtual three-dimensional object) based on touch gesture events input by the user, thereby realizing interaction between the user and the virtual three-dimensional object and displaying controls corresponding to the displayed object, which improves the user experience.
FIG. 1 is a schematic diagram of an exemplary light field display device 100 provided by an embodiment of the present application.
The exemplary light field display device 100 may be an electronic device. As shown in FIG. 1, the light field display device 100 may include a processor 102, a memory 104, a network interface 106, a peripheral interface 108, and a bus 110. The processor 102, the memory 104, the network interface 106, and the peripheral interface 108 communicate with each other within the device through the bus 110.
The processor 102 may be a central processing unit (CPU), an image processor, a neural network processor (NPU), a microcontroller (MCU), a programmable logic device, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or one or more integrated circuits. The processor 102 may be used to perform functions related to the techniques described in this disclosure. In some embodiments, the processor 102 may also include multiple processors integrated as a single logical component. As shown in FIG. 1, the processor 102 may include multiple processors 102a, 102b, and 102c.
The bus 110, such as an internal bus (for example, a processor-memory bus) or an external bus (a USB port, a PCI-E bus), may be configured to transmit information between the components of the electronic device 100 (for example, the processor 102, the memory 104, the network interface 106, and the peripheral interface 108).
The memory 104 may be configured to store data (for example, instructions, computer code, etc.). As shown in FIG. 1, the data stored in the memory may include program instructions (for example, program instructions for implementing the display method of the present disclosure) and data to be processed. The processor 102 can access the program instructions and data stored in the memory and execute the program instructions to operate on the data to be processed. The memory 104 may include a volatile storage device or a non-volatile storage device. In some embodiments, the memory 104 may include random access memory (RAM), read-only memory (ROM), an optical disk, a magnetic disk, a hard disk, a solid-state drive (SSD), flash memory, a memory stick, and the like.
The network interface 106 may be configured to provide the electronic device 100 with communication with other external devices via a network. The network may be any wired or wireless network capable of transmitting and receiving data. For example, the network may be a wired network, a local wireless network (for example, Bluetooth, WiFi, Near Field Communication (NFC), etc.), a cellular network, the Internet, or a combination of the above. It can be understood that the type of the network is not limited to the above specific examples. In some embodiments, the network interface 106 may include any combination of any number of network interface controllers (NICs), radio frequency modules, transceivers, modems, routers, gateways, adapters, cellular network chips, and the like.
The peripheral interface 108 may be configured to connect the electronic device 100 with one or more peripheral devices to implement information input and output. For example, the peripheral devices may include input devices such as a keyboard, a mouse, a touchpad, a touch screen, a microphone, and various sensors, as well as output devices such as a display screen, a speaker, a vibrator, and an indicator light.
In some embodiments, the peripheral devices may include touch devices such as a touch screen (for example, a USB touch screen or a touchpad) or a touch pad (for example, a game controller) to implement touch operations on the displayed object. The peripheral devices may also include display screen devices such as a liquid crystal display screen. The liquid crystal display screen is mainly composed of a backlight module and a liquid crystal display panel. The liquid crystal display panel does not emit light by itself and relies on the light source provided by the backlight module to achieve brightness display. The liquid crystal display panel can be used in a passive display mode and can be applied to high-PPI display. The light field display device may be, for example, a thin and light near-eye display device such as a virtual reality (VR), augmented reality (AR), or mixed reality (MR) device, a light field display, a vehicle-mounted display device, and the like.
FIGS. 2A to 2D are schematic diagrams of the display area of the liquid crystal display screen of the exemplary light field display device provided by an embodiment of the present application. FIG. 2A is a schematic diagram of the operation panel area 130 arranged on the lower side of the object display area 120; FIG. 2B is a schematic diagram of the operation panel area 130 arranged on the upper side of the object display area 120; FIG. 2C is a schematic diagram of the operation panel area 130 arranged on the left side of the object display area 120; and FIG. 2D is a schematic diagram of the operation panel area 130 arranged on the right side of the object display area 120.
In some embodiments, as shown in FIGS. 2A to 2D, the display area of the exemplary light field display device provided by the embodiments of the present application may include two areas with different display effects, for example, the object display area 120 and the operation panel area 130. The object display area 120 is provided with a display module having a lens or similar structure capable of realizing light field display, so that light field display of an object can be realized; that is, the displayed object can be a virtual three-dimensional object. The operation panel area 130 does not need to be provided with a display module having a lens or similar structure capable of realizing light field display; a display module realizing ordinary 2D display is sufficient, and it can display controls of the virtual three-dimensional object (for example, object controls and detail controls).
In some embodiments, the operation panel area 130 may have different placement positions relative to the object display area 120, which may be determined according to display requirements. For example, the operation panel area 130 may be arranged on the lower side of the object display area 120 (for example, FIG. 2A), on the upper side of the object display area 120 (for example, FIG. 2B), on the left side of the object display area 120 (for example, FIG. 2C), or on the right side of the object display area 120 (for example, FIG. 2D).
It should be noted that, although the above device architecture only shows the processor 102, the memory 104, the network interface 106, the peripheral interface 108, and the bus 110, in a specific implementation process the device architecture may also include other components necessary for normal operation. In addition, those skilled in the art can understand that the above device architecture may also include only the components necessary to implement the solutions of the embodiments of the present disclosure, and does not necessarily include all the components shown in the figures.
The interaction method based on the light field display device according to the embodiments of the present application is described below by taking the operation panel area 130 arranged on the lower side of the object display area 120 as an example.
FIG. 3 is a schematic flowchart of an exemplary method provided by an embodiment of the present application. The method 200 may be implemented by the light field display device 100 of FIG. 1. Referring to FIG. 3, the method 200 may include the following steps.
In an initial state, the light field display device may first display a first light field display interface 301. FIG. 4A is a schematic diagram of an exemplary light field display interface according to an embodiment of the present disclosure. The display device may enter the first light field display interface 301 in any manner, for example, directly after the light field display device is turned on or through an application program. In some embodiments, in the object display area 120, there may be multiple objects for light field display (for example, object 1, object 2, and object 3). After startup, the specific object displayed in the first light field display interface 301 may be one of the objects with the highest number of user votes (for example, selected most often by users) determined based on a user questionnaire during product design (for example, object 1), or the object with the largest sum of the number of user votes determined based on the user questionnaire and the number of views in the user's viewing history (for example, object 1). As shown in FIG. 4A, the first light field display interface 301 may be an interface that displays a first object 311 (for example, object 1).
As shown in FIG. 4A, in the first light field display interface 301, the operation panel area 130 may display multiple controls corresponding one-to-one to the multiple objects. A control may be an object identifier, such as an object serial number or an object name; the object corresponding to the object identifier is the object itself. For example, the displayed controls may include a first control (for example, object 1 control 321) corresponding to the first object 311 (for example, object 1), and a second control (for example, object 2 control 322a or object 3 control 322b) corresponding to a second object (for example, object 2 or object 3). In some embodiments, both the first control and the second control may be located in the first operation area 320 and be at different positions in the first operation area 320. The control (for example, the object 1 control 321) corresponding to the first object 311 (for example, object 1) displayed in the object display area 120 may be at the target position 331. In some embodiments, the first operation area 320 may be a pie-shaped area, and the target position 331 may be, for example, the middle position of the pie-shaped area. In this way, the user can intuitively learn, through the operation panel area 130, the name of the object currently being displayed.
In some embodiments, the control at the target position 331 may also be highlighted. In this way, the user can quickly identify the name of the object currently being displayed. The emphasis may include highlighted display or enlarged display, etc. For example, as shown in FIG. 4A, the entire object 1 control 321 is highlighted.
Then, when the user needs to switch the first object in the object display area 120, the user can perform a trigger operation on the first operation area 330 of the operation panel area 130 through a touch peripheral device (for example, a USB touch screen) to switch the control at the target position 331. The trigger operation may be, for example, a sliding operation of an arc gesture, a sliding operation of a pull-up gesture, or the like.
FIGS. 4B and 4C are schematic diagrams of the second light field display interface 302 and the second light field display interface 303 obtained after the first sliding operation is performed on the operation panel area 130 according to an embodiment of the present disclosure.
In step S202, in response to the first sliding operation on the operation panel area 130, the display positions of the first control and the second control in the operation panel area 130 are switched. Specifically, a sliding operation in the corresponding direction can be performed according to the relative positional relationship between the second control (for example, the object 2 control 322a or the object 3 control 322b) and the first control (for example, the object 1 control 321). After switching, the second control (for example, the object 2 control 322a or the object 3 control 322b) can be at the position of the first control (for example, the object 1 control 321), that is, the second control is at the target position 331.
In some embodiments, the first sliding operation may be a sliding operation of an arc gesture, for example, a sliding operation in the pie-shaped area of the operation panel from the left side of the target position 331 toward the middle (for example, the middle position 325), that is, a sliding operation from left to right. The first sliding operation may also be a sliding operation from the right side toward the middle of, for example, the pie-shaped area, that is, a sliding operation from right to left. For example, as shown in FIG. 4B, the first sliding operation may be a sliding operation from left to right, so that the second control (for example, the object 2 control 322a) can be switched, within the pie-shaped area, to the target position 331 (for example, the middle position of the pie-shaped area). Or, as shown in FIG. 4C, the first sliding operation may be a sliding operation from right to left, so that the second control (for example, the object 3 control 322b) can be switched, within the pie-shaped area, to the target position 331 (for example, the middle position of the pie-shaped area).
In step S204, in response to the second control (for example, the object 2 control 322a or the object 3 control 322b) being at the target position 331 in the operation panel area 130, the first object 311 (for example, object 1) in the object display area 120 is switched to be displayed as the second object. For example, as shown in FIG. 4B, the first object 311 (for example, object 1) in the object display area 120 is switched to be displayed as the second object 312a (for example, object 2); or, as shown in FIG. 4C, the first object 311 (for example, object 1) in the object display area 120 is switched to be displayed as the second object 312b (for example, object 3). In this way, when viewing the first object 311 (that is, the virtual three-dimensional object), the user can control the display switching between the first object 311 and the second object (for example, object 2 or object 3) by controlling the switching between the first control (for example, the object 1 control 321) corresponding to the first object 311 and the second control (for example, the object 2 control 322a or the object 3 control 322b) corresponding to the second object; at the same time, by displaying the second control (for example, the object 2 control 322a or the object 3 control 322b) at the target position 331, the user can quickly locate the currently displayed object, which provides high interaction efficiency, enriches interaction methods, meets users' diverse needs, and improves the user experience.
In some embodiments, in response to the second control being at the target position 331 in the operation panel area 130, the second control may be highlighted. For example, as shown in FIG. 4B, the second control (for example, the object 2 control 322a) is highlighted so that the entire second control is in a highlighted state, with a display color distinguishing it from the first control (for example, the object 1 control 321) and other controls (for example, the object 3 control 322b). Or, as shown in FIG. 4C, the second control (for example, the object 3 control 322b) is highlighted so that the entire second control is in a highlighted state, with a display color distinguishing it from the first control (for example, the object 1 control 321) and other controls (for example, the object 2 control 322a). In this way, the user can quickly identify the name of the object currently being displayed.
FIG. 4D is a schematic diagram of an exemplary third light field display interface according to an embodiment of the present application.
In some embodiments, the controls displayed in the operation panel area 130 may include not only the first control but also a third control for adjusting the first object or the second object displayed in the object display area 120. For example, as shown in FIG. 4D, the operation panel area 130 may also include at least one third control for adjusting the display state (for example, different sizes and different display angles) of the first object (for example, object 1) or the second object (for example, object 2 or object 3). The adjustment parameter corresponding to the third control may include at least one of a vertical angle, a horizontal angle, and an object size. The first object (for example, object 1) or the second object (for example, object 2 or object 3) has a display coefficient corresponding to the adjustment parameter of the third control. In some embodiments, for example as shown in FIGS. 4A to 4C, when the first object (for example, object 1) or the second object (for example, object 2 or object 3) is at its maximum display coefficient, it does not exceed the object display area 120. In this way, incomplete display of the first object (for example, object 1) or the second object (for example, object 2 or object 3) after adjustment can be avoided, improving the user's viewing experience.
In some embodiments, the third control may include multiple kinds of third controls (for example, three kinds), and each kind of third control has a corresponding adjustment parameter (for example, vertical angle, horizontal angle, or object size). For example, as shown in FIG. 4D, the third controls may include a size control 341a for adjusting the object size of the first object (for example, object 1) or the second object (for example, object 2 or object 3), a vertical rotation control 341b for adjusting the vertical angle of the first object or the second object, and a horizontal rotation control 341c for adjusting the horizontal angle of the first object or the second object. In this way, it is easier for the user to determine the refined display state corresponding to the first object or the second object, to better present the first object or the second object, and to improve the user experience.
In some embodiments, the third control may have multiple presentation forms. For example, as shown in FIG. 4D, the third control may include a control name corresponding to the adjustment parameter and an adjustment bar corresponding to the adjustment parameter. For example, the size control 341a may include a size control name (object size) and an adjustment bar of the corresponding size parameter; the vertical rotation control 341b may include a control name (vertical rotation) and an adjustment bar of the corresponding vertical angle parameter; and the horizontal rotation control 341c may include a control name (horizontal rotation) and an adjustment bar of the corresponding horizontal angle parameter. In this way, by setting the control name corresponding to the adjustment parameter and the adjustment bar corresponding to the adjustment parameter, the user can intuitively perceive the change process of the specific adjustment parameter.
It should be understood that the specific size of the adjustment bar of each third control (for example, the adjustment bar of the size control 341a, the adjustment bar of the vertical rotation control 341b, and the adjustment bar of the horizontal rotation control 341c) corresponds respectively to the specific size and/or angle of the object (the first object or the second object) in the object display area. Moreover, when each adjustment bar is dragged to its maximum value, the actual display size of the corresponding object (the first object or the second object) does not exceed the object display area 120.
In some embodiments, for example as shown in FIG. 4D, the operation panel area 130 may have a second operation area 340, and the third control may be arranged in the second operation area 340. In this way, the user can operate better in different operation areas, and different operation controls can be displayed in separate areas.
If the user wishes to view the first object (for example, object 1) or the second object (for example, object 2 or object 3) from different angles, or to view the first object or the second object at different sizes, the user can perform a trigger operation on the second operation area 340 of the operation panel area 130 through a touch peripheral device (for example, a USB touch screen) to adjust the display state of the first object (for example, object 1) or the second object (for example, object 2 or object 3). The trigger operation may be, for example, a drag operation.
In some embodiments, the method may also include, in response to a drag operation on the third control, adjusting the first object or the second object according to the adjustment parameter corresponding to the third control. For example, as shown in FIG. 4E, the drag operation may be to simultaneously shorten the adjustment bar of the size control 341a, the adjustment bar of the vertical rotation control 341b, and the adjustment bar of the horizontal rotation control 341c. The size and angle of the second object (for example, object 2) are adjusted according to these adjustment bars, and the adjusted second object (for example, object 2, now smaller and with a certain rotation angle) is displayed.
图4F示出了根据本申请实施例的示例性第四光场显示界面示意图。
在一些实施例中例如图4F所示,当前对象显示区显示的对象(例如物体2)可以具有至少一个细节,例如第一细节313(例如细节1)、第二细节314a(例如细节2)、第二细节314b(例如细节3)或第二细节314c(例如细节4)等。对应地,控件(例如第二控件)可以具有与对象(例如物体2)的第一细节313对应的第一细节控件323,和与第二细节对应的第二细节控件324a、第二细节控件324b或第二细节控件324c等。可以理解的是,当前对象显示区没有显示的对象(例如物体1)也可以具有至少一个细节,例如第一细节和第二细节等。对应地,当前对象显示区没有显示的对象(例如物体1)对应的控件(例如第一控件)可以具有与第一细节对应的第一细节控件,和与第二细节对应的第二细节控件等。
假设用户希望观看第一对象(例如物体1)或第二对象(例如物体2或物体3)的细节,可以通过触控外围装置(例如usb触摸屏)对操作面板区130的第一操作区330进行触发操作,来显示第一对象(例如物体1)或第二对象(例如物体2或物体3)的细节。触发操作可以例如上拉手势的滑动操作等。
在一些实施例中,还可以包括,响应于针对处于所述目标位置的第一控件或第二控件的第二滑动操作,在所述操作面板区显示与所述第一对象或所述第二对象对应的第一细节控件和第二细节控件。例如图4F所示,可以在处于饼状区域的中间位置的第二控件(例如物体2控件)处向上滑动,即可在第二控件的上方以抽屉形式显示与所述第二对象(例如物体2)对应的第一细节控件323(例如细节1的控件)和第二细节控件(例如细节2的控件、细节3的控件和细节4的控件)。
在一些实施例中,还可以包括,按照所述第一细节控件323和所述第二细节控件324a的排列顺序,沿第一方向排列所述第一细节控件323和所述第二细节控件324a。可以根据所述第一细节控件323和所述第二细节控件324a对应的查看频率设置排列顺序,例如按照查看频率从高到低排列。第一方向例如可以为图4F所示的从左向右的方向。此时,可以自第二控件上方的中间位置325处,按照从左向右的方向显示具有排列顺序的第一细节控件323(例如细节1)和第二细节控件(例如第二细节控件324a、第二细节控件324b或第二细节控件324c)。
在一些实施例中,查看频率可以为基于产品设计时的用户调查问卷而确定的用户对第一细节或第二细节的投票次数;或者是基于用户调查问卷确定的用户对第一细节或第二细节的投票次数与用户历史观看中对第一细节或第二细节的观看次数的总和。
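下面用一段Java草图示意按查看频率(此处假设取投票次数与历史观看次数之和)对细节控件从高到低排序的一种可能实现;其中DetailControlOrdering、DetailControl等名称及示例数值均为说明所作的假设:

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class DetailControlOrdering {
    // 假设的细节控件条目:名称、投票次数、历史观看次数
    static class DetailControl {
        final String name;
        final int votes;
        final int views;
        DetailControl(String name, int votes, int views) {
            this.name = name;
            this.votes = votes;
            this.views = views;
        }
    }

    // 查看频率 = 投票次数 + 历史观看次数(也可仅取投票次数),按从高到低排列
    static List<DetailControl> orderByFrequency(List<DetailControl> controls) {
        List<DetailControl> sorted = new ArrayList<>(controls);
        sorted.sort(Comparator.comparingInt((DetailControl c) -> c.votes + c.views).reversed());
        return sorted;
    }

    public static void main(String[] args) {
        List<DetailControl> sorted = orderByFrequency(List.of(
                new DetailControl("细节2", 40, 90),
                new DetailControl("细节1", 80, 120),
                new DetailControl("细节3", 30, 60),
                new DetailControl("细节4", 10, 20)));
        // 排在最前的细节控件沿第一方向(例如从左向右)最先显示,并进行高亮
        sorted.forEach(c -> System.out.println(c.name));
    }
}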
在一些实施例中,还包括,响应于确定所述第一细节控件323排列在前,将所述第一细节控件323进行高亮显示。
接着,响应于针对处于所述目标位置的第一控件或第二控件的第二滑动操作处于进行中(例如滑动操作的触控还在屏幕上),则在所述对象显示区,在所述第一对象中展示所述第一细节和第二细节,或在所述第二对象中展示所述第一细节和第二细节。例如,如图4F所示,在第二对象(例如物体2)中展示与第一细节控件323对应的第一细节313和与第二细节控件324a对应的第二细节314a,从而展示第二对象(例如物体2)的透视图。
接着,响应于针对处于所述目标位置的第一控件或第二控件的第二滑动操作结束(例如滑动操作的触控已离开屏幕),则在所述对象显示区展示所述第一对象或所述第二对象的第一细节。例如,在对象显示区,将图4F中在所述第二对象中同时展示所述第一细节313和第二细节314a~第二细节314c的状态,切换为如图4G所示仅展示第二对象(例如物体2)的第一细节控件323对应的第一细节313(例如细节1)。
假设用户希望从不同角度观看第一对象(例如物体1)或第二对象(例如物体2或物体3)的细节,或者希望观察不同大小的第一对象(例如物体1)或第二对象(例如物体2或物体3)的细节,可以通过触控外围装置(例如usb触摸屏)对操作面板区130的第二操作区340进行触发操作,来调节第一对象(例如物体1)或第二对象(例如物体2或物体3)的细节的显示状态。触发操作可以例如拖拽操作等。
在一些实施例中,所述方法还包括,响应于针对所述第三控件的拖拽操作,按照所述第三控件对应的调节参数对所述第一对象或第二对象的第一细节进行调节。例如图4H所示,拖拽操作可以为同时缩短尺寸控件341a的调节条和垂直旋转控件341b的调节条。按照尺寸控件341a的调节条和垂直旋转控件341b的调节条来调节第一细节313(例如细节1)的大小和显示角度,并显示调节后(例如变小和具有一定垂直旋转角度)的第一细节313(例如细节1)。
假设用户希望切换对象显示区的第一细节313时,用户可以通过触控外围装置(例如usb触摸屏)对操作面板区130的第一操作区330进行触发操作,来显示第一对象(例如物体1)或第二对象(例如物体2或物体3)的第二细节。触发操作可以例如滑动操作等。
在一些实施例中,还包括响应于针对所述操作面板区的第三滑动操作,将所述第二细节控件进行高亮显示;以及在所述对象显示区中展示所述第一对象或所述第二对象的第二细节。如图4I所示,第三滑动操作可以为对图4H中饼状区域的上方进行自右向左的滑动操作,这样,可以在饼状区域(例如第二控件的上方)将第二细节控件324a(例如细节2控件)切换至第二控件上方的中间位置325处,并在对象显示区中将展示第二对象(例如物体2)的第一细节313(例如图4H中的细节1)切换为展示所述第二对象(例如物体2)的第二细节314a(例如图4I中的细节2)。
接着,当用户希望将第二细节314a切换回第一细节313时,第三滑动操作可以为自左向右的滑动操作。例如,可以在图4I中的饼状操作区(例如第二控件的上方),将第一细节控件323(例如图4I所示)自左向右滑动,使其切换回第二控件上方的中间位置325处(例如图4H所示),并在对象显示区中将展示所述第二对象(例如物体2)的第二细节314a(例如图4I中的细节2)切换为展示所述第二对象(例如物体2)的第一细节313(例如图4H中的细节1)。
假设用户希望将对象显示区中展示的所述第一对象或所述第二对象的细节切换回第一对象或第二对象本身,用户可以通过触控外围装置(例如usb触摸屏)对操作面板区的第一操作区进行触发操作,来在所述操作面板区隐藏所述第一细节控件和所述第二细节控件。触发操作可以例如第四滑动操作等。
在一些实施例中,所述方法还包括,响应于针对所述操作面板区的第四滑动操作,在所述操作面板区隐藏所述第一细节控件和所述第二细节控件。例如,可以对图4I中处于中间位置325的第二控件(例如物体2控件)上方的细节控件处进行向下滑动(也即下拉)的弧形手势操作,即可将第二控件上方显示的与所述第二对象(例如物体2)对应的第一细节控件323和所有第二细节控件隐藏,并在对象显示区将显示细节2切换为显示物体2(例如图4E所示)。
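下面给出一段与平台无关的Java草图,示意上述细节控件"抽屉"的完整生命周期(第二滑动操作上拉展开并透视预览、触控离开后仅展示排在最前的细节、第三滑动操作左右切换细节、第四滑动操作下拉隐藏);其中DetailDrawerController等类名与状态划分方式均为说明所作的假设,并非本申请限定的实现:

import java.util.List;

public class DetailDrawerController {
    private final List<String> details;   // 已按查看频率排好序的细节,例如["细节1","细节2","细节3","细节4"]
    private boolean drawerOpen = false;   // 细节控件抽屉是否展开
    private int currentDetail = 0;        // 处于第二控件上方中间位置的细节控件下标

    public DetailDrawerController(List<String> details) {
        this.details = details;
    }

    // 第二滑动操作(上拉)进行中:展开抽屉,并在对象中同时展示第一细节与各第二细节(透视预览)
    public void onSwipeUpInProgress() {
        drawerOpen = true;
        System.out.println("抽屉展开,透视展示: " + details);
    }

    // 第二滑动操作结束(触控离开屏幕):仅展示排在最前的第一细节,并对其控件高亮
    public void onSwipeUpEnded() {
        currentDetail = 0;
        System.out.println("仅展示并高亮: " + details.get(currentDetail));
    }

    // 第三滑动操作:在抽屉内左右滑动,切换处于中间位置的细节控件及对象显示区展示的细节
    public void onSwipeHorizontal(boolean rightToLeft) {
        if (!drawerOpen) {
            return;
        }
        currentDetail = rightToLeft
                ? Math.min(currentDetail + 1, details.size() - 1)
                : Math.max(currentDetail - 1, 0);
        System.out.println("切换展示并高亮: " + details.get(currentDetail));
    }

    // 第四滑动操作(下拉):隐藏所有细节控件,对象显示区恢复展示整个对象
    public void onSwipeDown() {
        drawerOpen = false;
        System.out.println("隐藏细节控件,恢复展示整个对象");
    }

    public static void main(String[] args) {
        DetailDrawerController controller =
                new DetailDrawerController(List.of("细节1", "细节2", "细节3", "细节4"));
        controller.onSwipeUpInProgress();
        controller.onSwipeUpEnded();
        controller.onSwipeHorizontal(true);   // 自右向左:切换到细节2(对应图4I)
        controller.onSwipeHorizontal(false);  // 自左向右:切换回细节1(对应图4H)
        controller.onSwipeDown();             // 隐藏细节控件,恢复展示整个对象(对应图4E)
    }
}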
需要说明的是,本申请实施例的方法可以由单个设备执行,例如一台计算机或服务器等。本实施例的方法也可以应用于分布式场景下,由多台设备相互配合来完成。在这种分布式场景的情况下,这多台设备中的一台设备可以只执行本申请实施例的方法中的某一个或多个步骤,这多台设备相互之间会进行交互以完成所述的方法。
需要说明的是,上述对本申请的一些实施例进行了描述。其它实施例在所附权利要求书的范围内。在一些情况下,在权利要求书中记载的动作或步骤可以按照不同于上述实施例中的顺序来执行并且仍然可以实现期望的结果。另外,在附图中描绘的过程不一定要求示出的特定顺序或者连续顺序才能实现期望的结果。在某些实施方式中,多任务处理和并行处理也是可以的或者可能是有利的。
基于同一发明构思,与上述任意实施例方法相对应的,本申请还提供了一种基于光场显示装置的交互装置。图5示出了本公开实施例所提供的示例性装置400的结构示意图。
参考图5,所述交互装置400,可以包括:
显示模块402,被配置为:响应于针对操作面板区的第一滑动操作,切换所述第一控件和所述第二控件在所述操作面板区中的显示位置;以及
响应于所述第二控件处于所述操作面板区中的目标位置,将所述对象显示区中的所述第一对象切换显示为所述第二对象。
应当理解的是,本申请实施例的交互装置还可以包括手势操作模块404和运算处理模块406,其具体工作流程可以如图6所示。其中,手势操作模块404主要负责获取用户的手势信息作为系统输入,并对用户的手势信息进行记录并反馈给运算处理模块。具体地,可以在外围装置例如触摸屏(例如usb触摸屏和touchpad等)或者触摸板(例如游戏手柄等)等触控设备上检测用户点击的起始位置,引入光场显示装置的电子设备(例如Android SDK)提供的GestureDetector接口,并通过该接口识别用户输入的手势,例如通过onTouchEvent(event)方法完成不同手势的识别。
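作为上述手势操作模块404的一种示意性草图(依赖Android SDK环境,GestureDetector、onTouchEvent为Android SDK提供的接口;GestureModule、Callback等其余名称为说明所作的假设,并非本申请限定的实现),可以由承载操作面板区的View将触摸事件转发给该模块:

import android.content.Context;
import android.view.GestureDetector;
import android.view.MotionEvent;

public class GestureModule {
    // 假设的回调接口:向运算处理模块反馈手势类别及起始位置
    public interface Callback {
        void onGesture(String type, float startX, float startY);
    }

    private final GestureDetector detector;

    public GestureModule(Context context, final Callback callback) {
        detector = new GestureDetector(context, new GestureDetector.SimpleOnGestureListener() {
            @Override
            public boolean onFling(MotionEvent e1, MotionEvent e2, float velocityX, float velocityY) {
                float dx = e2.getX() - e1.getX();
                float dy = e2.getY() - e1.getY();
                // 依据位移的主方向粗略区分左右滑动与上拉/下拉手势(仅为示意)
                String type = Math.abs(dx) > Math.abs(dy)
                        ? (dx > 0 ? "SWIPE_LEFT_TO_RIGHT" : "SWIPE_RIGHT_TO_LEFT")
                        : (dy < 0 ? "SWIPE_UP" : "SWIPE_DOWN");
                callback.onGesture(type, e1.getX(), e1.getY());
                return true;
            }
        });
    }

    // 由承载操作面板区的View在其onTouchEvent中转发触摸事件
    public boolean onTouchEvent(MotionEvent event) {
        return detector.onTouchEvent(event);
    }
}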
运算处理模块406主要负责接收手势操作模块404所传输的用户手势信息,包括手势类别及手势起始位姿等,并通过对这些信息的处理控制显示模块402的界面显示内容。例如,将touchpad上的手势识别结果与系统中的对象交互关系进行对应,实现对对象的控制,将处理结果返回至显示模块,显示模块根据处理结果进行相应的显示对象切换等。同时,运算处理模块也会根据手势识别的处理结果来更新操作面板区的控件排列方向(也即UI布局),并将处理结果返回至显示模块,显示模块根据处理结果进行相应的操作面板区的显示切换等。
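相应地,运算处理模块406将手势类别映射为对显示模块402的控制,可以参考如下示意性Java草图(其中GestureDispatcher、DisplayActions接口及各方法名均为说明所作的假设,细节抽屉是否展开的状态drawerOpen由调用方维护):

public class GestureDispatcher {
    // 假设的显示动作接口,由显示模块402实现
    public interface DisplayActions {
        void switchObject(boolean leftToRight);  // 第一滑动操作:切换目标位置控件及显示对象
        void showDetailControls();               // 第二滑动操作:展开细节控件
        void switchDetail(boolean rightToLeft);  // 第三滑动操作:切换细节
        void hideDetailControls();               // 第四滑动操作:隐藏细节控件
    }

    // 将手势操作模块识别出的手势类别映射为对显示模块的控制(仅为一种示意)
    public static void dispatch(String gestureType, boolean drawerOpen, DisplayActions display) {
        switch (gestureType) {
            case "SWIPE_LEFT_TO_RIGHT":
                if (drawerOpen) {
                    display.switchDetail(false);
                } else {
                    display.switchObject(true);
                }
                break;
            case "SWIPE_RIGHT_TO_LEFT":
                if (drawerOpen) {
                    display.switchDetail(true);
                } else {
                    display.switchObject(false);
                }
                break;
            case "SWIPE_UP":
                display.showDetailControls();
                break;
            case "SWIPE_DOWN":
                display.hideDetailControls();
                break;
            default:
                break;
        }
    }
}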
在一些实施例中,显示模块402,还被配置为:响应于所述第二控件处于所述操作面板区中的目标位置,对所述第二控件进行高亮显示。
在一些实施例中,所述操作面板区包括第一操作区,所述第一控件和所述第二控件位于所述第一操作区,所述第一滑动操作是针对所述第一操作区的滑动操作;所述操作面板区包括第二操作区,所述第二操作区包括至少一个用于调节所述对象显示区中显示的第一对象或第二对象的第三控件。显示模块402,还被配置为:响应于针对所述第三控件的拖拽操作,按照所述第三控件对应的调节参数对所述第一对象或第二对象进行调节。
在一些实施例中,所述调节参数包括垂直角度、水平角度、对象大小中的至少其一。显示模块402,还被配置为:响应于针对处于所述目标位置的第一控件或第二控件的第二滑动操作,在所述操作面板区显示与所述第一对象或所述第二对象对应的第一细节控件和第二细节控件。
在一些实施例中,为在所述操作面板区显示与所述第一对象或所述第二对象对应的第一细节控件和第二细节控件,显示模块402,还被配置为:
按照所述第一细节控件和所述第二细节控件的排列顺序,沿第一方向排列所述第一细节控件和所述第二细节控件。
在一些实施例中,显示模块402,还被配置为:根据所述第一细节控件和所述第二细节控件对应的查看频率,设置所述排列顺序。
在一些实施例中,显示模块402,还被配置为:响应于确定所述第一细节控件排列在前,将所述第一细节控件进行高亮显示;以及
在所述对象显示区中展示所述第一对象或所述第二对象的第一细节。
在一些实施例中,显示模块402,还被配置为:响应于针对所述操作面板区的第三滑动操作,将所述第二细节控件进行高亮显示;以及
在所述对象显示区中展示所述第一对象或所述第二对象的第二细节。
在一些实施例中,显示模块402,还被配置为:响应于针对所述操作面板区的第四滑动操作,在所述操作面板区隐藏所述第一细节控件和所述第二细节控件。
为了描述的方便,描述以上装置时以功能分为各种模块分别描述。当然,在实施本申请时可以把各模块的功能在同一个或多个软件和/或硬件中实现。
上述实施例的装置用于实现前述任一实施例中相应的基于光场显示装置的交互方法200,并且具有相应的方法实施例的有益效果,在此不再赘述。
基于同一发明构思,与上述任意实施例方法相对应的,本申请还提供了一种非暂态计算机可读存储介质,所述非暂态计算机可读存储介质存储计算机指令,所述计算机指令用于使所述计算机执行如上任一实施例所述的基于光场显示装置的交互方法200。
本实施例的计算机可读介质包括永久性和非永久性、可移动和非可移动媒体,可以由任何方法或技术来实现信息存储。信息可以是计算机可读指令、数据结构、程序的模块或其他数据。计算机的存储介质的例子包括,但不限于相变内存(PRAM)、静态随机存取存储器(SRAM)、动态随机存取存储器(DRAM)、其他类型的随机存取存储器(RAM)、只读存储器(ROM)、电可擦除可编程只读存储器(EEPROM)、快闪记忆体或其他内存技术、只读光盘只读存储器(CD-ROM)、数字多功能光盘(DVD)或其他光学存储、磁盒式磁带、磁带磁盘存储或其他磁性存储设备或任何其他非传输介质,可用于存储可以被计算设备访问的信息。
上述实施例的存储介质存储的计算机指令用于使所述计算机执行如上任一实施例所述的基于光场显示装置的交互方法200,并且具有相应的方法实施例的有益效果,在此不再赘述。
基于同一发明构思,与上述任意实施例方法相对应的,本申请还提供了一种计算机程序产品,所述计算机程序产品在计算机上运行时,使得计算机执行如上任一实施例所述的基于光场显示装置的交互方法200。
所属领域的普通技术人员应当理解:以上任何实施例的讨论仅为示例性的,并非旨在暗示本申请的范围(包括权利要求)被限于这些例子;在本申请的思路下,以上实施例或者不同实施例中的技术特征之间也可以进行组合,步骤可以以任意顺序实现,并存在如上所述的本申请实施例的不同方面的许多其它变化,为了简明它们没有在细节中提供。
另外,为简化说明和讨论,并且为了不会使本申请实施例难以理解,在所提供的附图中可以示出或可以不示出与集成电路(IC)芯片和其它部件的公知的电源/接地连接。此外,可以以框图的形式示出装置,以便避免使本申请实施例难以理解,并且这也考虑了以下事实,即关于这些框图装置的实施方式的细节是高度取决于将要实施本申请实施例的平台的(即,这些细节应当完全处于本领域技术人员的理解范围内)。在阐述了具体细节(例如,电路)以描述本申请的示例性实施例的情况下,对本领域技术人员来说显而易见的是,可以在没有这些具体细节的情况下或者这些具体细节有变化的情况下实施本申请实施例。因此,这些描述应被认为是说明性的而不是限制性的。
尽管已经结合了本申请的具体实施例对本申请进行了描述,但是根据前面的描述,这些实施例的很多替换、修改和变型对本领域普通技术人员来说将是显而易见的。例如,其它存储器架构(例如,动态RAM(DRAM))可以使用所讨论的实施例。
本申请实施例旨在涵盖落入所附权利要求的宽泛范围之内的所有这样的替换、修改和变型。因此,凡在本申请实施例的精神和原则之内,所做的任何省略、修改、等同替换、改进等,均应包含在本申请的保护范围之内。

Claims (15)

  1. 一种基于光场显示装置的交互方法,其特征在于,所述光场显示装置的显示区域包括对象显示区和操作面板区,所述对象显示区中包括第一对象,所述操作面板区包括与所述第一对象和第二对象分别对应的第一控件和第二控件,所述方法包括:
    响应于针对所述操作面板区的第一滑动操作,切换所述第一控件和所述第二控件在所述操作面板区中的显示位置;以及
    响应于所述第二控件处于所述操作面板区中的目标位置,将所述对象显示区中的所述第一对象切换显示为所述第二对象。
  2. 根据权利要求1所述的方法,其特征在于,还包括:
    响应于所述第二控件处于所述操作面板区中的目标位置,对所述第二控件进行高亮显示。
  3. 根据权利要求1所述的方法,其特征在于,所述操作面板区包括第一操作区,所述第一控件和所述第二控件位于所述第一操作区,所述第一滑动操作是针对所述第一操作区的滑动操作。
  4. 根据权利要求3所述的方法,其特征在于,所述操作面板区包括第二操作区,所述第二操作区包括至少一个用于调节所述对象显示区中显示的第一对象或第二对象的第三控件,所述方法还包括:
    响应于针对所述第三控件的拖拽操作,按照所述第三控件对应的调节参数对所述第一对象或第二对象进行调节。
  5. 根据权利要求4所述的方法,其特征在于,所述调节参数包括垂直角度、水平角度、对象大小中的至少其一。
  6. 根据权利要求1所述的方法,其特征在于,还包括:
    响应于针对处于所述目标位置的第一控件或第二控件的第二滑动操作,在所述操作面板区显示与所述第一对象或所述第二对象对应的第一细节控件和第二细节控件。
  7. 根据权利要求6所述的方法,其特征在于,在所述操作面板区显示与所述第一对象或所述第二对象对应的第一细节控件和第二细节控件,进一步包括:
    按照所述第一细节控件和所述第二细节控件的排列顺序,沿第一方向排列所述第一细节控件和所述第二细节控件。
  8. 根据权利要求7所述的方法,其特征在于,还包括:
    根据所述第一细节控件和所述第二细节控件对应的查看频率,设置所述排列顺序。
  9. 根据权利要求7所述的方法,其特征在于,还包括:
    响应于确定所述第一细节控件排列在前,将所述第一细节控件进行高亮显示;以及
    在所述对象显示区中展示所述第一对象或所述第二对象的第一细节。
  10. 根据权利要求9所述的方法,其特征在于,还包括:
    响应于针对所述操作面板区的第三滑动操作,将所述第二细节控件进行高亮显示;以及
    在所述对象显示区中展示所述第一对象或所述第二对象的第二细节。
  11. 根据权利要求6所述的方法,其特征在于,还包括:
    响应于针对所述操作面板区的第四滑动操作,在所述操作面板区隐藏所述第一细节控件和所述第二细节控件。
  12. 一种基于光场显示装置的交互装置,其特征在于,包括:
    显示模块,被配置为:响应于针对操作面板区的第一滑动操作,切换第一控件和第二控件在操作面板区中的显示位置;以及
    响应于第二控件处于操作面板区中的目标位置,将对象显示区中的第一对象切换显示为第二对象。
  13. 一种光场显示装置,其特征在于,包括存储器、处理器及存储在存储器上并可在处理器上运行的计算机程序,所述处理器执行所述程序时实现如权利要求1至12任意一项所述的方法。
  14. 一种非暂态计算机可读存储介质,所述非暂态计算机可读存储介质存储计算机指令,所述计算机指令用于使计算机执行权利要求1至12任一所述方法。
  15. 一种计算机程序产品,包括计算机程序指令,当所述计算机程序指令在计算机上运行时,使得计算机执行如权利要求1至12中任一项所述的方法。
PCT/CN2023/094009 2022-05-31 2023-05-12 基于光场显示装置的交互方法及相关设备 WO2023231749A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210629333.5A CN117193585A (zh) 2022-05-31 2022-05-31 基于光场显示装置的交互方法及相关设备
CN202210629333.5 2022-05-31

Publications (2)

Publication Number Publication Date
WO2023231749A1 WO2023231749A1 (zh) 2023-12-07
WO2023231749A9 true WO2023231749A9 (zh) 2024-02-29

Family

ID=88992982

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/094009 WO2023231749A1 (zh) 2022-05-31 2023-05-12 基于光场显示装置的交互方法及相关设备

Country Status (2)

Country Link
CN (1) CN117193585A (zh)
WO (1) WO2023231749A1 (zh)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111324250B (zh) * 2020-01-22 2021-06-18 腾讯科技(深圳)有限公司 三维形象的调整方法、装置、设备及可读存储介质
JP7391448B2 (ja) * 2020-11-19 2023-12-05 ▲騰▼▲訊▼科技(深▲セン▼)有限公司 仮想オブジェクトの制御方法、装置、機器、記憶媒体及びコンピュータプログラム製品
CN113093971B (zh) * 2021-04-15 2022-11-01 网易(杭州)网络有限公司 对象的显示控制方法及装置
CN113680053B (zh) * 2021-08-30 2023-06-13 腾讯科技(深圳)有限公司 虚拟对象的控制方法、装置、终端及存储介质

Also Published As

Publication number Publication date
WO2023231749A1 (zh) 2023-12-07
CN117193585A (zh) 2023-12-08

Similar Documents

Publication Publication Date Title
US20220253185A1 (en) Hybrid systems and methods for low-latency user input processing and feedback
JP6659644B2 (ja) 応用素子の代替的グラフィック表示の事前の生成による入力に対する低レイテンシの視覚的応答およびグラフィック処理ユニットの入力処理
US10564792B2 (en) Display device and method of indicating an active region in a milti-window display
TWI528266B (zh) 電子裝置及螢幕內容分享方法
US11604580B2 (en) Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device
US20210049038A1 (en) Display of combined first and second inputs in combined input mode
US9830014B2 (en) Reducing control response latency with defined cross-control behavior
WO2022007934A1 (zh) 应用图标控制方法、装置及电子设备
US20160328149A1 (en) Display screen-based physical button simulation method and electronic device
EP4250076A1 (en) Method and apparatus for adjusting interface display state, and device and storage medium
CN104808909B (zh) 一种信息处理方法及电子设备
CN109522082A (zh) 应用软件的界面跳转方法、装置、电子设备和介质
CN113648649B (zh) 游戏界面的控制方法、装置、计算机可读介质及终端设备
WO2023231749A9 (zh) 基于光场显示装置的交互方法及相关设备
US11328693B2 (en) Image display device, method, medium and electronic device based on mobile terminal
CN111522487A (zh) 一种触摸显示产品的图像处理方法、装置、存储介质及电子设备
TW202016694A (zh) 資料處理方法、裝置和機器可讀媒體
CN104820489B (zh) 管理低延时的直接控制反馈的系统和方法
TWI509505B (zh) tablet
TWI840315B (zh) 顯示系統和顯示方法
EP3019943A1 (en) Reducing control response latency with defined cross-control behavior
TWM470292U (zh) 平板電腦

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23814934

Country of ref document: EP

Kind code of ref document: A1