CN116954534A - Mobile display method, medium, program product and electronic equipment


Info

Publication number
CN116954534A
Authority
CN
China
Prior art keywords: electronic device, vehicle, page, parameters, screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210400220.8A
Other languages
Chinese (zh)
Inventor
邢海峰
龙水平
孙瑞
郜文美
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210400220.8A priority Critical patent/CN116954534A/en
Publication of CN116954534A publication Critical patent/CN116954534A/en
Pending legal-status Critical Current

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application relates to the technical field of communications, and discloses a mobile display method, a medium, a program product and an electronic device, which can relieve motion sickness symptoms without affecting the content originally displayed on a screen. The method comprises the following steps: displaying a first page on a screen of the electronic device; acquiring motion parameters, wherein the motion parameters are acceleration and/or angular rate; if the motion parameters meet a first condition, superimposing a target object on the first page according to target parameters, based on the color parameters of the first page; and adjusting the target parameters according to the motion parameters to control the dynamic display of the target object. The method can be applied to a scene in which a user browses content displayed on a screen of an electronic device while riding in a vehicle.

Description

Mobile display method, medium, program product and electronic equipment
Technical Field
The present application relates to the field of communications technologies, and in particular, to a mobile display method, a medium, a program product, and an electronic device.
Background
With the development of technology, the automobile has become an important mobile space in which people live and work; for example, people may browse an electronic screen or work temporarily while riding in an automobile. During driving, bumps, repeated starts and stops, and browsing an electronic screen can easily cause the user to develop motion sickness and degrade the riding experience. An important cause of motion sickness is the following: when the automobile starts, accelerates, decelerates or brakes, the vestibular system of the inner ear senses the speed change, while the objects in the automobile watched by the eyes do not move; for example, pictures or videos displayed on an electronic screen remain static. The inner ear and the brain thus receive unsynchronized information, and dizziness occurs.
In order to relieve dizziness during driving, the driver or a passenger can be reminded, through the screen, a light bar or a sound, not to read while the vehicle is accelerating, decelerating or bumping. However, if the user still continues to browse the picture on the screen, the motion sickness symptoms caused by browsing the screen are not alleviated.
Disclosure of Invention
The embodiments of the present application provide a mobile display method, a medium, a program product and an electronic device, which can relieve motion sickness symptoms without affecting the content originally displayed on a screen.
In a first aspect, an embodiment of the present application provides a mobile display method, applied to an electronic device, including: displaying a first page on a screen of the electronic device; acquiring motion parameters, wherein the motion parameters are acceleration and/or angular rate; if the motion parameters meet a first condition, superimposing a target object on the first page according to target parameters, based on the color parameters of the first page; and adjusting the target parameters according to the motion parameters to control the dynamic display of the target object. The target object may be a moving image, a moving doll or a three-dimensional vehicle animation, as described below. As an example, in a mobile scene, the electronic device may superimpose a rendering window on the page displayed on the screen, so as to show a target object such as a moving image or an animation through the rendering window. As the vehicle accelerates, decelerates or turns, the electronic device can control the dynamic display of the moving image or animation according to the acceleration or angular rate. For example, the target object shrinks as the magnitude of the acceleration increases while the vehicle accelerates, simulating an effect of moving away from the user, and enlarges as the magnitude of the acceleration increases while the vehicle decelerates, simulating an effect of approaching the user. In this way, the dizziness caused by the mismatch between vision and the inner-ear vestibule in perceiving speed can be relieved without affecting the content originally displayed on the screen, so that the user can still view that content normally.
In a possible implementation manner of the first aspect, the target parameter includes at least one of: color parameters, shape parameters, size parameters, position parameters, quantity, transparency, dynamic parameters indicating dynamic effects, movement parameters indicating movement speed and direction.
In a possible implementation manner of the first aspect, before the superimposing of the target object on the first page according to the target parameters based on the color parameters of the first page, the method further includes: determining the position parameter of the target object according to the landscape or portrait state of the screen of the electronic device.
In a possible implementation manner of the first aspect, the target object is a preset image or an image generated according to content in the page.
In a possible implementation manner of the first aspect, before the displaying the target object according to the target parameter in a superimposed manner on the first page according to the color parameter of the first page, the method further includes: performing screen capturing operation on the first page to obtain a first image; and acquiring the color parameter of the first image as the color parameter of the first page.
In a possible implementation manner of the first aspect, after the taking the color parameter of the first image as the color parameter of the first page, the method further includes: acquiring color parameters of the target object under the condition that the target object is the preset image; the step of superposing and displaying the target object on the first page according to the target parameter according to the color parameter of the first page comprises the following steps: adjusting the color parameters of the target object to the color parameters of the first page; and displaying the target object on the first page in a superposition manner according to the target parameters. For example, the color parameters may include a background color, a theme color, and an auxiliary color. The adjusting the color parameter of the target object to the color parameter of the first page may be sequentially adjusting the background color, the theme color, and the auxiliary color of the target object to the background color, the theme color, and the auxiliary color of the first page.
In a possible implementation manner of the first aspect, after performing the screen capturing operation on the first page to obtain a first image, the method further includes: intercepting image content in a target area in the first image to obtain a second image; and taking the second image as the target object.
In a possible implementation manner of the first aspect, the target object is a three-dimensional animation.
In a possible implementation manner of the first aspect, the motion parameters further include: at least one of direction information, position information and road network information.
In a possible implementation manner of the first aspect, the motion parameter is a motion parameter detected by the electronic device or a motion parameter acquired by the electronic device from the vehicle.
In a possible implementation manner of the first aspect, after the adjusting the target parameter according to the motion parameter and controlling the dynamic display of the target object, the method further includes: and if the motion parameter does not meet the first condition, canceling to display the target object.
In a possible implementation manner of the first aspect, the first condition is that a value of the acquired acceleration is greater than or equal to a first preset threshold value.
In a possible implementation manner of the first aspect, the motion parameter includes an acceleration, and the size parameter of the target object is adjusted according to the acceleration.
In a possible implementation manner of the first aspect, as the focal point of the user's line of sight on the screen of the electronic device changes, a size parameter of the first page is determined according to the distance the electronic device translates, the rotation direction of the first page is opposite to the rotation direction of the screen of the electronic device, and the rotation angle of the first page is the same as the rotation angle of the screen of the electronic device.
In a possible implementation manner of the first aspect, before the acquiring the motion parameter, the method further includes: establishing a communication connection with the vehicle; determining that the electronic equipment is in a bright screen state; determining the gesture of the electronic equipment as a preset gesture; acquiring position information and speed information, wherein the position information and the speed information are detected by the electronic equipment or acquired by the electronic equipment from the vehicle; and determining that the position information and the speed information meet a third condition.
In a second aspect, embodiments of the present application provide a computer readable storage medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform the mobile display method according to the first aspect.
In a third aspect, embodiments of the present application provide a computer program product comprising instructions for implementing the mobile display method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a memory for storing instructions for execution by one or more processors of an electronic device, and a processor for performing the mobile display method as described in the first aspect when the instructions are executed by the one or more processors.
Drawings
FIG. 1A is a schematic illustration of an application scenario of a mobile display method according to some embodiments of the present application;
FIGS. 1B-1D are schematic diagrams of content displayed by an electronic device in a vehicle in a mobile scenario, according to some embodiments of the application;
FIG. 1E is a schematic illustration of an application scenario of a mobile display method according to some embodiments of the present application;
FIG. 1F is a schematic diagram of content displayed by an electronic device in a vehicle in a mobile scenario, according to some embodiments of the application;
FIG. 2 is a flow chart of a mobile display method according to some embodiments of the present application;
FIG. 3 is a flow chart of adjusting target parameters according to some embodiments of the application;
FIGS. 4A-4B are schematic diagrams of content displayed by an electronic device in a vehicle in a mobile scenario, according to some embodiments of the application;
FIG. 5A is a schematic diagram of a moving image rotating about a coordinate axis, according to some embodiments of the application;
FIGS. 5B and 5C are schematic diagrams of content displayed by an electronic device in a vehicle in a mobile scenario, according to some embodiments of the application;
FIG. 6 is a flow chart of a method for adjusting color parameters according to some embodiments of the application;
fig. 7 is a schematic flow chart of a method for starting a motion sickness prevention function according to some embodiments of the present application;
FIG. 8 is a schematic diagram of a prompt message displayed by an electronic device according to some embodiments of the present application;
FIG. 9 is a schematic diagram of a prompt message displayed by an electronic device according to some embodiments of the present application;
FIG. 10 is a flow chart of a mobile display method according to some embodiments of the present application;
FIG. 11 is a flow chart of a mobile display method according to some embodiments of the present application;
FIG. 12 is a schematic diagram of an electronic device displaying a flow associated with a three-dimensional animation of a vehicle, according to some embodiments of the application;
FIG. 13 provides a schematic representation of the relative pose transformation of a human eye and an electronic device according to some embodiments of the application;
FIG. 14 is a flow chart of a mobile display method according to some embodiments of the present application;
fig. 15 is a schematic flow chart of determining a focus of line of sight by interaction of an electronic device with a wearable device 3 according to some embodiments of the application;
FIG. 16 provides a schematic representation of a change in pose of an electronic device according to some embodiments of the application;
FIG. 17 provides a schematic diagram of a page change caused by a pose change of an electronic device, according to some embodiments of the application;
FIG. 18 provides a schematic structural view of a vehicle according to some embodiments of the present application;
fig. 19 provides a schematic structural diagram of an electronic device according to some embodiments of the present application.
Detailed Description
Embodiments of the present application include, but are not limited to, a mobile display method, medium, program product, and electronic device.
The mobile display method provided by the application can be applied to automatic driving or non-automatic driving scenarios, and in particular to scenarios in which a user browses content displayed on a screen while the vehicle is running.
However, viewing the screen during the travel of the vehicle may produce symptoms of motion sickness. The motion sickness phenomenon is caused by a mismatch between the user's inner-ear vestibule and vision in perceiving speed when the vehicle starts, accelerates, decelerates or brakes, and by drastic changes of the focus of the line of sight caused by jolts of the vehicle.
In order to solve the motion sickness problem, in a scene where a user uses an electronic device while riding in a car, after the car starts to run the electronic device can superimpose a rendering window on the page displayed on the screen, so as to display objects such as moving images or animations through the rendering window. In addition, as the vehicle accelerates, decelerates or turns, the electronic device can control the dynamic effect of the moving images or animations according to inertial parameters such as the acceleration and angular rate corresponding to the vehicle. For example, a moving image shrinks as the magnitude of the acceleration increases while the vehicle accelerates, simulating an effect of moving away from the user; the moving image enlarges as the magnitude of the acceleration increases while the vehicle decelerates, simulating an effect of approaching the user. Similarly, the moving image curves leftward as the angular rate increases during a left turn, and curves rightward during a right turn. Thus, while the user's inner-ear vestibule perceives the speed change of the vehicle, the user's eyes also perceive a dynamically displayed moving image or animation whose change is synchronized with the vehicle's speed. The dizziness caused by the mismatch between vision and the inner-ear vestibule in perceiving speed can therefore be relieved, without affecting the content originally displayed on the screen, so that the user can view that content normally. In addition, the application can also use a wearable device, or existing technology such as a structured-light depth camera, to track the line of sight, determine the focus of the eyes on the screen, and move the content displayed on the screen so that the content under the user's focus remains unchanged. A relatively stable focus between the human eye and the screen is thus maintained, which addresses the dizziness caused by focus changes during bumps or shaking.
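As a rough illustration of the adjustment logic just described, the following Kotlin sketch maps acceleration and angular rate onto the dynamic effect. It is only a minimal sketch under assumed conventions: the names (MotionParams, RenderOverlay, the MAX_* constants) are illustrative and not taken from this disclosure.

```kotlin
// Minimal sketch of the adjustment loop described above. All names are
// illustrative assumptions, not identifiers from this disclosure.
data class MotionParams(val acceleration: Float, val angularRate: Float)

class RenderOverlay {
    var scale = 1.0f  // relative size of the moving image
    var bend = 0.0f   // arc-shaped bending; sign convention assumed

    fun update(m: MotionParams) {
        // Shrink while accelerating (simulates receding from the user),
        // enlarge while decelerating (simulates approaching the user).
        scale = 1.0f - m.acceleration / MAX_ACCELERATION
        // Bend into an arc as the angular rate grows during a turn.
        bend = m.angularRate / MAX_ANGULAR_RATE
    }

    companion object {
        const val MAX_ACCELERATION = 5.0f // m/s^2, assumed normalization constant
        const val MAX_ANGULAR_RATE = 1.0f // rad/s, assumed normalization constant
    }
}
```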
In some embodiments, the screen involved in the mobile display method provided by the application may be the screen of a display device of the vehicle, or the screen of an electronic device such as a mobile terminal used by the user. The electronic device used by the user may be fixed on a bracket rigidly connected with the vehicle, or held by the user.
In some embodiments, an electronic device suitable for use with the present application may be any electronic device having a display screen, including but not limited to a wearable device (e.g., a smart watch), a headset, a cell phone, a tablet, a personal computer, a server computer, a handheld or laptop device, a mobile device (e.g., a mobile phone, a Personal Digital Assistant (PDA), a media player, etc.), a mini-computer, and so forth. The electronic device is preferably a portable personal mobile terminal so that a user seated in the vehicle views contents displayed on the screen through the portable personal mobile terminal.
In addition, in some embodiments, the vehicle applicable to the present application may be a car, a truck, a motorcycle, a bus, a ship, an airplane, a helicopter, a mower, a recreational vehicle, an amusement park vehicle, construction equipment, an electric car, a golf cart, a train, a trolley, or the like; the embodiments of the present application do not specifically limit this.
Referring to fig. 1A, a schematic diagram of an application scenario of a mobile display method according to an embodiment of the present application is shown. During the running of the vehicle 1, the user browses the content displayed on the screen of the electronic device 2. It can be understood that the scenario shown in fig. 1A only takes the vehicle 1 being a car and the electronic device 2 being a mobile phone as an example, and does not limit the actual application scenarios of the present application.
Based on the scene shown in fig. 1A, referring to fig. 1B, a schematic diagram of content displayed by the electronic device in a moving scene of the vehicle is shown. In a scene where the user browses the electronic device 2 while riding in the vehicle 1, when the vehicle 1 has not started, is stopped, or is traveling at a constant speed or with low acceleration, the screen of the electronic device 2 displays a normal page 21a (noted as the initial page) as shown on the left side of fig. 1B. When the motion sickness prevention function is on and the acceleration of the vehicle 1 is large, rendering windows displaying moving images with a water-ripple effect are superimposed on the page 21b displayed on the screen of the electronic device 2, as shown on the right side of fig. 1B. Specifically, fig. 1B shows a rectangular rendering window superimposed along each of the two long edges of the screen to display a moving image with a water-ripple effect; the two moving images are unified and coordinated with the color of the current page 21b on the screen, and have a certain transparency, so that the content of the normal page is not blocked. Adaptive rendering of the page normally displayed on the screen is thus realized. Further, the two dynamic moving images can be displayed dynamically as the vehicle 1 starts, stops, accelerates and decelerates, so that the content on the screen seen by the human eye changes dynamically with the movement of the vehicle 1; for example, the size of the moving image changes as the vehicle 1 accelerates and decelerates, and the moving image bends into a reverse arc as the vehicle 1 turns. The motion sickness symptoms of a user browsing the screen in a moving scene are thus relieved or eliminated, while the content of the page normally displayed on the screen is not affected.
Based on the scene shown in fig. 1A, referring to fig. 1C, a schematic diagram of content displayed by the electronic device in a moving scene of the vehicle is shown. As shown in fig. 1C, in a scene where the user browses the electronic device 2 while riding in the vehicle 1, when the vehicle 1 has not started, is stopped, or is traveling at a constant speed or with small acceleration, the screen of the electronic device 2 displays a normal page 23a (noted as the initial page) as shown on the left side of fig. 1C. When the motion sickness prevention function is on and the acceleration is large, a motion doll 23c and the like are superimposed on the page 23b displayed on the screen of the electronic device 2, as shown on the right side of fig. 1C. The motion doll can move dynamically as the vehicle 1 starts, stops, accelerates and decelerates, so that the motion of the doll 23c on the screen seen by the human eye changes dynamically with the motion of the vehicle 1, relieving or eliminating the motion sickness symptoms of a user browsing the screen in a moving scene.
Based on the scene shown in fig. 1A, referring to fig. 1D, a schematic diagram of content displayed by the electronic device in a moving scene of the vehicle is shown. In a scene where the user browses the electronic device 2 while riding in the vehicle 1, when the vehicle 1 has not started, is stopped, or is traveling at a constant speed, the screen of the electronic device 2 displays a normal page 24a (noted as the initial page) as shown on the left side of fig. 1D. When the motion sickness prevention function is on and the acceleration is large, a three-dimensional (3D) vehicle animation 24c is superimposed on the page 24b displayed on the screen of the electronic device 2, as shown on the right side of fig. 1D. The three-dimensional vehicle animation 24c can be displayed dynamically as the vehicle 1 starts, stops, accelerates, decelerates and turns, so that the motion of the three-dimensional vehicle animation 24c on the screen seen by the human eye changes dynamically with the motion of the vehicle 1, relieving or eliminating the motion sickness symptoms of a user browsing the screen in a moving scene.
In addition, referring to fig. 1E, another schematic diagram of an application scenario of a mobile display method according to an embodiment of the present application is provided. During the running of the vehicle 1, the user wears the head-mounted wearable device 3 while browsing the content displayed on the screen of the electronic device 2. It can be understood that the scenario shown in fig. 1E only takes the vehicle 1 being a car, the electronic device 2 being a mobile phone, and the wearable device 3 being smart glasses as an example, and does not limit the actual application scenarios of the present application.
As an example, the wearable device 3 in the present application may also be a virtual reality (VR) headset or a similar device. The wearable device 3 may be used to determine the visual focus of the user and interact with the electronic device 2, so that the electronic device 2 determines the content displayed at the user's visual focus.
In addition, based on the scene shown in fig. 1A or 1E, referring to fig. 1F, a schematic diagram of the content displayed by the electronic device in a scene where the vehicle jolts while running is shown. As shown in fig. 1F, in a scene where the user browses the electronic device 2 while riding in the vehicle 1, before the vehicle 1 jolts, the screen of the electronic device 2 displays a normal page 25a (denoted as the initial page) as shown on the left side of fig. 1F; at this time, the content at the focus P0 of the user's line of sight on the page 25a is the center of the text box 12. When the vehicle 1 jolts while running, the user's line-of-sight focus changes greatly. The electronic device 2 performs line-of-sight tracking, or interacts with the wearable device 3 to perform line-of-sight tracking, and determines that the focus has changed to the focus P1 on the page 25b displayed on the screen, as shown in the upper part of fig. 1F; obviously, the content under the focus is no longer the text box 12. The electronic device 2 then moves the page displayed on the screen, so that the text box 12 where the original focus P0 was located moves under the focus P1, as in the page 25c displayed on the right side of fig. 1F. The user's line-of-sight focus and the content displayed on the screen therefore remain relatively stable, so that the motion sickness symptoms caused by drastic focus changes due to factors such as bumps of the vehicle 1 are relieved or eliminated, and the experience of browsing the screen while riding is improved. It can be understood that, in practical applications, the electronic device 2 may skip displaying the page 25b in the upper part of fig. 1F and directly display the page 25c shown on the right side of fig. 1F when the vehicle 1 jolts, keeping the content at the user's visual focus stable.
Example 1
The mobile display method provided by the application is described in detail below based on the application scenario shown in fig. 1A. Reference is made to fig. 2, which is a schematic flow chart of a mobile display method. The main execution body of the method shown in fig. 2 may be the electronic device 2, where the electronic device 2 is a portable electronic device such as a mobile phone held by a user sitting in the vehicle 1. The method specifically includes the following steps:
S201: Turn on the motion sickness prevention function.
In some embodiments, the electronic device 2 may turn on the motion sickness prevention function in response to a manual operation by the user, or automatically turn it on based on whether the vehicle 1 has started traveling and whether the user is browsing the screen of the electronic device 2, so as to avoid wasting computing resources by turning the function on blindly.
The specific manner in which the electronic device 2 manually or automatically turns on the motion sickness prevention function will be described in detail in S201a to S201d shown in fig. 7 below, and is not described here.
In some embodiments, the electronic device 2 may turn on the motion sickness prevention function before the user sits in the vehicle or while the user is seated in the vehicle; this is not limited by the present application.
S202: Acquire the acceleration and angular rate.
In some embodiments, the electronic device 2 may obtain the acceleration and angular rate of the electronic device 2. In other embodiments, the electronic device 2 may acquire the acceleration and angular rate of the vehicle 1.
In some embodiments, an inertial measurement unit (IMU) is provided in the electronic device 2. The IMU includes an accelerometer for detecting the acceleration of the electronic device 2, from which the speed and position of the electronic device 2 can be obtained, and a gyroscope for detecting the angular rate of the electronic device 2. In addition, a positioning chip using global positioning system (GPS) technology or the like is provided in the electronic device 2 for detecting position information of the electronic device 2, such as location-based service (LBS) information, so as to correct the speed and position of the electronic device 2 according to the position information. It can be understood that the speed and position of the electronic device 2 can be used to determine whether the electronic device 2 is in a moving scene, such as in the running vehicle 1.
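As an example of how such motion parameters can be read on a mobile terminal, the following Kotlin sketch uses the Android sensor framework. This disclosure does not mandate any particular API, so this is only one plausible implementation.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Sketch of reading acceleration and angular rate via the Android IMU
// sensors; one plausible implementation, not prescribed by this disclosure.
class ImuReader(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    var acceleration = FloatArray(3) // m/s^2 per axis
        private set
    var angularRate = FloatArray(3)  // rad/s per axis
        private set

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
    }

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER -> acceleration = event.values.clone()
            Sensor.TYPE_GYROSCOPE -> angularRate = event.values.clone()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```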
It will be appreciated that in a mobile scenario, the acceleration and angular rate of the electronic device 2 may generally characterize the acceleration and angular rate of the vehicle 1.
In other embodiments, the IMU provided in the vehicle 1 includes an accelerometer and a gyroscope for detecting the acceleration and angular rate of the vehicle 1, respectively. In addition, a positioning chip is provided in the vehicle 1 for detecting position information of the vehicle 1, and a magnetometer measures the strength and direction of the magnetic field to obtain direction information (or heading information). Similarly, the position information of the vehicle 1 may be used to correct its speed and position, and the direction information may be used to correct its angular rate. Further, the electronic device 2 may establish a wireless communication connection with the vehicle 1, for example through a specific application program or through Bluetooth, so that the acceleration and angular rate of the vehicle 1 are transmitted to the electronic device 2 over this connection.
S203: Determine whether the acceleration and/or angular rate exceeds the corresponding first preset threshold (noted as the first condition). If yes, the vehicle 1 is traveling and accelerating at a substantial speed, which generally indicates that the vehicle 1 is driving on the road, and the flow proceeds to S204. It can be understood that during actual driving the vehicle 1 typically does not move at a constant speed but at a variable speed, since the road is not perfectly flat. If not, the vehicle 1 is not running, or is running slowly at a low speed, so no anti-motion-sickness measures need to be taken, and the process returns to S202.
The first preset threshold includes a threshold corresponding to the acceleration and a threshold corresponding to the angular rate. The electronic device 2 may determine whether the currently acquired acceleration exceeds its threshold, whether the currently acquired angular rate exceeds its threshold, or both. The specific value of the first preset threshold is not limited and can be selected according to the actual situation.
It can be understood that when the electronic device 2 is mounted on a bracket in the vehicle 1, the acceleration and angular rate of the electronic device 2 can generally be considered substantially identical to those of the vehicle 1. If the user holds the electronic device 2, the posture can be considered basically unchanged while browsing; although it cannot be kept absolutely stable, the electronic device 2 can filter out small hand shakes or jolts of the vehicle 1 through a filtering algorithm. That is, the electronic device 2 may filter the acquired acceleration and angular rate data to obtain more accurate data. The acceleration and angular rate so obtained can then accurately reflect the movement of the electronic device 2, and thus the running state of the vehicle 1, such as starting, stopping and turning.
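The disclosure only refers to "a filtering algorithm" without naming one; a first-order low-pass (exponential smoothing) filter, sketched below in Kotlin for a single axis, is one common choice for suppressing small hand shake, given as an assumed example.

```kotlin
// Simple exponential smoothing for one acceleration axis; the concrete
// algorithm and the alpha value are assumptions, not from this disclosure.
class LowPassFilter(private val alpha: Float = 0.1f) {
    private var filtered = 0.0f

    fun apply(raw: Float): Float {
        // A small alpha suppresses brief spikes from hand shake or bumps
        // while letting sustained acceleration changes pass through.
        filtered += alpha * (raw - filtered)
        return filtered
    }
}
```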
In addition, in other embodiments, when the vehicle 1 accelerates or decelerates, for example when the electronic device 2 determines yes in S203, the electronic device 2 may also remind the user by voice or by a text prompt to look out of the window or adjust the sitting posture, so as to relieve or eliminate the user's motion sickness.
S204: it is determined whether the acquired acceleration and/or angular rate satisfies a second condition.
The second condition indicates that the electronic device 2 is shaking greatly, which corresponds to the vehicle 1 jolting or undulating strongly. If the condition is satisfied, the vehicle 1 is jolting or undulating, and the flow proceeds to S205 to relieve or eliminate the motion sickness symptoms caused by drastic changes of the focus of the line of sight when the user browses the screen of the electronic device 2 in that scene.
If not, the vehicle 1 is not jolting, or the undulation is small, and the flow proceeds to S206 to relieve or eliminate the motion sickness symptoms caused by the inconsistency between the inner-ear vestibule and vision in perceiving speed while the user browses the screen of the electronic device 2.
In some embodiments, the second condition may include the acceleration being greater than or equal to a corresponding second preset threshold, and/or the angular rate being greater than or equal to a corresponding second preset threshold. The second preset threshold corresponding to the acceleration is greater than the corresponding first preset threshold, and the second preset threshold corresponding to the angular rate is also greater than the corresponding first preset threshold; the specific values can be selected according to the actual situation.
In other embodiments, the second condition includes: screening, from a plurality of acceleration values, abnormal values that exceed a preset standard value range; if the ratio of the number of abnormal values to the total number of acceleration values exceeds a preset threshold, the vehicle is jolting or undulating strongly. The specific standard value range and preset threshold may be set according to actual requirements and are not limited here.
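A minimal Kotlin sketch of this outlier-ratio test follows; the standard value range and the ratio threshold are placeholder values, since the disclosure leaves both to be set according to actual requirements.

```kotlin
// Second-condition sketch: does the share of out-of-range acceleration
// samples exceed a threshold? Both default values are assumed placeholders.
fun isBumpy(
    accelerations: List<Float>,
    standardRange: ClosedFloatingPointRange<Float> = -2.0f..2.0f, // m/s^2, assumed
    ratioThreshold: Float = 0.3f                                  // assumed
): Boolean {
    if (accelerations.isEmpty()) return false
    val outliers = accelerations.count { it !in standardRange }
    return outliers.toFloat() / accelerations.size > ratioThreshold
}
```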
It will be appreciated that the electronic device 2 normally displays the page content before executing S205 and S206, without affecting the normal browsing of the page content by the user in the mobile scenario.
S205: the content displayed on the screen is adjusted so that the content on the screen in the focus of the line of sight of the human eye is unchanged. Therefore, the dizziness caused by the change of the visual focus of human eyes when the scene is bumpy or swaying is solved.
Specifically, a gaze tracking technique may be used to track or compensate for changes in the pose of the electronic device 2 for a user's gaze focus to determine the gaze focus of the human eye on the screen.
In other embodiments, when the vehicle 1 is bumpy or rough, the electronic device 2 may change its pitch greatly, and the electronic device 2 may display a dynamic moving image superimposed on the page instead of performing the step of performing the line of sight tracking as in S205.
In addition, in other embodiments, in the case where the vehicle 1 bumps or fluctuates greatly, the electronic apparatus 2 performs the above-described line-of-sight tracking of S205 to alleviate symptoms of motion sickness, and may superimpose and display a dynamically changing moving image on the page.
It is understood that after the execution of S205 is completed, the mobile display method provided by the present application may be ended.
The process of tracking the gaze focus of the user or compensating for the pose change of the electronic device 2 by using the gaze tracking technology in S205 will be described in detail in S205a-S205c below, and will not be described here again.
In addition, in some embodiments, the electronic device 2 superimposes a rendering window on the page currently displayed on the screen to display a dynamic moving image, so that the dynamic display effect of the moving image changes with the acquired acceleration or angular rate. The motion sickness symptoms caused by the inconsistency between the inner-ear vestibule and vision in perceiving speed are thereby relieved or eliminated.
As an example, when the vehicle 1 undulates little or not at all and the electronic device 2 shakes little, the present application can display a moving image by performing the following S206 and S207 to superimpose a rendering window on the screen.
S206: a target parameter corresponding to the moving image is determined.
In some embodiments, the moving image may be set by a user or set by default. For example, the setting application of the electronic device 2 may provide a setting interface of the motion sickness prevention function, and the setting interface may include therein a setting option related to a moving image for supporting the user to select the moving image through the corresponding setting option.
The moving image to be dynamically displayed includes a plurality of images, that is, an image sequence composed of a plurality of images. As an example, the moving image that is dynamically displayed may be a sequence of images obtained by processing a single image.
Specifically, the moving image includes any one of the following.
(1) An image of the plurality of preset images.
(2) User-defined images in gallery applications.
As an example, the moving images provided in the above-described modes (1) and (2) may be a water-ripple image, a light-and-shadow image, a tree-shadow image, or the like, so that the subsequent dynamic display of the moving image simulates the effect of a natural scene, but the images are not limited thereto.
(3) An image generated from the content in the current page displayed on the screen of the electronic device 2.
The moving image may be a part of the image in the current page, or a coordinated image generated based on part of the image in the current page. For example, the moving image may be an image obtained by blurring the image at an edge portion of the current page picture (e.g., the image content of a rectangular area along a long-side (or wide-side) edge of the screen, denoted as the target area), but is not limited thereto.
In addition, the moving image may be an image sequence generated in real time according to a color parameter of the current page and obtained by processing that image. For example, a solid-color image may be generated according to the background color, theme color or auxiliary color of the current page, and this image may be processed into an image sequence used as the dynamically displayed moving image.
As an example, in the present application, a single image may be processed into a group of images by performing matrix transformation on the data of the single image according to a preset rule, obtaining an image sequence composed of a plurality of images; the preset rule may be determined according to actual requirements and is not limited here.
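Since the preset rule is left open, the following Kotlin sketch shows one illustrative rule: deriving a sequence of frames from a single image by applying a per-frame affine matrix, here a small oscillating skew suggesting ripples. Both the transform and the frame count are assumptions for illustration.

```kotlin
import android.graphics.Bitmap
import android.graphics.Canvas
import android.graphics.Matrix
import kotlin.math.sin

// Illustrative "preset rule": derive N frames from one image via a per-frame
// affine matrix. The specific transform is an assumed example, not the
// disclosure's rule.
fun toImageSequence(src: Bitmap, frameCount: Int = 30): List<Bitmap> =
    (0 until frameCount).map { i ->
        val phase = 2.0 * Math.PI * i / frameCount
        val matrix = Matrix().apply {
            // Horizontal skew oscillating with the frame index.
            setSkew(0.05f * sin(phase).toFloat(), 0f)
        }
        val frame = Bitmap.createBitmap(src.width, src.height, Bitmap.Config.ARGB_8888)
        Canvas(frame).drawBitmap(src, matrix, null)
        frame
    }
```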
The above-mentioned target parameters are parameters related to how the electronic device 2 displays the moving image on the screen; since the moving image is displayed in a rendering window, the target parameters may also be described as the target parameters of the rendering window.
The parameters in the target parameters can be user-defined or set by default. For example, an option to set dynamic parameters may be provided in a setting interface of the anti-motion sickness function provided in the setting application of the electronic device 2 for supporting the user to set target parameters, but is not limited thereto.
In some embodiments, the target parameters may include, but are not limited to, the color parameters, shape, size, position, number, transparency, dynamic parameters, movement parameters, etc. of the rendering window. In other embodiments, the target parameters may include more or fewer parameters.
As the vehicle 1 starts, stops, accelerates, decelerates or turns, the electronic device 2 moves correspondingly, and the electronic device 2 can then adjust all or part of the target parameters corresponding to the moving image, such as the color, shape, size and transparency of the rendering window, so as to zoom in, zoom out, shift or deform the moving image. The dynamic effect of the moving image thus varies with the running of the vehicle 1.
Each of the target parameters corresponding to the moving image is described below.
1. Color parameters
In some embodiments, the color of the rendering window may be a preset color.
In other embodiments, the color of the rendering window may be determined according to the color of the page currently displayed by the electronic device 2, so that the rendering window is unified and coordinated with the color of the current page. Further, as the color of the displayed page changes, the color of the rendering window changes adaptively. As an example, referring to the scene shown in fig. 1B, the right side of fig. 1B shows that the rendering window corresponding to the moving image 11 is uniform with the page color, both being gray.
Specifically, the determination process of the color parameters of the rendering window will be described in detail in S2072 and S2072a to S2072e, which will not be described here.
2. Shape (shape parameter)
In some embodiments, the shape of the rendering window is a preset shape when the vehicle 1 starts. As the vehicle 1 starts, stops, accelerates, decelerates or turns, the shape of the rendering window changes accordingly.
For example, the preset shape may be rectangular, circular, elliptical, arc-shaped, crescent-shaped, etc., but is not limited thereto. As an example, the right side of fig. 1B shows that the shape of the rendering window corresponding to the moving image 11 is rectangular. Further, when the vehicle 1 turns, the shape of the rendering window is adjusted from a rectangle to an arc.
3. Number
The number of rendering windows may be a preset number of one or more; that is, one or more rendering windows are superimposed on the page displayed on the screen at the same time. As an example, the color parameters of multiple rendering windows may be the same, but are not limited thereto. Referring to the scene shown in fig. 1B, the number of moving images 11 shown on the right side of fig. 1B is 2.
4. Location (location parameter)
In some embodiments, the position of the rendering window may be a preset position.
For example, when the vehicle 1 starts, the position of the rendering window is a preset position. As the running state of the vehicle 1 (starting, stopping, accelerating, decelerating, turning, etc.) and the screen state of the electronic device 2 change, the position of the rendering window changes accordingly.
For example, the preset position may be within the edge regions of the two long sides (wide sides) or the two short sides (narrow sides) of the screen of the electronic device 2, but is not limited thereto. As another example, the preset position may also be within a ring-shaped area along the screen edge of the electronic device 2. As an example, the position of the rendering window may refer to the position of its center point, and the center point of the rendering window may coincide with the center point of the edge region or the annular region described above, but is not limited thereto.
As an example, the positions of the rendering windows may be determined according to the screen state of the electronic device 2 and the number of rendering windows, as in the sketch below. For example, when the electronic device 2 is in the portrait state, the rendering windows are positioned within the edge regions of one or both long sides of the screen; when the electronic device 2 is in the landscape state, the rendering windows are positioned within the edge regions of one or both short sides of the screen. For example, the electronic device 2 on the right side of fig. 1B is in the portrait state, and the positions of the two rendering windows are respectively within the edge regions of the long sides of the screen.
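The following Kotlin sketch computes two such edge-region rectangles from the current view size; the 15% strip width is an assumed preset value, not taken from this disclosure. Note that the same visual left/right strips lie on the long sides in portrait and on the short sides in landscape.

```kotlin
import android.graphics.RectF

// Sketch of preset rendering-window positions: two vertical strips at the
// visual left and right of the screen. The 15% strip width is an assumed
// preset, not a value from this disclosure.
fun overlayRects(viewWidth: Float, viewHeight: Float): Pair<RectF, RectF> {
    val strip = viewWidth * 0.15f
    return RectF(0f, 0f, strip, viewHeight) to
        RectF(viewWidth - strip, 0f, viewWidth, viewHeight)
}
```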
In addition, in some embodiments, the position of the rendering window is changed based on its geometric center point; for example, a rectangular rendering window is moved (translated or rotated) based on the center point of the rectangle, but this is not limiting. The position of the rendering window may also be changed based on other position points such as its edge points; for example, a rectangular rendering window may be moved based on one vertex of the rectangle.
5. Size (dimensional parameters)
In some embodiments, the size of the rendering window may be a preset size.
For example, when the vehicle 1 starts, the size of the rendering window is a preset size. As the vehicle 1 starts, stops, accelerates, decelerates or turns, the size of the rendering window changes.
As an example, the size of the rendering window in which the moving image 11 on either side of fig. 1B is located is a preset size.
In some embodiments, the size of the rendering window is varied based on its center point; for example, a rectangular rendering window is enlarged or reduced based on the center point of the rectangle, but this is not limiting. The size of the rendering window may also vary based on other position points such as its edge points; for example, a rectangular rendering window may be enlarged or reduced based on one vertex of the rectangle.
6. Transparency
In some embodiments, the transparency of the moving image in the rendering window may be preset.
For example, when the vehicle 1 starts, the transparency of the rendering window is a preset transparency. As the vehicle 1 starts, stops, accelerates, decelerates or turns, the transparency of the rendering window changes.
As an example, based on an RGBA (red, green, blue, alpha) color space model, the electronic device 2 can adjust the transparency of a moving image by adjusting its Alpha channel. In the RGBA color space model, R is red, G is green, B is blue, and A is the transparency (the parameter corresponding to the Alpha channel).
It can be understood that when the electronic device 2 superimposes the rendering window over the content displayed on the screen to display the moving image with a certain transparency, the moving image normally does not completely obscure the original content; that is, it does not affect the normal display of the original content, and the user can still see it.
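As a concrete illustration on Android, the Alpha channel can be set on the Paint used to draw the moving image; the mapping from the transparency parameter to an alpha value below is an assumed convention, not specified by this disclosure.

```kotlin
import android.graphics.Paint

// Sketch: build a Paint whose alpha reflects the transparency parameter.
// Assumed convention: transparency in [0, 1], where 1 = fully transparent.
fun overlayPaint(transparency: Float): Paint = Paint().apply {
    // 0 = fully transparent, 255 = fully opaque; a mid value keeps the
    // underlying page content readable through the moving image.
    alpha = ((1f - transparency.coerceIn(0f, 1f)) * 255).toInt()
}
```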
7. Dynamic parameters
As an example, the dynamic parameter is used to indicate the dynamic effect of the moving image, such as water ripples or tree shadows, and also to indicate the frame rate of the dynamic change of the moving image, but is not limited thereto. For example, when the vehicle 1 starts, the dynamic parameter of the rendering window indicates a preset water-ripple effect, and the frame rate of the moving image is a preset frame rate. The dynamic parameters of the rendering window change with parameters such as the speed and acceleration during driving processes such as starting, stopping, accelerating, decelerating and turning of the vehicle 1.
It can be understood that the dynamic parameters may be used to process an original single-frame moving image into a dynamically changing multi-frame moving image. The different frames of the dynamically changing moving image may be images rendered from the original single frame according to the different dynamic effects indicated by the dynamic parameters. For example, when the dynamic effect indicated by the dynamic parameter is water ripples, different frames of the dynamic moving image are rendered as ripples of different forms. The dynamic parameter may also include the frame rate of the moving image, indicating how fast the moving image changes dynamically.
In some embodiments, the electronic device 2 may perform S206 at a preset period, for example every 10 milliseconds (ms), but is not limited thereto. As an example, the preset period for executing S206 may depend on the type of page content displayed by the electronic device 2; for example, video content corresponds to a larger period, while web content corresponds to a smaller period.
S207: and superposing and rendering the window display moving image on the page displayed on the screen according to the target parameters, and adjusting the target parameters according to the acquired acceleration and angular rate and the screen state of the electronic equipment 2.
In some embodiments, the electronic device 2 may perform S207 in a preset period, for example, S207 is performed every 10 milliseconds (ms), but the specific value of the preset period is not limited thereto.
Specifically, the process of adjusting the target parameters corresponding to the moving image in S207 will be described in detail in S207a to S207c shown in fig. 3 hereinafter, and is not repeated here.
It can be understood that, during the running of the vehicle 1, the target parameters of the moving image change synchronously with the movement of the electronic device 2 and its screen state, creating the impression that the motion of the moving image seen by the human eye is consistent with the motion of the vehicle 1. The speed change the user sees thus tends to be consistent with the speed change felt by the inner-ear vestibule, which can effectively relieve the user's motion sickness. In addition, when the electronic device 2 superimposes the rendering window on the page displayed on the screen with a certain transparency, the moving image generally does not completely block the original content of the page; that is, the normal display of the original content is not affected and the user can still see it, which improves the experience of viewing the screen in a mobile scene.
In addition, after the electronic device starts displaying the moving image, if the acceleration no longer exceeds the first preset threshold, i.e., the motion parameters no longer meet the first condition, the display of the moving image is canceled.
The above execution sequence of S201 to S207 is merely illustrative. In other embodiments, other execution sequences may be adopted, some steps may be split or combined, and some steps may be omitted; this is not limited here. For example, in other embodiments, S205 may be deleted, or S206 and S207 may be deleted, either of which still alleviates the motion sickness symptoms of a user browsing the screen in a mobile scene.
Next, with the execution subject still being the electronic device 2, the specific process by which the target parameters of the moving image change with the running of the vehicle 1 in S207 is described with reference to the method shown in fig. 3, which may include the following S207a to S207c.
S207a: and determining the position in the target parameters of the rendering window according to the screen state.
As an example, referring to fig. 1B, the electronic device 2 may determine that the screen state is the portrait state, and determine that the positions in the target parameters of the rendering windows are along the two long edges of the screen (i.e., the preset positions corresponding to the portrait state). It can be understood that the number in the target parameters of the rendering windows may be a preset number, such as 2.
Similarly, the electronic device 2 may determine that the screen state is the landscape state, and determine that the positions in the target parameters of the rendering windows are along the edges of the two short sides of the screen (i.e., the preset positions corresponding to the landscape state), with the number again being 2.
It can be understood that the electronic device 2 can adjust the position and number of the rendering windows in real time as its screen state changes during the running of the vehicle 1. For example, before time t1 the electronic device 2 is in the portrait state, displaying the page and rendering windows as shown in fig. 1B; at time t1 the user rotates the electronic device 2, and the page and rendering windows are then displayed in the landscape state. Mainly the positions of the rendering windows are adjusted, so that the display of the moving image interferes as little as possible with the normal display of the content on the current page.
S207b: and determining the color parameters in the target parameters of the rendering window according to the color parameters of the current page.
In some embodiments, the color parameter of the current page displayed by the electronic device 2 may be a color parameter of a page picture of the current page.
It can be understood that, as the content of the page displayed by the electronic device 2 changes, when the color of the page changes, the color parameters of the rendering window can be adaptively adjusted, so that the color of the rendering window displayed on the page in a superimposed manner changes adaptively.
Specifically, the determination process of the rendering parameters of the rendering window will be described in detail in S2072a to S2072e hereinafter, and will not be described here again.
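Although the detailed color extraction of S2072a to S2072e is described elsewhere, one plausible way to derive the page's background, theme and auxiliary colors from a screenshot on Android is the AndroidX Palette library, as sketched below. The mapping of Palette swatches to the three color roles is an assumption, not this disclosure's method.

```kotlin
import android.graphics.Bitmap
import androidx.palette.graphics.Palette

// Sketch: extract candidate page colors from a screenshot bitmap. The
// swatch-to-role mapping (dominant = background, vibrant = theme,
// muted = auxiliary) is an illustrative assumption.
fun pageColors(screenshot: Bitmap): Triple<Int, Int, Int> {
    val palette = Palette.from(screenshot).generate()
    val background = palette.getDominantColor(0xFFFFFFFF.toInt())
    val theme = palette.getVibrantColor(background)
    val auxiliary = palette.getMutedColor(background)
    return Triple(background, theme, auxiliary)
}
```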
S207c: other parameters, such as at least one of a size (or a dimension parameter), transparency, shape (or a shape parameter), position (or a position parameter), and dynamic parameters, among the target parameters of the rendering frame are adjusted according to the acceleration or the angular rate, so that the moving image in the rendering frame dynamically changes with the movement of the electronic device 2, that is, the moving image in the rendering frame dynamically changes with the movement of the vehicle 1.
In some embodiments, the electronic device 2 may periodically detect the state of the electronic device 2, and when the vector sum of the axial accelerations detected by the accelerometer in the electronic device 2 is about the gravitational acceleration (g), if the axial accelerations of the accelerometer in the electronic device 2 do not change much in a plurality of period detections, which indicates that the user holds the electronic device 2 or fixes the electronic device 2 in the vehicle 1, the acceleration change condition of the electronic device 2 may be obtained when the electronic device 2 is accelerated or decelerated, that is, the acceleration change condition corresponding to the vehicle 1 is obtained.
First, after the vehicle 1 starts, the electronic device 2 determines that the acceleration and angular rate corresponding to the vehicle 1 gradually increase from 0. Once the acceleration exceeds the corresponding first preset threshold, the electronic device 2 may set the target parameters to a preset movement parameter, a preset size, a preset shape, a preset dynamic parameter, a preset transparency and a preset position. Then, during the running of the vehicle 1, the electronic device 2 determines how the acceleration and angular rate corresponding to the vehicle 1 change, and adjusts the target parameters of the rendering window accordingly.
In some embodiments, if the vehicle 1 is traveling at a constant speed, i.e., the electronic device 2 is moving at a constant speed, the electronic device 2 may adjust the transparency in the target parameters so that the image in the rendering window is completely transparent and invisible to the user; hide the moving image; or cancel the display of the moving image. When the vehicle 1 runs at a constant speed, the acceleration obtained by the electronic device 2 is 0, and the user is unlikely to get motion sickness. It will be appreciated that in practical applications, when the acceleration obtained by the electronic device 2 stays at a small value tending to 0 over a period of time (e.g. 3 minutes), the vehicle 1 in which the electronic device 2 is located may be considered to be running at a constant speed. The small value may be set according to the actual situation and is not specifically limited.
In some embodiments, when the electronic device 2 accelerates with the vehicle 1, the electronic device 2 may adjust the size of the moving image in the rendering window from large to small, simulating the effect of receding from the user. When the electronic device 2 decelerates with the vehicle 1, the electronic device 2 may adjust the size of the moving image in the rendering window from small to large, simulating the effect of approaching the user.
According to some embodiments of the application, the size of the moving image in the rendering window varies with the acceleration acquired by the electronic device 2. For example, let the acceleration detected at the current sampling point be ai and the preset acceleration be as; the size scale factor may then be adjusted as X = 1 - ai/as. That is, the size M1 of the moving image after adjustment = the size M0 before adjustment × the size scale factor X.
When the electronic device 2 accelerates with the vehicle 1, ai is positive, and the larger the magnitude of the acceleration ai acquired by the electronic device 2, the smaller the size of the moving image adjusted according to the size formula. As an example, during acceleration the magnitude of ai decreases from a large value to 0, so the size scale factor X grows from a small value to 1, and the size of the moving image is first reduced and then gradually enlarged back to the normal size (i.e., the preset size).
When the electronic device 2 decelerates with the vehicle 1, ai is negative, and the larger the magnitude of the acceleration ai acquired by the electronic device 2, the larger the size of the moving image adjusted according to the size formula. As an example, during deceleration the magnitude of ai likewise decreases from a large value to 0, so the size scale factor X falls from a large value to 1, and the size of the moving image is first enlarged and then gradually reduced back to the normal size (i.e., the preset size).
The value of as can be set according to actual requirements and is not limited.
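A short sketch of the scaling formula above; the sample values of ai and as are chosen only for the demonstration.

```python
def size_scale_factor(a_i: float, a_s: float) -> float:
    """X = 1 - ai/as from the text: X < 1 while accelerating (ai > 0),
    X > 1 while decelerating (ai < 0)."""
    return 1.0 - a_i / a_s

def adjusted_size(m0: float, a_i: float, a_s: float) -> float:
    """M1 = M0 * X: the moving image shrinks under acceleration
    (simulating receding) and grows under deceleration (simulating
    approaching)."""
    return m0 * size_scale_factor(a_i, a_s)

print(adjusted_size(100.0, a_i=2.0, a_s=10.0))   # 80.0, accelerating
print(adjusted_size(100.0, a_i=-2.0, a_s=10.0))  # 120.0, decelerating
```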
In addition, in other embodiments, for a dynamically changing multi-frame moving image, when the electronic device 2 accelerates or decelerates with the vehicle 1 and the acceleration detected by its accelerometer changes accordingly, the electronic device 2 may mix the current frame and the previous frame of the moving image according to a preset weight, so as to simulate the visual afterimage formed by acceleration and deceleration. The value of the preset weight can be set according to actual requirements and is not specifically limited.
As an example, based on the scenario shown in fig. 1A, fig. 4A is a schematic diagram of the display content of the electronic device in a moving scenario in which the vehicle accelerates. Before the vehicle 1 accelerates, the electronic device 2 displays the two moving images 11 in rendering windows superimposed on the page 26a displayed on the screen, as shown on the left side of fig. 4A. As the vehicle 1 accelerates, the electronic device 2 reduces the size of the two moving images 11 in the rendering windows on the page 26b, as shown on the right side of fig. 4A, simulating the effect of receding from the user so that the perception of speed by the human eye and by the vestibule of the inner ear tends to coincide.
As another example, based on the scenario shown in fig. 1A, fig. 4B is a schematic diagram of the display content of the electronic device in a moving scenario in which the vehicle decelerates. Before the vehicle 1 decelerates, the electronic device 2 displays the two moving images 11 in rendering windows superimposed on the page 27a displayed on the screen, as shown on the left side of fig. 4B. As the vehicle 1 decelerates, the electronic device 2 enlarges the two moving images 11 in the rendering windows on the page 27b, as shown on the right side of fig. 4B, simulating the effect of approaching the user so that the perception of speed by the human eye and by the vestibule of the inner ear tends to coincide.
In some embodiments, when the electronic device 2 turns with the vehicle 1, the electronic device 2 may adjust the shape of the moving image in the rendering window to simulate the effect of turning: the rendering window is transformed into an arc shape, and the front and rear frames of the moving image are mixed to simulate rotation in a certain direction. As an example, the moving image may be rotated about the center point of the screen of the electronic device 2. It should be noted that part of the rotated moving image may fall outside the display range of the screen.
In some embodiments, the degree of change in the shape of the rendering window is also related to the acceleration and angular rate acquired by the electronic device 2; for example, the direction in which the rectangular rendering window is bent into an arc is related to the direction of the acquired acceleration, and the angle through which the rendering window is bent is related to the acquired angular rate.
Furthermore, in some embodiments, whether the electronic device 2 is placed horizontally or vertically affects the axis about which the moving image in the rendering window rotates. For example, when the electronic device 2 is in the horizontal screen state and placed horizontally, as shown on the left side of fig. 5A, the moving image rotates about the Z axis; when the electronic device 2 is in the vertical screen state and placed vertically, as shown on the right side of fig. 5A, the moving image rotates about the X axis. The coordinate axes may be set based on the geometric center of the electronic device 2.
As an example, based on the scenario shown in fig. 1A, fig. 5B is a schematic diagram of the display content of the electronic device in a moving scenario in which the vehicle turns left. Before the vehicle 1 turns left, the electronic device 2 displays the two moving images 11 as rectangles in rendering windows superimposed on the page 28a displayed on the screen, as shown on the left side of fig. 5B, with the screen placed vertically in the vertical screen state. As the vehicle 1 turns left, the electronic device 2 rotates the two moving images 11 about the X axis on the page 28b, as shown on the right side of fig. 5B, so that their shape changes from rectangular to an arc curved to the right, simulating the effect of the vehicle 1 turning left.
As an example, based on the scenario shown in fig. 1A, fig. 5C is a schematic diagram of the display content of the electronic device in a moving scenario in which the vehicle turns right. Before the vehicle 1 turns right, the electronic device 2 displays the two moving images 11 as rectangles in rendering windows superimposed on the page 29a displayed on the screen, as shown on the left side of fig. 5C, with the screen placed vertically in the vertical screen state. As the vehicle 1 turns right, the electronic device 2 rotates the two moving images 11 about the X axis on the page 29b, as shown on the right side of fig. 5C, so that their shape changes from rectangular to an arc curved to the left, simulating the effect of the vehicle 1 turning right.
According to some embodiments of the application, the shape of the rendering window varies with the angular rate acquired by the electronic device 2. For example, let the angular rate detected at the current sampling point be bi and the preset angular rate be bs; the arc curvature may then be adjusted as X = 1 - bi/bs. The value of bs may be set according to the practical application and is not limited.
As an example, for a dynamically changing multi-frame moving image, when the electronic device 2 turns with the vehicle 1 and the angular rate acquired by the electronic device 2 changes, the electronic device 2 may mix the current frame of the moving image with the previous frame according to another preset weight to obtain a mixed current frame, display it, and thereby simulate the visual afterimage formed by the turn. The value of the preset weight can be set according to the practical application and is not specifically limited. For example, the preset weight may give the current frame a weight of 90% and the previous frame a weight of 10%, so that a faint copy of the previous frame's content is superimposed to one side of the current frame.
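A sketch of the frame-mixing step, using the 90%/10% weights from the example above; the use of NumPy arrays for frames is an implementation assumption.

```python
import numpy as np

def blend_frames(current: np.ndarray, previous: np.ndarray,
                 w_current: float = 0.9) -> np.ndarray:
    """Mix the current frame of the moving image with the previous one
    to simulate the afterimage effect: 90% current frame, 10% previous
    frame. Frames are HxWx3 uint8 images of the same shape."""
    mixed = (w_current * current.astype(np.float32)
             + (1.0 - w_current) * previous.astype(np.float32))
    return mixed.clip(0, 255).astype(np.uint8)
```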
In other embodiments, when the electronic device 2 accelerates and turns with the vehicle 1 at the same time, the electronic device 2 may adjust the size and shape of the rendering window simultaneously; the specific implementation may refer to the above examples of adjusting the rendering window for turning and for acceleration and deceleration, and is not specifically limited.
In some embodiments, when the electronic device 2 accelerates or decelerates with the vehicle 1, the electronic device 2 also adjusts the transparency of the rendering window, so that the transparency of the moving image becomes lower (i.e., more opaque) as the magnitude of the acceleration of the vehicle 1 increases and higher (i.e., more transparent) as it decreases.
In some embodiments, when the electronic device 2 accelerates or decelerates with the vehicle 1, the electronic device 2 also adjusts the dynamic parameters of the rendering window, so that the moving image moves faster (e.g., its frame rate increases) as the magnitude of the acceleration of the vehicle 1 increases and slower (e.g., its frame rate decreases) as the magnitude of the acceleration of the vehicle 1 decreases.
It will be appreciated that the above ways in which the target parameters of the rendering window change with the acceleration and angular rate acquired by the electronic device 2 are merely examples; other realizable ways may also be adopted, and this is not specifically limited.
Next, the process of adaptively generating the color parameters among the target parameters in S207b is described with reference to the method shown in fig. 6. The execution subject is still the electronic device 2, and the method may include the following steps S207b1 to S207b5:
S207b1: acquire the page picture currently displayed on the screen.
As an example, the electronic device 2 may perform a screen-capture operation on the page displayed on the current screen to obtain a picture of the currently displayed page.
S207b2: extract the background color from the page picture.
The background color is the color occupying the largest area in the page picture, and is generally an achromatic color such as black, white or gray.
It is understood that hue is the primary characteristic of color; colors other than black, white and gray have the hue attribute, while black, white and gray do not. For example, the basic hues are red, orange, yellow, green, blue and violet.
For example, the electronic device 2 may extract the background color by mathematical statistics, a K-Means clustering algorithm (K-Means clustering algorithm, K-Means), a support vector machine (Support Vector Machine, SVM), or the like, but is not limited thereto.
In some embodiments, the electronic device 2 may cluster the RGB values of the pixels of the page picture and take the category that occurs most often as the background color. As an example, the value of the cluster center point may be taken as the value of the background color, which is an RGB value. It is understood that RGB values refer to color values in the RGB color mode, including the values of the three color channels red (R), green (G) and blue (B).
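A sketch of this clustering step, assuming scikit-learn's KMeans as the clustering implementation and k = 5 clusters; both are illustrative choices, not the patent's method.

```python
import numpy as np
from sklearn.cluster import KMeans

def extract_background_color(page_rgb: np.ndarray, k: int = 5):
    """page_rgb: HxWx3 array of the captured page picture. Cluster the
    pixels' RGB values and return the center of the most populous
    cluster as the background color."""
    pixels = page_rgb.reshape(-1, 3).astype(np.float32)
    km = KMeans(n_clusters=k, n_init=10).fit(pixels)
    counts = np.bincount(km.labels_, minlength=k)
    # The largest cluster's center is taken as the background color.
    return km.cluster_centers_[counts.argmax()].round().astype(int)
```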
S207b3: extract the theme color from the page picture.
The proportion of the theme color is smaller than that of the background color; that is, the theme color is the color occupying the largest area in the page picture apart from the background color.
If the background color has a hue, the background color may also serve as the theme color, i.e., the electronic device 2 determines the background color as the theme color.
If the background color has no hue, the electronic device 2 may obtain the RGB (red, green, blue) value of the theme color from the current page picture by the median cut method, a color modeling method, a clustering method, or the like.
S207b4: extract the auxiliary color from the page picture.
In some embodiments, among the pixels of the page picture, the colors whose occurrence counts are smaller than that of the theme color are taken as auxiliary colors in descending order of occurrence. It is understood that the auxiliary color is the color occupying the largest area in the page picture apart from the background color and the theme color. Taking a web page as an example, the visual importance and volume of the auxiliary color are inferior to those of the theme color and the background color; it often accompanies the theme color to reinforce and highlight it, and in a web page it typically appears in smaller elements such as buttons and icons.
S207b5: adjust at least one of the background color, theme color and auxiliary color of the page picture to generate the color parameters of the rendering window.
In some embodiments, when the moving image is a preset image, a user-defined image, or an image taken from the page picture of the current page, the background color in the moving image may be replaced with the background color of the current page, the theme color of the moving image with the theme color of the current page, and the auxiliary color of the moving image with the auxiliary color of the current page.
In addition, in other embodiments, the moving image is a solid-color image generated from any one of the background color, theme color and auxiliary color among the color parameters of the page picture of the current page.
In some embodiments, when the background color of the current page picture differs from the theme color, the electronic device 2 converts the background color of the page picture from an RGB value in RGB format to an HSV value in HSV (Hue (H), Saturation (S), Value (V)) format, adjusts the brightness (value) to generate the background color of the rendering window, and replaces the background color of the moving image with it. For example, an HSV value obtained by increasing the brightness of the page picture's background color by a preset amount is used as the HSV value of the background color or theme color of the rendering window, and the HSV value of the background color of the moving image in the rendering window is adjusted to that value. In addition, the electronic device 2 can adjust the saturation of the theme color of the page picture so that the theme color of the rendering window is in the same color family as the theme color of the page picture, making the interface colors more uniform. As an example, two colors are in the same color family when the difference between their saturations is less than or equal to a preset difference; the value of the preset difference may be set as required and is not specifically limited.
In addition, in other embodiments, when the background color of the current page picture is the same as the theme color, the electronic device 2 may convert the theme color of the page picture from an RGB value to an HSV value and adjust its brightness to generate the background color and theme color of the rendering window; that is, the theme color and the background color of the rendering window are the same.
In addition, in other embodiments, the electronic device 2 may further adjust the brightness or saturation of the auxiliary color of the current page picture to generate the auxiliary color of the rendering window, so that the auxiliary color of the page and the auxiliary color of the rendering window are in the same color family and the colors of the rendering window and the current page are more uniform.
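A minimal sketch of the RGB-to-HSV adjustment in S207b5, using Python's standard colorsys module; the brightness increment is an illustrative assumption.

```python
import colorsys

def brighten(rgb, delta=0.1):
    """Convert an RGB color to HSV, raise the value (brightness) by a
    preset amount, and convert back; e.g. derive the rendering window's
    background color from the page's background color."""
    r, g, b = (c / 255.0 for c in rgb)
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    v = min(1.0, v + delta)  # increase brightness, capped at 1.0
    return tuple(round(c * 255) for c in colorsys.hsv_to_rgb(h, s, v))

print(brighten((30, 30, 30)))  # a slightly lighter gray
```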
It is understood that when the page displayed on the screen is updated, the electronic device 2 can re-execute the above steps S207b1 to S207b5 to update the color parameters of the moving image, thereby displaying the moving image adaptively to the page.
Next, the method of automatically turning on the anti-motion sickness function in S201 of the embodiment shown in fig. 2 is described in detail. Referring to fig. 7, a flowchart of a method for turning on the anti-motion sickness function is shown; the execution subject is the electronic device 2, and the method includes the following steps S201a to S201f:
S201a: the status of the electronic device 2 is queried.
For example, the state of the electronic device 2 may include the screen state, the posture, and the like.
In some embodiments, a flag bit identifying the screen state is stored in the electronic device 2. For example, a flag bit value of 1 indicates that the electronic device 2 is in the bright screen state, and a value of 0 indicates that it is in the off screen state. In addition, the posture of the electronic device 2 can be represented by its rotation angle.
S201b: determine whether the screen state is the bright screen state. If yes, proceed to S201c; if not, proceed to S201f and do not turn on the anti-motion sickness function.
Typically, the electronic device 2 being in the bright screen state indicates that the user is using it, for example browsing content displayed on the screen.
In some embodiments, when the electronic device 2 automatically or manually establishes a wireless communication connection with the vehicle 1 after the user gets in, the electronic device 2 may start executing S201b to determine whether the anti-motion sickness function needs to be turned on.
S201c: determine whether the posture of the electronic device 2 is a preset posture. If yes, proceed to S201d; if not, proceed to S201f and do not turn on the anti-motion sickness function.
It can be understood that when a user sits in the seat and then holds the electronic device 2 to browse, the posture of the electronic device 2 is generally relatively fixed for a period of time (i.e., the preset posture), with the pitch angle and roll angle of the electronic device 2 within a certain range (denoted the preset angle range). In S201c, determining that the rotation angle is within the preset angle range therefore indicates that the posture of the electronic device 2 has been relatively fixed for a period of time, and the user can generally be considered to be browsing the screen.
In some embodiments, the electronic device 2 may detect its rotation angle, which may include a pitch angle and/or a roll angle, through a gyroscope, an accelerometer or the like. When the rotation angle includes both a pitch angle and a roll angle, the preset angle range includes an angle range for the pitch angle and an angle range for the roll angle. The embodiment of the application does not limit the specific values of the preset angle range, which can be chosen according to the practical application.
S201d: determine, based on the position and speed of the electronic device 2, whether it is in the riding state (i.e., whether the position and speed satisfy the third condition). If yes, proceed to S201e; if not, proceed to S201f and do not turn on the anti-motion sickness function.
In some embodiments, the electronic device 2 may detect its position and speed through a positioning chip using GPS technology. In general, a large position change and a large speed change of the electronic device 2 within a short time indicate that the user is riding, i.e., that the vehicle 1 is running. For example, if the electronic device 2 determines that, over a continuous period of time (e.g. 3 minutes), the change in its position is greater than or equal to a preset distance and the change in its speed is greater than or equal to a preset speed threshold (i.e., the third condition), the user is in the riding state; otherwise, the user is not. As an example, the electronic device 2 may establish a wireless communication connection with the vehicle 1, and when the electronic device 2 detects that it is at the same position as the vehicle 1, it may determine that it is in the riding state from its position change and speed change.
In other embodiments, the electronic device 2 may automatically or manually establish a connection with the vehicle 1 when the user gets in. The electronic device 2 may then acquire the speed and position of the vehicle 1, obtained from a GPS chip in the vehicle 1. Thus, the electronic device 2 determines whether the vehicle 1 is in a running state based on the acquired speed and position, and further determines whether the user holding the electronic device 2 is in the riding state. For example, the method of determining whether the vehicle 1 is running based on its position and speed may refer to the above determination based on the position and speed detected by the electronic device 2, but is not limited thereto.
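A sketch of the third-condition check described above; the thresholds, the window length and the sample layout are illustrative assumptions.

```python
def in_riding_state(track, min_dist_m=500.0, min_speed_delta=5.0,
                    window_s=180.0):
    """track: chronological (t_seconds, pos_change_from_start_m,
    speed_mps) samples. Returns True when, over the last window
    (e.g. 3 minutes), the position change and the speed change both
    reach their preset thresholds (the third condition)."""
    recent = [p for p in track if track[-1][0] - p[0] <= window_s]
    if len(recent) < 2:
        return False
    dist_change = recent[-1][1] - recent[0][1]
    speeds = [p[2] for p in recent]
    speed_change = max(speeds) - min(speeds)
    return dist_change >= min_dist_m and speed_change >= min_speed_delta
```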
S201e: automatically turn on the anti-motion sickness function.
S201f: keep the anti-motion sickness function turned off.
It will be appreciated that after the electronic device 2 turns on the anti-motion sickness function, it may begin to perform the above mobile display method to alleviate or eliminate the user's motion sickness symptoms.
In some embodiments, when the anti-motion sickness function is turned on, the electronic device 2 may prompt the user that it is now on by displaying a pop-up message or by voice. For example, as shown in fig. 8, the electronic device 2 displays turn-on reminder information 81 in a pop-up window on the screen, such as "The anti-motion sickness function has been turned on for you to ensure your riding experience". The pop-up message 81 may be displayed for a preset period of time (e.g., 2 seconds) and then automatically dismissed.
In other embodiments, the user is also supported in turning on the anti-motion sickness function manually, before the electronic device 2 would turn it on automatically. As an example, the electronic device 2 may provide a setting option for the anti-motion sickness function in a settings application, or an application icon for it on the desktop, through which the user can manually turn the function on. For example, the user may manually turn on the anti-motion sickness function through the setting option before or while getting into the car.
In addition, in other embodiments, the electronic device 2 may also guide the user to turn on the anti-motion sickness function manually. For example, when the electronic device 2 establishes a connection with the vehicle 1 after the user gets in, the electronic device 2 may ask the user, by pop-up message or voice, whether to turn on the anti-motion sickness function so as to alleviate or eliminate motion sickness symptoms. As shown in fig. 9, an embodiment of the application provides a prompt interface for turning on the anti-motion sickness function displayed on the electronic device 2. For example, the interface includes turn-on prompt information 91, such as "It looks like you are in a car. Turn on the anti-motion sickness function?", together with an on button 92 and a cancel button 93. After the user clicks the on button 92 to confirm, the electronic device 2 turns on the anti-motion sickness function. Conversely, after the user clicks the cancel button 93, the electronic device 2 does not turn it on.
Example Two
Based on the application scenario shown in fig. 1A, a flow diagram of a mobile display method is described below with reference to fig. 10. The embodiment shown in fig. 10 mainly differs from the embodiment shown in fig. 2 in that S205 in fig. 2 is replaced with S1005a and S1005b. S1005a and S1005b are similar to S206 and S207 shown in fig. 2, respectively, the main difference being that the moving image displayed on the screen by the electronic device 2 is replaced with a three-dimensional animation, such as a motion doll. In this case, when the vehicle 1 bumps or heaves strongly, the electronic device 2 can make the human eye perceive the speed change by displaying a dynamic motion doll. Specifically, the execution subject of the method shown in fig. 10 is still the electronic device 2, and the method includes the following steps S1001 to S1007.
In general, the stronger the bumping or the larger the heaving of the vehicle 1, the less suitable it is for the user to browse the screen. Therefore, when the vehicle 1 bumps or heaves only slightly, the moving image can be displayed along the two side edges of the screen of the electronic device 2, affecting the page content normally displayed by the electronic device as little as possible. The motion doll provided by the application, however, is not limited to display at the edge of the screen; it may also be displayed at the center of the screen or elsewhere, which interferes more with the normal display of page content on the electronic device 2, for example by blocking more of the normally displayed content. Accordingly, when the vehicle 1 bumps or heaves strongly and it is unsuitable for the user to browse the normally displayed content, the motion doll can be superimposed on the page displayed on the screen, which also adds interest to the anti-motion sickness process.
S1001: turn on the anti-motion sickness function.
S1002: acquire the acceleration and angular rate to determine the motion state of the electronic device 2 moving with the vehicle 1.
S1003: determine whether the acquired acceleration and/or angular rate exceeds the corresponding first preset threshold. If yes, proceed to S1004; if not, return to S1002.
S1004: determine whether the acquired acceleration and/or angular rate satisfies the second condition. If yes, proceed to S1005a; if not, proceed to S1006.
The second condition indicates that the electronic device 2 is shaking strongly, i.e., the vehicle 1 in which the electronic device 2 is located is bumping or heaving strongly.
The specific descriptions of S1001 to S1004 are the same as those of S201 to S204 in the embodiment shown in fig. 2 and are not repeated here. Accordingly, when the determination in S1004 is yes, the process proceeds to S1005a, and when it is no, the process proceeds to S1006.
S1005a: determine the target parameters corresponding to the motion doll.
Similarly, for a description of the target parameters corresponding to the motion doll, refer to the related description of the target parameters corresponding to the moving image in S206 above.
It is to be understood that the motion doll in S1005a may be replaced by any three-dimensional animation, such as a football or a puppy.
In some embodiments, the motion doll may be an animation composed of multiple frames of images. For example, referring to fig. 1C, the motion doll may be a skiing doll, and its animation may be a skiing motion.
It will be appreciated that the electronic device 2 may superimpose a rendering window on the page displayed on the screen to display the motion doll.
In some embodiments, the target parameters corresponding to the motion doll may refer to the target parameters corresponding to the moving image in S206 above. That is, the target parameters corresponding to the motion doll may include color parameters, shape, number, position, size, transparency and dynamic parameters.
As an example, while the electronic device 2 moves with the running vehicle 1, the color parameters, shape, number, transparency and dynamic parameters among the target parameters corresponding to the motion doll may be kept unchanged, all being preset parameters of the motion doll object, so as to guarantee its display effect; but this is not limiting. In other examples, these target parameters of the motion doll may also vary with the acceleration and/or angular rate acquired by the electronic device 2.
As an example, while the electronic device 2 moves with the running vehicle 1, the position and size among the target parameters corresponding to the motion doll may vary with the acceleration and/or angular rate of the vehicle 1.
Further, in some embodiments, the target parameters corresponding to the motion doll include not only the above examples but also movement parameters, which may include an acceleration, an angular rate and the like. As an example, movement of the motion doll refers to moving the whole doll with reference to its center point, but is not limited thereto. The movement of the motion doll may include translation, rotation and the like.
For example, when the vehicle 1 has started and the acceleration first exceeds the corresponding first preset threshold, the movement parameter of the rendering window is the preset movement parameter, in which the acceleration and angular rate are both 0 (values that would otherwise indicate the vehicle 1 has not started or its acceleration is small). As the vehicle 1 starts and stops, accelerates and decelerates, turns and so on, the movement parameters of the rendering window change.
S1005b: superimpose a rendering window on the page displayed on the screen according to the target parameters corresponding to the motion doll to display the motion doll, and adjust the target parameters according to the acquired acceleration and angular rate.
Similarly, for the adjustment of the target parameters corresponding to the motion doll, refer to the description of the adjustment process of the target parameters corresponding to the moving image in S207 above; the common points are not repeated here. The difference is that, in some embodiments, the electronic device 2 may also adjust the movement parameters among the target parameters corresponding to the motion doll.
For example, the acceleration in the movement parameters is obtained by multiplying the acquired acceleration a1 by a preset coefficient c1, with the resulting acceleration keeping the same direction as a1. The angular rate in the movement parameters is obtained by multiplying the angular rate b1 corresponding to the vehicle 1 by a preset coefficient c2. The values of the preset coefficients c1 and c2 may be set according to actual requirements and are not specifically limited; see the sketch below.
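A minimal sketch of this mapping; the coefficient values and the sample inputs are assumptions chosen only for the demonstration.

```python
# Illustrative preset coefficients c1 and c2 (the text leaves their
# values to actual requirements):
C1, C2 = 0.5, 0.5

def doll_movement_params(a1: float, b1: float) -> dict:
    """Map the acquired vehicle acceleration a1 and angular rate b1 to
    the motion doll's movement parameters; the doll's acceleration
    keeps the sign (direction) of a1."""
    return {"acceleration": C1 * a1, "angular_rate": C2 * b1}

print(doll_movement_params(a1=1.6, b1=0.3))
```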
It will be appreciated that, typically, when the electronic device 2 displays the motion doll, the rider will look at it. Moreover, since the screen of the electronic device 2 is generally small, the rider can perceive the speed change of the motion doll without staring at it, as long as the screen is in view.
Thus, as the electronic device 2 moves with the starting and stopping, acceleration and deceleration, turning and so on of the vehicle 1, the display of the motion doll changes dynamically, reflecting motion effects such as acceleration, deceleration, turning and bumping. The human eye can then perceive the speed change on the screen in synchrony with the change of the vehicle 1 sensed by the vestibule of the inner ear, which can reduce or eliminate the motion sickness symptoms of a user browsing the screen in a moving scenario.
In some embodiments, the motion doll is draggable, and the user can drag it to move freely on the screen, which increases interest and helps alleviate the user's motion sickness symptoms.
In some embodiments, the motion doll may be associated with an augmented reality (Augmented Reality, AR) application. For example, the user may long-press the motion doll displayed on the screen to trigger the electronic device 2 to switch the currently displayed page to the AR interface corresponding to the motion doll. As an example, the screen of the electronic device 2 then displays the live view inside the current car together with the motion doll, and the motion doll is displayed dynamically according to the acceleration and angular rate of the vehicle 1 to alleviate or eliminate the user's motion sickness symptoms.
The specific descriptions of S1006 and S1007 are the same as those of S206 and S207, respectively, in the embodiment shown in fig. 2 and are not repeated here.
The above execution order of S1001 to S1007 is merely illustrative; in other embodiments, another execution order may be adopted, some steps may be split or combined, and some steps may be omitted, which is not limited here. For example, in other embodiments, S1005a and S1005b may be removed, or S1006 and S1007 may be removed, i.e., the motion sickness symptoms of a user browsing the screen in a moving scenario may be alleviated in a single way.
Example Three
Based on the application scenario shown in fig. 1A, a flow chart of a mobile display method is described below with reference to fig. 11. The embodiment shown in fig. 11 mainly differs from the embodiment shown in fig. 2 in that S205 in fig. 2 is replaced with S1105a and S1105b, and in addition the two steps S206 and S207 are replaced with S1106. S1105a and S1105b are similar to S206 and S207 shown in fig. 2, respectively, the main difference being that the moving image displayed on the screen by the electronic device 2 is replaced with a vehicle three-dimensional animation. In this case, when the vehicle 1 bumps or heaves strongly, the electronic device 2 can make the human eye perceive the speed change by displaying the vehicle three-dimensional animation. Specifically, the execution subject of the method shown in fig. 11 is still the electronic device 2, and the method includes the following steps:
S1101: turn on the anti-motion sickness function.
S1102: acquire the acceleration and angular rate to determine the motion state of the electronic device 2 moving with the vehicle 1.
S1103: determine whether the acceleration and/or angular rate corresponding to the vehicle 1 exceeds the corresponding first preset threshold. If yes, proceed to S1104; if not, return to S1102.
The specific descriptions of S1101 to S1103 are the same as S201 to S203 in the embodiment shown in fig. 2, respectively, and will not be repeated here.
S1104: it is determined whether the acquired acceleration and/or angular rate satisfies a second condition.
The second condition indicates that the electronic device 2 is shaking strongly, i.e., the vehicle 1 in which the electronic device 2 is located is bumping or heaving strongly.
Accordingly, if the determination is yes, the vehicle 1 is bumping or heaving strongly, and the process proceeds to S1105a; if not, the vehicle 1 is not bumping or is heaving only slightly, and the process proceeds to S1106 to display the normal page content.
In addition, for the method of determining whether the vehicle 1 is bumping or heaving, refer to the related description of S204 shown in fig. 2 above, which is not repeated here.
It will be appreciated that the electronic device 2 displays the page content normally before executing S1105a and S1105b, without affecting the user's normal browsing of the page content in the moving scenario.
Referring to fig. 12, a flow in which the electronic device 2 displays the vehicle three-dimensional animation is shown. As shown in fig. 12, the electronic device 2 may obtain the acceleration corresponding to the vehicle 1 through the accelerometer 121, the angular rate corresponding to the vehicle 1 through the gyroscope 122, the direction information corresponding to the vehicle 1 through the magnetometer 123, and the position information and road network information corresponding to the vehicle 1 through the positioning chip 124. A road network refers to the traffic roads, junctions and networks formed by roads of various kinds, such as main roads, auxiliary roads, branches and forks, and the road network constrains the movement track of a moving object such as the vehicle 1.
Each of the above devices may be a device in the vehicle 1 or a device in the electronic device 2. Further, the electronic device 2 can obtain the navigation route 125 corresponding to the vehicle 1 based on this information, and then judge whether the vehicle 1 is accelerating or decelerating, or bumping or heaving (126). If it is judged that the vehicle 1 is accelerating or decelerating, bumping, or heaving greatly, a vehicle three-dimensional animation is superimposed on the page content displayed on the screen according to the navigation route 125 corresponding to the vehicle 1 (128); conversely, if the vehicle 1 is not accelerating or decelerating, not bumping, and heaving little, the normal page content is displayed on the screen (127).
It can be understood that the electronic device 2 obtains the position information and thereby the position, speed and other information of the vehicle 1; when acceleration and deceleration, bumping and the like of the vehicle 1 make it unfavorable for the user to browse the electronic device 2, the electronic device 2 can render the vehicle three-dimensional animation on the screen, simulating the movement of the vehicle and relieving the user's motion sickness symptoms.
In some embodiments, while the vehicle 1 is at a constant speed or stationary, the electronic device 2 may acquire the pitch and roll angles between the coordinate system of the electronic device 2 and the navigation coordinate system. The heading information corresponding to the vehicle 1 is acquired through the magnetometer 123 or the road network information, and the relative attitude, speed and position of the vehicle 1 at any moment, compared with the previous sampling moment, are obtained by navigation solution or deep learning. The initial position and initial speed of the vehicle 1 may be provided by the positioning chip 124; alternatively, when the vehicle 1 is stationary, its position and speed at that moment may be taken as the initial position and the initial (zero) speed, respectively. Data from inertial devices such as the accelerometer 121 and the gyroscope 122 are not affected by electromagnetic interference, urban canyons or tunnels, while the information provided by the positioning chip 124 can constrain the speed, acceleration, position, angular rate and other quantities of the vehicle 1 computed from the inertial devices, making the subsequent vehicle three-dimensional animation simulate the motion of the vehicle 1 more accurately.
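The following is a much-simplified, one-dimensional sketch of the constraint idea described above: inertial propagation between sampling moments, with the positioning chip's fixes pulling the integrated state back to bound drift. The blending factor alpha and the 1-D state are illustrative simplifications, not the patent's actual navigation solution.

```python
def dead_reckon_step(pos: float, vel: float, accel: float, dt: float,
                     gps_pos: float = None, alpha: float = 0.1):
    """Propagate velocity and position from accelerometer data for one
    sampling interval; when a positioning-chip fix is available, blend
    the integrated position toward it to constrain accumulated error."""
    vel = vel + accel * dt          # inertial propagation
    pos = pos + vel * dt
    if gps_pos is not None:         # constrain with the GPS fix
        pos = (1 - alpha) * pos + alpha * gps_pos
    return pos, vel
```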
S1105a: obtain the target parameters corresponding to the vehicle three-dimensional animation.
As an example, the vehicle three-dimensional animation is a dynamic effect of a vehicle traveling on a road, and may include, but is not limited to, a three-dimensional vehicle and the road background it travels on. In the dynamically displayed vehicle three-dimensional animation, the three-dimensional vehicle and the road background can move simultaneously, creating the visual effect of the three-dimensional vehicle driving forward while the road background moves backward, thereby simulating the actual running of the vehicle 1. As an example, the road background in the vehicle three-dimensional animation may be an image of the current navigation route of the vehicle 1.
It will be appreciated that in some embodiments, the vehicle three-dimensional animation in S1105a may be replaced with another three-dimensional animation, such as an airplane or a boat. In addition, in other embodiments, the vehicle three-dimensional animation may be replaced by a real-time driving picture of the vehicle 1, which is not specifically limited.
In some embodiments, the three-dimensional animation of the vehicle may be an animation composed of multiple frames of images. For example, referring to FIG. 1F, a three-dimensional animation of a vehicle includes a three-dimensional automobile and a road background.
It will be appreciated that the electronic device 2 may superimpose a rendering window on the page displayed on the screen to display the vehicle three-dimensional animation.
In some embodiments, the target parameter corresponding to the three-dimensional animation of the vehicle may refer to the target parameter corresponding to the moving image in S206 described above. That is, the target parameters corresponding to the three-dimensional animation of the vehicle may also include color parameters, shape, number, position, size, transparency, dynamic parameters, and the like.
As an example, during the running of the vehicle 1, the color parameters, shape, number, transparency and dynamic parameters among the target parameters corresponding to the vehicle three-dimensional animation may be kept unchanged, all being preset parameters, so as to guarantee the display effect of the vehicle three-dimensional animation. In other examples, these target parameters may also vary with the acceleration and/or angular rate corresponding to the vehicle 1.
As an example, the position and the size in the target parameters corresponding to the three-dimensional animation of the vehicle may vary with the acceleration and/or the angular rate of the vehicle 1 during the running of the vehicle 1.
Further, in some embodiments, the target parameters corresponding to the vehicle three-dimensional animation include not only the above examples but also movement parameters, which may include an acceleration, an angular rate and the like. As an example, movement of the vehicle three-dimensional animation refers to moving the whole animation with reference to its center point, but is not limited thereto. The movement may include translation, rotation and the like.
For example, when the vehicle 1 has started and the acceleration is greater than the corresponding first preset threshold but the second condition is not satisfied, the movement parameter of the rendering window is the preset movement parameter, in which the acceleration and angular rate are both 0. As the vehicle 1 starts and stops, accelerates and decelerates, turns and so on, the movement parameters of the rendering window change.
Thus, as the display of the vehicle three-dimensional animation changes dynamically with the starting and stopping, acceleration and deceleration, turning and so on of the vehicle 1, the three-dimensional vehicle in the animation reflects effects such as acceleration, deceleration, turning and pitching. The human eye can then perceive the speed change on the screen in synchrony with the change of the vehicle 1 sensed by the vestibule of the inner ear, which can reduce or eliminate the motion sickness symptoms of a user browsing the screen in a moving scenario.
S1105b: superimpose a rendering window on the page displayed on the screen according to the target parameters corresponding to the vehicle three-dimensional animation to display the vehicle three-dimensional animation, and adjust the dynamic display parameters according to the navigation route corresponding to the vehicle 1, the navigation route being determined from the acceleration, angular rate, position information and road network information corresponding to the vehicle 1.
Similarly, for the adjustment of the target parameters corresponding to the vehicle three-dimensional animation, refer to the description of the adjustment process of the target parameters corresponding to the moving image in S207 above; the common points are not repeated here. The difference is that, in some embodiments, the electronic device 2 may also adjust the movement parameters corresponding to the three-dimensional vehicle in the vehicle three-dimensional animation.
For example, the acceleration in the movement parameters is obtained by multiplying the acceleration a1 corresponding to the vehicle 1 by a preset coefficient c3, with the resulting acceleration keeping the same direction as a1. The angular rate in the movement parameters is obtained by multiplying the angular rate b1 corresponding to the vehicle 1 by a preset coefficient c4. The values of the preset coefficients c3 and c4 may be set according to actual requirements and are not specifically limited.
In some embodiments, the road background in the vehicle three-dimensional animation is updated in real time according to the navigation route; that is, the road background may be the actual street view along the navigation route, or a view processed from the actual street view.
In addition, in some embodiments, the electronic device 2 may also adjust the transparency of the vehicle three-dimensional animation such that the vehicle three-dimensional animation is displayed according to the determined transparency. Alternatively, the electronic device 2 may display the three-dimensional animation of the vehicle using AR technology.
Referring to fig. 1F, the right side of fig. 1F is a schematic diagram of the electronic device 2 displaying the vehicle motion animation superimposed on the content of the page displayed on the screen.
S1106: display the normal page content on the screen.
With continued reference to fig. 1F, the page content displayed by the electronic device 2 on the left side of fig. 1F may be normal page content.
Thus, the mobile display method provided by the application is not limited to superimposing a moving image at the edge of the screen; it also simulates vehicle movement visually in three dimensions, which helps to make the anti-motion sickness process more interesting. In addition, the method is not limited to the acceleration and angular rate of the vehicle 1 but uses data from multiple sensors such as the positioning chip and the magnetometer, improving the accuracy with which the vehicle three-dimensional animation simulates the actual vehicle movement.
The above execution order of S1101 to S1106 is merely illustrative; in other embodiments, another execution order may be adopted, some steps may be split or combined, and some steps may be omitted, which is not limited here.
Next, the method in S205 of the embodiment shown in fig. 2 for addressing the dizziness caused by violent shaking of the gaze focus in a moving scenario is described in detail. The method can track the change of the human eye's focus using gaze tracking technology, and use the inertial devices of the electronic device 2 to calculate the pose change of the electronic device 2 relative to the previous moment, so as to address the dizziness caused by violent shaking of the gaze focus when the user browses the screen of the electronic device 2 in a moving scenario.
It will be appreciated that gaze tracking technology, which may also be called eye tracking technology, includes pupil-cornea tracking, methods of capturing eye images, methods of computing the gaze point from retinal images, 3D modeling methods, and the like.
In a first possible implementation, the application may employ pupil-cornea tracking technology to track the gaze focus.
The principle of pupil-cornea tracking is to represent the gaze direction by the vector between the pupil center and a group of corneal glints. As an example, the technique can measure the distance to the human eye with an infrared ranging sensor and emit infrared light with an infrared emitter; an infrared camera then captures a raw infrared image of the eye region, and gaze tracking is performed by means of the corneal reflection spots formed as the infrared light reflects off the cornea.
In some embodiments, the infrared emitter can provide 2 or more infrared light sources shining on the human eye, improving the robustness of the infrared-based gaze-focus tracking method and compensating for errors caused by changes in the position of the user's head or of the wearable device.
In some embodiments, in the scenario illustrated in fig. 1E, the infrared ranging sensor, infrared emitter and infrared camera may all be provided in the wearable device 3 (e.g., smart glasses), but this is not limiting. In other embodiments, the infrared emitter may be provided in the wearable device 3, while the infrared ranging sensor and the infrared camera are provided in the electronic device 2, the infrared camera being, for example, the front camera of the electronic device 2.
In a second possible implementation, the application may employ a 3D modeling method to track the gaze focus. As an example, the 3D modeling method may specifically be structured-light eye tracking. A structured light emitter illuminates a three-dimensional object such as the eye with a structured light pattern, e.g., a rectangular dot lattice or grating stripes, and a structured light camera captures an image of the eye illuminated with that pattern. Based on the known structured light pattern and the observed distortion, depth information can be computed algorithmically and the directional information of the eye (e.g., yaw, pitch and translation vectors) determined; the gaze direction of the eye is then derived from the depth information and the directional information.
In some embodiments, in connection with the scenario illustrated in fig. 1A, both the structured light emitter and the structured light camera may be provided in the electronic device 2.
In other embodiments, in the scenario illustrated in fig. 1E, the structured light emitter may be provided in the wearable device 3, while the structured light camera may be provided in the electronic device 2 (for example, as its front camera) or in the wearable device 3. In that case, the electronic device 2 interacts with the wearable device 3 to realize gaze tracking.
In the second possible implementation, gaze tracking can be realized by the following three methods 1 to 3.
Method 1:
Referring to fig. 13, a schematic diagram of the relative pose transformation between the human eye and the electronic device 2 is shown, in which the poses of both the human eye and the electronic device 2 change. The pose relationship between the human eye and the electronic device 2 indicated by the dashed lines in fig. 13 is a first pose relationship; the pose relationship indicated by the solid lines, after both poses have changed, is a second pose relationship. The structured light camera 21 of the electronic device 2 can capture structured light images of the human eye. In addition, P is the gaze focus at which the line of sight of the human eye falls on the screen 22 of the electronic device 2.
Specifically, in the scenario shown in fig. 1E in which the user wears the wearable device 3 while browsing the screen of the electronic device 2, let the coordinate system of the structured light camera 21 (such as a depth camera) in the electronic device 2 be F1, the coordinate system of the wearable device 3 (such as smart glasses) be F2, and the eye coordinate system be F3. Let the pose matrix between the structured light camera 21 and the wearable device 3 be T_12, and the pose matrix between the wearable device 3 and the eyes 4 be T_23; the pose matrix between the structured light camera and the eyes is then T_13 = T_12 × T_23. The line of sight l_F3, given under F3, is expressed in the F1 coordinate system as l_F1 = T_13 · l_F3. Let the gaze focus where the line of sight meets the screen of the electronic device 2 be P(x, y, z); then l_F1(P) = 0 and the screen plane satisfies S_F1(P) = 0, from which the gaze focus P can be solved. Further, the electronic device 2 can keep the content at the focus substantially unchanged by moving the page content displayed on the screen. It will be appreciated that l_F1 is the formula of the line of sight in the structured light camera coordinate system F1, and S_F1 is the formula of the screen plane in the structured light camera coordinate system F1.
In addition, in some embodiments, in the scenario illustrated in fig. 1A, the 3D modeling method may directly model eye movement in three dimensions based on a high-performance depth camera in the electronic device 2 to realize gaze tracking, or may realize gaze tracking by processing face images captured by an ordinary camera in the electronic device 2.
Method 2:
if the electronic device 2 has a visual line tracking function, such as a depth camera with high performance. The electronic device 2 does not need to interact with the wearable device 3, and three-dimensional modeling is directly performed on eye movements, so that the relative pose relationship of the electronic device 2 and the eye sight is obtained, and the sight focus of the sight on the screen of the electronic device 2 is obtained.
Method 3:
the electronic device 2 may photograph the user's face, process the captured face image with a pre-trained neural network model, and estimate the direction of the line of sight. For example, the neural network model may be obtained in advance by training on facial key points, modeling, deep learning, and the like; based on this model, the pose relationship of the face and the line of sight relative to the electronic device 2 can be obtained from a photo taken by the camera, and hence the change of the focus of the line of sight on the screen of the electronic device 2.
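The fragment below sketches only the inference step under stated assumptions: GazeModel is a hypothetical pre-trained network whose interface is invented for illustration, and the key-point extraction and training pipeline mentioned above are outside the sketch.

```python
import numpy as np

class GazeModel:
    """Hypothetical pre-trained gaze network; predict() is assumed to
    return a gaze ray (origin, direction) in the camera frame."""
    def predict(self, face_image: np.ndarray):
        raise NotImplementedError  # stands in for a real trained model

def estimate_gaze(face_image: np.ndarray, model: GazeModel):
    """Estimate the line-of-sight direction from a captured face image."""
    origin, direction = model.predict(face_image)
    return origin, direction / np.linalg.norm(direction)
```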
As an example, while the electronic device 2 is capturing a face image, the inertial unit in the electronic device 2 can sense small shakes and feed them back in time to the microprocessor in the inertial unit to calculate a compensation amount; the blurring of the face image caused by shake is then resolved by optical compensation (via the image sensor, optical components, etc.), improving eye tracking performance.
In the following, in connection with fig. 14, the flow in S205 of the embodiment shown in fig. 2 for adjusting the content displayed on the screen according to the line-of-sight focus will be described. The execution subject of the flow is still the electronic device 2, and the method shown in fig. 14 includes S205a to S205c; that is, S205 may be implemented by S205a to S205c:
S205a: a focus of a line of sight of a human eye on a screen of the electronic device 2 is determined.
In some embodiments, the electronic device 2 may acquire the line-of-sight focus using method 2 or method 3 above, which is not repeated here.
In other embodiments, the electronic device 2 may interactively determine the focus of the line of sight with the wearable device 3, the specific procedure being implemented according to S205a1 to S205a4 shown in fig. 15, i.e. S205a shown in fig. 14 may be replaced with S205a1 to S205a4 shown in fig. 15.
S205a1: an image of the wearable device 3 is acquired.
In some embodiments, the electronic device 2 may capture an image of the wearable device 3 through a front-facing camera, which may be, but is not limited to, an infrared camera or a structured light camera. For example, the electronic device 2 may acquire an image of the wearable device 3 by means of a structured light camera according to method 1 above.
S205a2: the pose relationship of the electronic device 2 and the wearable device 3 is analyzed.
For example, the electronic device 2 may analyze the pose relationship between the electronic device 2 and the wearable device 3 according to method 1 above to obtain the pose matrix T_12 between the structured light camera and the wearable device 3.
S205a3: the pose relationship of the wearable device 3 to the eyes is determined and the gaze direction is determined.
For example, the electronic device 2 may determine the pose relationship between the wearable device 3 and the eyes according to method 1 above, such as obtaining from the wearable device 3 the pose matrix T_23 between the wearable device 3 and the eyes. Thus, the electronic device 2 can obtain the pose matrix between the camera and the eyes as T_13 = T_12·T_23, and the equation of the line of sight l_F3 under the eye coordinate system F3, expressed in the structured light camera coordinate system F1, is l_F1 = T_13·l_F3; i.e., the direction of the line of sight is determined.
S205a4: a focus of the line of sight direction on the screen of the electronic device 2 is determined.
For example, let the focus at which the line of sight meets the screen of the electronic device 2 be P(x, y, z); then the line-of-sight focus P can be obtained by solving l_F1(P) = 0 together with the screen plane equation S_F1(P) = 0.
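The computation of the focus P can be sketched as a ray-plane intersection, assuming the screen plane S_F1 is represented by a point on the plane and its normal vector, all expressed in the camera frame F1:

```python
import numpy as np

def line_of_sight_focus(origin, direction, plane_point, plane_normal):
    """Intersect the gaze ray P(s) = origin + s * direction with the
    screen plane dot(n, P - plane_point) = 0 to obtain the focus P."""
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-9:
        return None  # gaze is (nearly) parallel to the screen plane
    s = float(np.dot(plane_normal, plane_point - origin)) / denom
    if s < 0:
        return None  # intersection lies behind the eye
    return origin + s * direction
```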
In some embodiments, the electronic device 2 may periodically determine the gaze focus, such as once every 10 seconds.
S205b: it is determined whether or not the line of sight focus has changed, and if so, S205f is entered, and if not, S205a is returned.
It will be appreciated that in the event of a jerk or heave of the vehicle 1, the eye's focus of view on the screen of the electronic device 2 may change rapidly.
As an example, the electronic device 2 may analyze the change in the coordinate value of the line-of-sight focus P determined two or more times in succession, and consider that the line-of-sight focus has changed if the change is large; otherwise, the focus is considered unchanged. Here a "large" change means that the displacement between successive coordinate values is greater than a preset displacement threshold; the value of the displacement threshold can be set according to actual requirements and is not specifically limited.
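A minimal sketch of this check is shown below; the displacement threshold is a free parameter, as noted above.

```python
import numpy as np

def focus_changed(prev_focus, curr_focus, threshold: float) -> bool:
    """Return True if two successive line-of-sight focus estimates differ
    by more than the preset displacement threshold."""
    displacement = np.linalg.norm(np.asarray(curr_focus) - np.asarray(prev_focus))
    return displacement > threshold
```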
S205c: the content of the page displayed on the screen is controlled to move so that the content in the focus of the line of sight is unchanged, namely the content of the gaze of the line of sight is unchanged.
As an example, when a bump of the vehicle 1 is detected, it may be assumed that within a short time only the pose of the electronic device 2 changes while the pose of the human body remains essentially unchanged. The change in pose and distance of the electronic device 2 relative to the previous moment can then be calculated from the inertial device and, with the direction of the human line of sight unchanged, the page can be adjusted so that the content at the line-of-sight focus remains substantially unchanged, thereby alleviating dizziness.
Referring to fig. 16, a schematic diagram of the pose change of the electronic device is shown. If the electronic device 2 detects that its screen has rotated by an angle α about the geometric center of the electronic device 2, the picture of the page displayed on the screen may be rotated in the reverse direction by the same angle to compensate for the screen rotation. If the screen of the electronic device 2 translates by a distance d, then, for a given viewing angle, content appears larger the closer it is to the human eye; the size of the page can therefore be reduced as the screen moves toward the face, so that more content is displayed within the smaller viewing angle, and enlarged as the screen moves away. For example, the size may be scaled by the factor (1 - d/k), where k is a preset value: the larger the translation d toward the face, the smaller the adjusted page; for translation in the opposite direction d is negative, and the page is adjusted to be larger. It will be appreciated that when the pose of the electronic device 2 simultaneously rotates by an angle α and translates by a distance d, the electronic device 2 may simultaneously adjust the rotation angle and the size of the page displayed on the screen. In this way, the page displayed on the screen is adjusted according to the attitude and displacement of the electronic device 2, the frequency of relative pose changes between the eyes and the screen is slowed, and dizziness is alleviated.
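The compensation described here can be sketched as follows, assuming the rotation angle α and the translation d toward the face (negative when moving away) have already been obtained from the inertial device; k is the preset constant from the scaling formula above.

```python
def compensate_page(page_size: float, alpha_rad: float, d: float, k: float):
    """Return (page rotation, new page size) compensating a screen that
    rotated by alpha_rad and translated by d toward the face.

    The page rotates by -alpha_rad (opposite to the screen), and its size
    is scaled by (1 - d / k): smaller when the screen approaches the face
    (d > 0), larger when it moves away (d < 0).
    """
    page_rotation = -alpha_rad
    new_size = page_size * (1.0 - d / k)
    return page_rotation, new_size
```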
In addition, in some other embodiments of the moving scenario, both the pose of the eyes and the pose of the electronic device 2 change when the vehicle 1 bumps, and the electronic device 2 needs to determine the change of its pose relative to the eyes in order to adjust the page content displayed on the screen.
Further, referring to fig. 17 in conjunction with fig. 16, a schematic diagram of a page change caused by a pose change of the electronic device is shown. The left side of fig. 17 shows the page content displayed by the electronic device 2 before the vehicle 1 starts to jolt; when the vehicle 1 starts to jolt, the electronic device 2 rotates by an angle α toward the face and translates by a distance d toward the face. The electronic device 2 then adjusts the content of the page displayed on the screen: it rotates the page picture by the angle α away from the face (i.e., reverse-rotates by α), and from the size M1 of the page 30a shown on the left of fig. 17 obtains, by the adjustment formula, the smaller size M2 = M1·(1 - d/k) of the page 30b shown on the right of fig. 17. Meanwhile, the electronic device 2 can keep the content at the line-of-sight focus unchanged while adjusting the page content. For example, the content at the line-of-sight focus P1 in the page 30a before adjustment, shown on the left of fig. 17, is the text box 12, and the content at the line-of-sight focus P1 in the page 30b after adjustment, shown on the right of fig. 17, is still the text box 12, thereby alleviating the dizziness caused by drastic movement of the line-of-sight focus as the relative pose of the human eyes and the screen changes.
Next, a structure of the vehicle will be described with reference to fig. 18, and fig. 18 shows a schematic structural diagram of the vehicle according to an embodiment of the present application.
Fig. 18 is a schematic view of a possible functional framework of the vehicle 1 according to an embodiment of the present application. As shown in fig. 18, the functional framework of the vehicle 1 may include various subsystems, such as the illustrated sensor system 10, control system 20, one or more peripheral devices 30 (one is shown), power supply 40, and computer system 50. Optionally, the vehicle 1 may also include other functional systems, such as an engine system for powering the vehicle 1; the application is not limited herein.
the sensor system 10 may include a number of sensing devices that sense the measured information and convert the sensed information to an electrical signal or other desired form of information output according to a certain law. As shown, these detection means may include, without limitation, a global positioning system 11 (global positioning system, GPS), a vehicle speed sensor 12, an inertial measurement unit 13 (inertial measurement unit, IMU), and so forth.
The global positioning system (GPS) 11 performs real-time positioning and navigation on a global scale using GPS positioning satellites. In the present application, the GPS 11 can be used to realize real-time positioning of the vehicle 1 and provide its geographic position information. The vehicle speed sensor 12 detects the running speed of the vehicle 1. The inertial measurement unit 13 may include a combination of an accelerometer and a gyroscope and is a device that measures the angular rate and acceleration of the vehicle 1. For example, during running of the vehicle 1, the inertial measurement unit may measure changes in the position and attitude of the vehicle body, such as the acceleration and angular velocity of the vehicle 1, based on the inertial acceleration of the vehicle 1.
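As a hedged illustration of how such IMU measurements might feed the first condition used elsewhere in this application (see claim 12), the sketch below thresholds the magnitude of an acceleration sample; removing the gravity component beforehand is an assumption of the sketch.

```python
import numpy as np

def meets_first_condition(accel_sample, threshold: float) -> bool:
    """Return True if the magnitude of a 3-axis acceleration sample
    (gravity assumed already removed) reaches the preset threshold."""
    return float(np.linalg.norm(np.asarray(accel_sample))) >= threshold
```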
The control system 20 may include a steering unit 21, a braking unit 22, and the like.
The steering unit 21 may represent a system for adjusting the direction of travel of the vehicle 1, and may include, but is not limited to, a steering wheel or any other structural device for adjusting or controlling the direction of travel of the vehicle 1. The braking unit 22 may represent a system for slowing the travel speed of the vehicle 1, which may also be referred to as the brake system of the vehicle 1; it may include, but is not limited to, a brake controller, a retarder, or any other structural device for decelerating the vehicle 1. In practice, the braking unit 22 may use friction to slow the tires of the vehicle 1 and thus reduce its running speed.
Peripheral device 30 may include several elements such as a communication system 31, a touch screen 32, a user interface 33, and the like, as shown. Wherein the communication system 31 is used to enable network communication between the vehicle 1 and other devices than the vehicle 1, such as the electronic device 2. In practical applications, the communication system 31 may implement network communication between the vehicle 1 and other devices using wireless communication technology or wired communication technology. The wired communication technology may refer to communication between the vehicle 1 and other devices by means of a network cable or an optical fiber, or the like. The wireless communication technologies include, but are not limited to, global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), wireless local area networks (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) networks), bluetooth (Bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technologies (near field communication, NFC), and infrared technologies (IR), among others.
The touch screen 32 may be used to detect operating instructions on the touch screen 32. For example, the user performs touch operations on the content data displayed on the touch screen 32 according to actual requirements, so as to trigger the functions corresponding to those operations, such as playing multimedia files like music and video. The user interface 33 may be a touch panel for detecting operating instructions on the touch panel; it may also be a physical key or a mouse. The user interface 33 may further be a display screen for outputting data or displaying images. Optionally, the user interface 33 may also be at least one device belonging to the category of peripheral devices, such as a touch screen, a microphone, or a speaker.
Several functions of the vehicle 1 are controlled by the computer system 50. The computer system 50 may include one or more processors 51 (one is illustrated) and a memory 52 (which may also be referred to as storage). In practical applications, the memory 52 may be internal to the computer system 50, or may be external to the computer system 50, for example serving as a cache in the vehicle 1; the application is not limited thereto.
The processor 51 may include one or more general-purpose processors, such as a graphics processing unit (GPU). The processor 51 may be configured to execute the relevant programs, or the instructions corresponding to those programs, stored in the memory 52 to implement the corresponding functions of the vehicle 1.
The memory 52 may include volatile memory, such as random access memory (RAM); it may also include non-volatile memory (non-volatile memory), such as read-only memory (ROM), flash memory (flash memory), a hard disk drive (HDD), or a solid state drive (SSD); the memory 52 may also include a combination of the above types of memory. The memory 52 may be used to store a set of program codes, or instructions corresponding to the program codes, so that the processor 51 can invoke the program codes or instructions stored in the memory 52 to implement the corresponding functions of the vehicle 1, including but not limited to some or all of the functions in the functional framework diagram of the vehicle 1 shown in fig. 18. In the present application, the memory 52 may store a set of program codes for controlling the vehicle 1, and the processor 51 may call the program codes to control the safe driving of the vehicle 1; how the safe driving of the vehicle 1 is achieved will be described in detail below.
Alternatively, the memory 52 may store information such as road maps, driving routes, sensor data, and the like, in addition to program codes or instructions. The computer system 50 may implement the relevant functions of the vehicle 1 in combination with other elements in the functional framework schematic of the vehicle 1, such as sensors in a sensor system, GPS, etc. For example, the computer system 50 may control the traveling direction or traveling speed of the vehicle 1, etc., based on the data input of the sensor system 10, and the present application is not limited thereto.
It should be noted that fig. 18 is only a schematic view of one possible functional framework of the vehicle 1. In practice, the vehicle 1 may include more or fewer systems or elements, and the application is not limited thereto.
Fig. 19 shows one possible schematic structural diagram of the electronic device 2.
In fig. 19, similar parts have the same reference numerals. As shown in fig. 19, the electronic device 2 may include a processor 110, a power module 140, a memory 180, a camera 170, a mobile communication module 130, a wireless communication module 120, a sensor module 190, an audio module 150, an interface module 160, and a display screen 102, among others.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 2. In other embodiments of the application, the electronic device 2 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units; for example, its processing modules or processing circuits may include a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a digital signal processor (Digital Signal Processor, DSP), a microcontroller (Microcontroller Unit, MCU), an artificial intelligence (Artificial Intelligence, AI) processor, a programmable logic device such as a field programmable gate array (Field Programmable Gate Array, FPGA), or the like. The different processing units may be separate devices or may be integrated in one or more processors. For example, in some examples of the present application, the processor 110 may be configured to adjust, according to the corresponding speed, angular rate, etc. of the vehicle 1, the target parameters of a moving image, a moving figure, or a three-dimensional animation of the vehicle, and to render that moving image, moving figure, or three-dimensional animation on the screen according to the target parameters. Alternatively, in some examples of the application, the processor 110 may adjust the page content based on changes of the line-of-sight focus on the screen.
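As an illustration of the kind of adjustment the processor 110 might perform, the sketch below scales a target object's size with acceleration; the application states only that target parameters are adjusted according to motion parameters (e.g., the size parameter according to acceleration, see claim 13), so the linear mapping and the gain and cap values are assumptions of the sketch.

```python
def adjust_target_size(base_size: float, accel_magnitude: float,
                       gain: float = 0.1, max_scale: float = 2.0) -> float:
    """Scale the target object's size with the acceleration magnitude,
    capped at max_scale; gain and max_scale are illustrative defaults."""
    scale = min(1.0 + gain * accel_magnitude, max_scale)
    return base_size * scale
```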
The camera 170 may be used to capture images of the wearable device 3. For example, camera 170 may be an infrared camera or a structured light camera.
The memory 180 may be used to store data, software programs, and modules, and may be a volatile memory (Volatile Memory), such as a random access memory (Random-Access Memory, RAM); or a non-volatile memory (Non-Volatile Memory), such as a read-only memory (Read-Only Memory, ROM), a flash memory (Flash Memory), a hard disk (Hard Disk Drive, HDD), or a solid state drive (Solid State Drive, SSD); or a combination of the above types of memories; or a removable storage medium, such as a secure digital (Secure Digital, SD) memory card. In particular, the memory 180 may include a program storage area (not shown) and a data storage area (not shown). The program storage area may store program code which, when executed by the processor 110, causes the processor 110 to perform the mobile display method provided by the embodiments of the present application.
The mobile communication module 130 may include, but is not limited to, an antenna, a power amplifier, a filter, a low noise amplifier (Low Noise Amplify, LNA), and the like. The mobile communication module 130 may provide a solution for wireless communication including 2G/3G/4G/5G, etc. applied on the electronic device 2. The mobile communication module 130 may receive electromagnetic waves from an antenna, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to a modem processor for demodulation. The mobile communication module 130 may amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 130 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 130 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 120 may include an antenna, and transmits and receives electromagnetic waves via the antenna. The wireless communication module 120 may provide solutions for wireless communication applied on the electronic device 2, including wireless local area networks (Wireless Local Area Networks, WLAN) (e.g., a wireless fidelity (Wireless Fidelity, Wi-Fi) network), Bluetooth (BT), the global navigation satellite system (Global Navigation Satellite System, GNSS), frequency modulation (Frequency Modulation, FM), near field communication (Near Field Communication, NFC), infrared technology (IR), etc. The electronic device 2 may communicate with networks and other devices via wireless communication technologies, for example receiving from the vehicle 1 parameters such as the acceleration and angular rate detected by the vehicle 1, and receiving from the wearable device 3 parameters such as the gaze direction detected by the wearable device 3.
In some embodiments, the mobile communication module 130 and the wireless communication module 120 of the electronic device 2 may also be located in the same module.
It is to be understood that the hardware configuration shown in fig. 19 above does not constitute a specific limitation on the electronic apparatus 2. In other embodiments of the application, the electronic device 2 may include more or fewer components than shown in fig. 19, or certain components may be combined, certain components may be split, or a different arrangement of components may be provided.
Embodiments of the disclosed mechanisms may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as a computer program or program code that is executed on a programmable system comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For the purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. Program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in the present application are not limited in scope by any particular programming language. In either case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed over a network or through other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including but not limited to floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible machine-readable memory used to transmit information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or ordering may not be required. Rather, in some embodiments, these features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of structural or methodological features in a particular figure is not meant to imply that such features are required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the present application, each unit/module mentioned in each device is a logic unit/module. Physically, one logic unit/module may be one physical unit/module, may be part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules; the physical implementation of the logic unit/module itself is not what matters most, as the combination of functions implemented by these logic units/modules is the key to solving the technical problem addressed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above device embodiments do not introduce units/modules that are less closely related to solving the technical problem addressed by the present application; this does not indicate that the above device embodiments do not contain other units/modules.
It should be noted that, in the examples and descriptions of this patent, relational terms such as first and second are used solely to distinguish one entity or action from another, and do not necessarily require or imply any actual relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
While the application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the application.

Claims (18)

1. A mobile display method applied to an electronic device, comprising:
displaying a first page on a screen of the electronic device;
acquiring motion parameters, wherein the motion parameters are acceleration and/or angular rate;
if the motion parameters meet a first condition, displaying a target object superimposed on the first page according to target parameters and based on the color parameters of the first page;
and adjusting the target parameters according to the motion parameters, and controlling the dynamic display of the target object.
2. The method of claim 1, wherein the target parameters include at least one of: color parameters, shape parameters, size parameters, position parameters, quantity, transparency, dynamic parameters indicating dynamic effects, movement parameters indicating movement speed and direction.
3. The method according to claim 2, wherein before the target object is displayed superimposed on the first page according to the target parameters and based on the color parameters of the first page, the method further comprises:
and determining the position parameters of the target object according to the landscape or portrait state of the screen of the electronic device.
4. A method according to claim 3, wherein the target object is a preset image or an image generated from content in the page.
5. The method of claim 4, wherein before the displaying the target object superimposed on the first page according to the target parameters and based on the color parameters of the first page, the method further comprises:
performing screen capturing operation on the first page to obtain a first image;
and acquiring the color parameter of the first image as the color parameter of the first page.
6. The method of claim 5, wherein after the acquiring the color parameters of the first image as the color parameters of the first page, the method further comprises:
acquiring color parameters of the target object under the condition that the target object is the preset image;
the displaying the target object superimposed on the first page according to the target parameters and based on the color parameters of the first page comprises the following steps:
adjusting the color parameters of the target object to the color parameters of the first page;
and displaying the target object on the first page in a superposition manner according to the target parameters.
7. The method of claim 5, wherein after the performing the screen capturing operation on the first page to obtain the first image, the method further comprises:
cropping the image content in a target area of the first image to obtain a second image;
and taking the second image as the target object.
8. The method of claim 2, wherein the target object is a three-dimensional animation.
9. The method of claim 8, wherein the motion parameters further comprise: at least one of direction information, position information and road network information.
10. The method according to any one of claims 1 to 9, wherein the motion parameter is a motion parameter detected by the electronic device or a motion parameter acquired by the electronic device from a vehicle.
11. The method of claim 10, wherein after the adjusting the target parameters according to the motion parameters and controlling the dynamic display of the target object, the method further comprises:
and if the motion parameters do not meet the first condition, canceling display of the target object.
12. The method of claim 11, wherein the first condition is that the value of the acquired acceleration is greater than or equal to a first preset threshold.
13. The method according to claim 2, wherein the motion parameter includes an acceleration, and the size parameter of the target object is adjusted according to the acceleration.
14. The method of claim 12, wherein the dimensional parameter of the first page is determined based on a distance the electronic device is translated as a focus of a line of sight of a user on a screen of the electronic device changes, a direction of rotation of the first page is opposite to a direction of rotation of the screen of the electronic device, and a rotation angle of the first page is the same as the rotation angle of the screen of the electronic device.
15. The method of claim 2, wherein prior to the acquiring the motion parameter, further comprising:
establishing a communication connection with the vehicle;
determining that the electronic device is in a screen-on state;
determining that the attitude of the electronic device is a preset attitude;
acquiring position information and speed information, wherein the position information and the speed information are detected by the electronic device or acquired by the electronic device from the vehicle;
and determining that the position information and the speed information meet a third condition.
16. A computer readable storage medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform the mobile display method of any one of claims 1 to 15.
17. A computer program product, characterized in that the computer program product comprises instructions for implementing the mobile display method of any one of claims 1 to 15.
18. An electronic device, comprising:
a memory for storing instructions for execution by one or more processors of the electronic device, and
a processor, configured to perform the mobile display method of any one of claims 1 to 15 when the instructions are executed by the one or more processors.