WO2024002174A1 - Operation isolation method and related apparatus (操作隔离方法及相关装置) - Google Patents

Operation isolation method and related apparatus (操作隔离方法及相关装置)

Info

Publication number
WO2024002174A1
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
user
application
vehicle
central control
Application number
PCT/CN2023/103260
Other languages
English (en)
French (fr)
Inventor
朱蕾
李昌婷
刘立剑
张胜涛
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2024002174A1


Classifications

    • G: Physics
    • G06: Computing; Calculating or Counting
    • G06F: Electric Digital Data Processing
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on GUIs based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques based on GUIs using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means

Description

  • The present application relates to the field of vehicle technology, and in particular to operation isolation methods and related apparatuses.
  • The central control device is a multimedia intelligent terminal device in the vehicle that integrates life and entertainment, on-board navigation, reversing assistance, and other functions. Any occupant of the car can control the display screen of the central control device, that is, the central control screen, and activate the functions the device provides.
  • This application provides an operation isolation method and related apparatuses, which can intelligently distinguish and isolate operations initiated by different users according to the source direction of the operation each user initiates when controlling the electronic device, so that only users at designated positions can control the electronic device.
  • In a first aspect, embodiments of the present application provide an operation isolation method.
  • The method includes: an electronic device detects a first operation initiated by a first user, and the electronic device executes a response to the first operation; the electronic device detects a second operation initiated by a second user, and the electronic device refuses to execute a response to the second operation; wherein the operation source direction when the first user initiates the first operation on the electronic device differs from the operation source direction when the second user initiates the second operation on the electronic device, and the operation source direction indicates the position, relative to the electronic device, of the user who initiated the operation.
  • In this way, the electronic device can determine a user's position relative to the electronic device from the source direction of the user's operation and then decide whether to execute a response to the operation. When multiple users operate the electronic device, the device can intelligently isolate operations initiated by some users based on their positions, preventing other users from interfering with the control of the electronic device by users at designated positions.
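  • As a minimal illustration of this decision logic (not part of the patent text; all names are hypothetical), the following Python sketch accepts operations whose source direction matches a designated direction and refuses the rest:

```python
from dataclasses import dataclass
from enum import Enum

class SourceDirection(Enum):
    LEFT = "left"
    RIGHT = "right"

@dataclass
class Operation:
    source_direction: SourceDirection  # where the operation came from
    target_app: str                    # application the operation acts on

# Hypothetical: only operations from the designated side are executed,
# e.g. the driver's side in a left-hand-drive car.
DESIGNATED_DIRECTION = SourceDirection.LEFT

def handle_operation(op: Operation) -> bool:
    """Execute the operation only if its source direction matches the
    designated direction; otherwise refuse (isolate) it."""
    if op.source_direction == DESIGNATED_DIRECTION:
        print(f"executing response for {op.target_app}")
        return True
    print(f"refusing response for {op.target_app}")
    return False

# Two users act on the device: the first from the left, the second from the right.
handle_operation(Operation(SourceDirection.LEFT, "gear_switching"))   # executed
handle_operation(Operation(SourceDirection.RIGHT, "gear_switching"))  # refused
```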
  • In some embodiments, the method is applied to an electronic device in a vehicle. The electronic device includes a central control screen, which is located in the middle of the console in front of the driver's seat and the front passenger seat.
  • That the electronic device detects the first operation initiated by the first user specifically includes: the electronic device detects a first operation, initiated by the first user, that acts on the central control screen or on a physical button, where the physical button is located in the middle of the console.
  • The physical button can be placed below the central control screen or at the edge of the central control screen, and there may be one or more physical buttons. The placement of the physical buttons and the central control screen allows both the driver and the front passenger to control the central control device through the physical buttons or the central control screen.
  • This method can therefore be used by the driver and the front passenger to control the central control device in the car. It can distinguish whether the user who initiates an operation on the in-car central control device is the driver or the front passenger, ensuring that only the driver or the front passenger can control the device, and it is not affected by people changing seats: the user is identified only from the source direction of the operation initiated on the central control device, which makes the driver/passenger identification more flexible.
  • In some embodiments, the method further includes: the electronic device determines a first source direction, the first source direction being the direction in which the first user points toward the central control screen or the physical button when initiating the first operation; the first source direction indicates the first user's position relative to the central control screen, and the electronic device determines that the first user's position relative to the central control screen is the first position and not the second position.
  • In a second aspect, embodiments of the present application provide another operation isolation method.
  • The method is applied to an electronic device in a vehicle, and the electronic device includes a central control screen. The method includes: the electronic device detects a first operation, initiated by a first user, that acts on the central control screen or on a physical button, where both the central control screen and the physical button are located in the middle of the console in front of the driver's seat and the front passenger seat; the electronic device determines a first source direction, the first source direction being the direction in which the first user points toward the central control screen or the physical button when initiating the first operation and indicating the first user's position relative to the central control screen; and when the electronic device determines that the first user's position relative to the central control screen is the first position and not the second position, the electronic device executes a response to the first operation.
  • Implementing the method provided in the second aspect, the different operation source directions when the driver and the front passenger initiate user operations on the central control screen can be used to identify whether the user who initiates an operation on the in-car central control device is the driver or the front passenger, ensuring that the device responds only to user operations initiated by the driver or the front passenger. The method is not affected by seat changes: the user is identified only from the source direction of the operation initiated on the central control device, making the driver/passenger identification more flexible.
  • In some embodiments, the method further includes: the electronic device determines that the first user's position relative to the central control screen is not the first position, and the electronic device refuses to execute a response to the first operation.
  • That the electronic device determines the first source direction specifically includes: the electronic device determines the first source direction based on the pointing direction of the finger when the first user initiates the first operation and/or an image, collected by a camera, that includes the first user's limbs.
  • When the operation initiated by the user is a touch operation acting on the central control screen, the source direction of the operation can be determined from the pointing direction of the finger when the user initiates the touch operation; alternatively, the camera can be controlled to capture the user's body parts when the user initiates the operation, and the source direction determined from the user's limb orientation. The electronic device can also combine the finger pointing direction with the image collected by the camera to determine the source direction of the operation more accurately and achieve more reliable user identification.
  • The camera can be a camera of the electronic device, a camera of another device, or an independent camera; the electronic device can send a shooting instruction to the camera to trigger image capture.
  • The pointing direction of the finger is determined by the electronic device according to one or more of: the shape of the touch surface on the central control screen when the first user initiates the first operation, the fingerprint pattern under the touch surface, and the pressure change under the touch surface. The electronic device can collect such information when the user initiates the touch operation, through sensor devices such as fingerprint sensors, touch sensors, and pressure sensors, and thereby determine more accurately the direction of the user's finger when the operation is initiated.
  • In some embodiments, the method further includes: the electronic device determines that the vehicle state of the vehicle is a first state, and/or that the first application on which the first operation acts is a specified application or an application of a specified type.
  • That is, the electronic device can also determine whether to respond to the user's operation based on the state of the vehicle and/or the application operated by the user. For example, when the driver and the front passenger both intend to activate the gear-switching function, the central control device can, according to the source direction of the operations, respond only to the driver's operation to activate the gear-switching function, preventing other occupants from interfering with the driver's control of the vehicle and ensuring vehicle safety. For another example, while the vehicle is in motion, the central control device is prohibited from responding to the driver's operation to activate life and entertainment functions, preventing the driver from being distracted while driving and ensuring the driving safety of the vehicle.
  • The first state is a stationary state or a driving state; the specified type includes one or more of the following: driving assistance, vehicle control, and life and entertainment.
  • That the electronic device executes a response to the first operation specifically includes: the electronic device launches the first application or executes a first function of the first application. In this way, the electronic device can determine, based on the user's identity, whether the user may use an application installed on the device or the functions that application provides, so that the device only provides application services to users at designated positions.
  • That the electronic device refuses to execute a response to the second operation specifically includes: the electronic device refuses to launch the first application or refuses to execute the first function, or refuses to launch a second application or refuses to execute a second function of the second application.
  • The first operation initiated by the first user and the second operation initiated by the second user may be initiated simultaneously, in which case the electronic device may respond to only one user's operation, or they may be initiated at different times; the embodiments of the present application do not limit this.
  • When multiple users intend to use the same application on the electronic device, the device only allows users at specified positions to use that application; when multiple users intend to use different applications, the device can separately control the usage permissions of users at different positions for the different applications. For example, a user at a first position can use the first application, while a user at a second position cannot use the second application.
  • the electronic device can adjust the first position, the first state, the specified application or the specified type of application, thereby achieving more flexible operation isolation.
  • Embodiments of the present application provide an electronic device, including a memory, one or more processors, and one or more programs; when the one or more processors execute the one or more programs, the electronic device performs the method described in the first aspect or any implementation of the first aspect, or the second aspect or any implementation of the second aspect.
  • Embodiments of the present application provide a vehicle, characterized in that the vehicle includes a central control screen as in the first aspect or the second aspect, a memory, one or more processors, and one or more programs; when the one or more processors execute the one or more programs, the vehicle implements the method described in the first aspect or any implementation of the first aspect, or the second aspect or any implementation of the second aspect.
  • Embodiments of the present application provide a vehicle, characterized in that the vehicle includes a central control screen and physical buttons as in the first aspect or the second aspect, a memory, one or more processors, and one or more programs; when the one or more processors execute the one or more programs, the vehicle implements the method described in the first aspect or any implementation of the first aspect, or the second aspect or any implementation of the second aspect.
  • Embodiments of the present application provide a computer-readable storage medium including instructions. When the instructions are run on an electronic device, the electronic device executes the method described in the first aspect or any implementation of the first aspect, or the second aspect or any implementation of the second aspect.
  • The operation isolation method provided by the embodiments of the present application can identify the identity role of a user based on the source direction of the operation the user initiates on the electronic device, and then determine whether to respond to the operation. In this way, the user's identity role is associated with the user's position: when the position changes, the identity role changes accordingly. This provides a more flexible user identification method and enables the electronic device to respond only to operations initiated by users at specified positions.
  • Figure 1 is a schematic diagram of an in-car scene provided by an embodiment of the present application.
  • Figure 2 is a schematic flowchart of the operation isolation method provided by an embodiment of the present application.
  • Figure 3 is a schematic diagram of a user initiating a touch operation on an electronic device according to an embodiment of the present application.
  • Figure 4 is a diagram of the division of the angle between the finger direction and the horizontal plane provided by an embodiment of the present application.
  • Figure 5 is a schematic diagram of the shape of the touch surface provided by an embodiment of the present application.
  • Figure 6 is a schematic diagram of a fingerprint pattern provided by an embodiment of the present application.
  • Figure 7 is a schematic diagram of a user initiating a touch operation and pressing the display screen according to an embodiment of the present application.
  • Figure 8 is an image, collected by the electronic device, that includes the users' limbs, provided by an embodiment of the present application.
  • Figure 9 is a schematic diagram of the hardware structure of the electronic device provided by an embodiment of the present application.
  • Figure 10 is a schematic diagram of the software structure of the electronic device provided by an embodiment of the present application.
  • The terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly specifying the quantity of the indicated technical features. Therefore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of this application, unless otherwise specified, "plurality" means two or more.
  • The central control device is a powerful multimedia intelligent terminal device in the car. Multiple occupants, such as the driver and the front passenger, can directly control the central control screen of the device to activate corresponding functions. However, for some functions of the central control device, it may be necessary to restrict which occupants can use them. For example, the gear-switching function should be restricted to the driver, to prevent passengers from using it to interfere with the driver's control of the vehicle. For another example, while the vehicle is in motion, life and entertainment functions should be limited to passengers, to prevent the driver's use of those functions from causing distraction and possibly leading to driving accidents.
  • In practice, both the driver and the front passenger can directly control the central control device: they can initiate user operations on the central control screen and control the device through those operations. Because the driver and the front passenger are located on opposite sides of the central control screen, their operations on the central control device naturally come from different directions.
  • Figure 1 exemplarily shows a schematic diagram of an in-car scene.
  • both the driver and the co-driver passenger in the car can control the central control device.
  • the driver When the driver is located on the left side of the central control screen, the source direction of the user's operation is to the left, and the co-driver passenger is located on the left side of the central control screen.
  • the source direction of the user's operation When it is on the right, the source direction of the user's operation is on the right. Therefore, the identity role of the user can be identified based on the source direction of the user operation when the user operation is initiated.
  • Specifically, the electronic device can detect a user operation acting on a first application, determine the source direction of the user operation, determine the user's identity role from that source direction, and then determine, based on the identity role, whether the first application executes a response to the user operation, where the response may be launching the first application or executing a first function of the first application.
  • The electronic device may be a mobile phone, a tablet, a computer, etc., and may specifically be a central control device or a vehicle that includes a central control screen; the embodiments of the present application do not limit the specific type of the electronic device.
  • The user operation may be a touch operation acting on the display screen of the electronic device, or a pressing operation acting on a physical button of the electronic device; the embodiments of the present application do not limit the form of the user operation.
  • The electronic device can determine the source direction of the user operation based on the touch surface formed on the display screen when the user initiates a touch operation. For example, the user's finger direction can be determined from information under the touch surface such as the fingerprint pattern, pressure level, and touch shape, and the source direction of the user operation determined from the finger direction.
  • When the user operation is a pressing operation, the electronic device can determine the source direction from the user's limb orientation as captured by the camera. For a detailed description of how the electronic device determines the source direction of the user operation, see the following content; it is not elaborated here.
  • In some embodiments, the electronic device may further determine whether the first application executes a response to the user operation in combination with the application type of the first application and/or the current scene. In a vehicle, the scene may refer to the vehicle state, such as a stationary state or a driving state; specific descriptions of application types and vehicle states can be found in the following content.
  • The method provided by the embodiments of the present application can determine whether to execute a response to a user operation according to the source direction of the operation, so that when multiple users are controlling the electronic device, the device can intelligently isolate some user-initiated operations according to each user's position relative to the electronic device.
  • In a vehicle scenario, the method provided by the embodiments of the present application can control the central control device to respond only to user operations initiated by a designated user. For example, when the driver and the front passenger both intend to activate the gear-switching function, the central control device can, according to the source direction of the operations, respond only to the driver's operation to activate the gear-switching function, preventing other occupants from interfering with the driver's control of the vehicle and ensuring vehicle safety. For another example, while the vehicle is in motion, the central control device is prohibited from responding to the driver's operation to activate life and entertainment functions, preventing the driver from being distracted while driving.
  • FIG. 2 shows a schematic flowchart of the operation isolation method provided by the embodiment of the present application.
  • the method includes:
  • S101: The electronic device detects a user operation acting on a first application.
  • The user operation may be a touch operation on the touch screen of the electronic device, or a pressing operation on a physical button of the electronic device; the embodiments of the present application do not limit the form of the user operation.
  • When the user operation is a touch operation on the touch screen of the electronic device, the touch screen is fixed at a certain position, and users at different positions relative to the touch screen can initiate touch operations on it from different directions. Likewise, when the user operation is a pressing operation on a physical button of the electronic device, the button is fixed at a certain position, and users at different positions relative to the button can initiate pressing operations on it from different directions.
  • the touch screen may refer to the central control screen of the central control device.
  • the central control screen is located in the middle of the console in front of the driver's seat and the front passenger seat, that is, between the driver and the front passenger.
  • the physical button can also be located in the middle of the console in front of the driver's seat and passenger seat in the car.
  • An electronic device may contain both a touch screen and physical buttons, with the physical buttons located near the touch screen. The embodiments of the present application do not limit the number of physical buttons; the electronic device may include multiple physical buttons, and these buttons may surround the touch screen. The embodiments of the present application likewise do not limit the relative position of the physical buttons and the touch screen.
  • The user operation can be used to trigger the electronic device to launch the first application, display a user interface provided by the first application, or start a function of the first application.
  • For example, a desktop may be displayed on the touch screen of the electronic device, and the desktop may include application icons of multiple applications; the user operation may be a click of the user's finger on the application icon of a certain application, where the click operation triggers launching that application.
  • For another example, a physical button of the electronic device can be used to start a music application; the user operation may be a pressing operation on that button, in response to which the electronic device starts the music application and plays music.
  • S102: The electronic device determines the operation source direction based on the user operation.
  • The user operation may be an operation, initiated by the first user, that acts on the first application, or an operation, initiated by the second user, that acts on the first application. The first user and the second user refer to two users whose positions relative to the electronic device differ; for example, the first user is located on the left side of the electronic device and the second user on the right side.
  • When the first user initiates an operation on the electronic device, the user operation originates from the left side of the electronic device; when the second user initiates an operation, the user operation originates from the right side of the electronic device.
  • The operation source direction indicates the direction in which the user who initiated the user operation points toward the electronic device, that is, it indicates the position of the user relative to the electronic device. Furthermore, when the user operation acts specifically on the central control screen or a physical button, the operation source direction indicates the direction in which the user points toward the central control screen or the physical button, that is, the position of the user relative to the central control screen.
  • When different users initiate operations, the operation source directions are different: when the first user initiates the first operation, the operation source direction is a first source direction, and when the second user initiates the second operation, the operation source direction is a second source direction. The first source direction indicates the position of the first user relative to the electronic device, and the second source direction indicates the position of the second user relative to the electronic device. The first source direction differs from the second source direction; for example, the first source direction indicates that the user operation originates from the left side of the electronic device, and the second source direction indicates that the user operation originates from the right side.
  • the electronic device can determine the direction of the operation source in the following two ways:
  • Way 1: the user operation is a touch operation on the touch screen of the electronic device. In this case, the electronic device can determine the user's finger direction from information under the touch surface such as the touch shape, fingerprint pattern, and pressure level, and then determine the operation source direction. The finger direction can refer to the direction in which the fingertip points.
  • Figure 3 shows a schematic diagram of a user initiating a touch operation on an electronic device: touch surface A acts on the electronic device when the user is at the front left of the device, and touch surface B acts on the device when the user is at the front right. The finger direction indicated by touch surface A is toward the upper right, and the finger direction indicated by touch surface B is toward the upper left.
  • The electronic device can determine the operation source direction from the angle between the finger direction and the horizontal plane. For example, when the angle between the finger direction and the horizontal plane is within a first range, the electronic device determines the operation source direction to be a first direction; when the angle is within a second range, it determines the operation source direction to be a second direction. The first range differs from the second range, and the first direction differs from the second direction.
  • For example, the first range can be 0°-90° and the second range 90°-180°, with the first direction being the right side of the electronic device and the second direction being the left side (Figure 4 shows the division of the angle between the finger direction and the horizontal plane).
  • The first range and the second range may be fixed or variable. For example, they may be preset ranges, or the electronic device can continuously self-learn and adjust them according to the user's operations. Because different users differ in height, body shape, and operating habits, the angle between the finger direction and the horizontal plane varies when a user operation is initiated, and a preset range may not be accurate enough to determine the operation source direction. It can therefore be useful to dynamically adjust the first range and the second range according to factors such as the user's height, weight, position, and arm length, so that the electronic device determines the user's identity role more accurately. A minimal sketch of this mapping follows.
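  • The following sketch (illustrative only, assuming the example ranges above: 0°-90° for the first direction and 90°-180° for the second) maps the finger angle to a source direction; the self-learning adjustment is reduced to a toy update that places the range boundary between the observed angle clusters:

```python
def source_direction_from_angle(angle_deg: float,
                                first_range=(0.0, 90.0),
                                second_range=(90.0, 180.0)) -> str:
    """Map the angle between the finger direction and the horizontal plane
    to an operation source direction using two configurable ranges."""
    if first_range[0] <= angle_deg < first_range[1]:
        return "right"   # the first direction in the example above
    if second_range[0] <= angle_deg <= second_range[1]:
        return "left"    # the second direction
    return "unknown"

def adapted_boundary(confirmed_right: list[float],
                     confirmed_left: list[float]) -> float:
    """Toy self-learning step (an assumption, not specified in the text):
    place the boundary between the two ranges at the midpoint of the mean
    angles of confirmed right-side and left-side operations, adapting the
    split to a user's height, posture, and habits."""
    mean_r = sum(confirmed_right) / len(confirmed_right)
    mean_l = sum(confirmed_left) / len(confirmed_left)
    return (mean_r + mean_l) / 2.0

print(source_direction_from_angle(35.0))    # right
print(source_direction_from_angle(120.0))   # left
print(adapted_boundary([30.0, 45.0], [130.0, 140.0]))  # adapted split point
```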
  • the electronic device can determine the direction of the user's finger in the following three ways:
  • Figure 5 shows a schematic diagram of the shape of the touch surface. As can be seen from Figure 5, when a finger acts on the touch screen, the lateral width of the touch surface gradually widens from the fingertip toward the finger pulp; that is, the shape of the touch surface is usually an ellipse with a narrow top and a wide bottom, as shown in Figure 5.
  • Based on this narrow-top, wide-bottom characteristic of the touch surface, the electronic device can determine the direction from the wide end toward the narrow end as the finger direction. The determination of the finger direction is not, however, limited to this characteristic.
  • For example, the finger direction can also be determined from the curvature of the boundary curve of the touch surface: the greater the curvature, the more sharply the curve bends, and the smaller the curvature, the flatter the curve. The direction from the point of smallest curvature toward the point of greatest curvature can be determined as the finger direction. The embodiments of this application do not limit the specific implementation used when determining the finger direction from the shape of the touch surface; a sketch of the wide-to-narrow approach follows.
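  • The following sketch (illustrative only; the patent does not specify an algorithm) estimates the finger direction from the contact points of the touch surface by computing the principal axis of the roughly elliptical surface and orienting it from the wide end (finger pulp) toward the narrow end (fingertip):

```python
import numpy as np

def finger_direction(points: np.ndarray) -> np.ndarray:
    """Estimate the fingertip direction from touch-surface contact points
    (an N x 2 array). The touch surface is roughly an ellipse, wide at the
    finger pulp and narrow at the tip, so the principal axis is oriented
    from the wide end toward the narrow end."""
    centered = points - points.mean(axis=0)
    cov = np.cov(centered.T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    axis = eigvecs[:, np.argmax(eigvals)]     # long axis of the ellipse
    perp = np.array([-axis[1], axis[0]])      # short axis, perpendicular
    t = centered @ axis                       # position along the long axis
    w = np.abs(centered @ perp)               # lateral distance from the axis
    lower = w[t < 0].mean() if np.any(t < 0) else 0.0
    upper = w[t >= 0].mean() if np.any(t >= 0) else 0.0
    if upper > lower:                         # wide (pulp) half lies at t >= 0,
        axis = -axis                          # so the tip lies toward -axis
    return axis

# Synthetic touch surface: wide near one end (pulp), narrow near the other (tip).
rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, 500)
pts = np.stack([u, rng.uniform(-1.0, 1.0, 500) * 0.5 * (1.2 - u)], axis=1)
print(finger_direction(pts))                  # close to [1, 0]: tip points +x
```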
  • the direction of the finger can also be determined through the fingerprint pattern.
  • Figure 6 is a schematic diagram of a fingerprint pattern.
  • The ridge lines in the fingerprint pattern that are close to straight lines can be called the root baseline, and the center of the fingerprint ridges can be called the center point; the direction from the root-baseline side toward the center point can be determined as the finger direction. Using the root baseline and the center point of the fingerprint pattern to determine the finger direction is only an example and does not limit the embodiments of the present application; in other embodiments, other features of the fingerprint pattern can also be used to determine the finger direction, which is not described here.
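  • Once the root baseline and the center point have been located (the detection itself is not shown), the finger direction reduces to a simple vector from the baseline side toward the center point; a minimal sketch with hypothetical coordinates:

```python
import math

def finger_direction_from_fingerprint(baseline_point, core_point):
    """Given a point on the near-straight root-baseline ridges and the
    center (core) point of the fingerprint pattern, the finger direction
    is the direction from the baseline side toward the core point.
    Inputs are (x, y) tuples in screen coordinates; returns degrees."""
    dx = core_point[0] - baseline_point[0]
    dy = core_point[1] - baseline_point[1]
    return math.degrees(math.atan2(dy, dx))

# Example: baseline below and to the right of the core,
# so the finger points up and to the left.
print(finger_direction_from_fingerprint((120.0, 40.0), (100.0, 90.0)))
```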
  • The finger direction can also be determined from the pressing pressure: when a user touches the display screen, the pressing pressure generally decreases gradually from the finger pulp toward the fingertip. Figure 7 shows a schematic diagram of a user initiating a touch operation and pressing the display screen; as shown in Figure 7, the direction in which the pressure decreases can be determined as the finger direction.
  • The determination of the finger direction is not limited to the above three ways; the above ways can also be combined, for example combining the shape of the touch surface with the fingerprint pattern, to determine a more accurate finger direction. The embodiments of the present application do not limit the method of determining the finger direction; a sketch of such a combination follows.
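  • One plausible way to combine the cues (an assumption, not the patent's method) is a weighted circular mean of the per-cue angle estimates, which avoids wrap-around artifacts near 0°/360°:

```python
import math

def fuse_direction_estimates(estimates):
    """Fuse several finger-direction estimates (e.g., from touch-surface
    shape, fingerprint pattern, and pressure gradient) into one angle.
    Each entry is (angle_degrees, weight); the weighted circular mean is
    computed via unit vectors."""
    x = sum(w * math.cos(math.radians(a)) for a, w in estimates)
    y = sum(w * math.sin(math.radians(a)) for a, w in estimates)
    return math.degrees(math.atan2(y, x)) % 360.0

# Shape and fingerprint roughly agree; a noisier pressure cue gets less weight.
angle = fuse_direction_estimates([(40.0, 1.0), (48.0, 1.0), (70.0, 0.3)])
print(f"fused finger direction: {angle:.1f} degrees")
```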
  • Way 2: the camera captures the body parts of users extending toward the electronic device from different directions; the body parts can be the users' arms, palms, and so on. The electronic device can determine the operation source direction from the image collected by the camera.
  • The camera can be a camera built into the electronic device, such as a hole-punch camera or an under-screen camera; the embodiments of the present application do not limit the type and location of the camera.
  • Alternatively, the camera can be a camera of another electronic device, in which case the electronic device collects images through that device's camera when it detects a user operation; or the camera can be an independent camera, to which the electronic device sends a shooting instruction to trigger image capture when it detects a user operation.
  • In this way, when detecting a user operation, the electronic device can control the camera to capture an image containing the user's limbs and determine the source direction of the operation by identifying the limb orientation in the image. Furthermore, the electronic device can also determine from the image the specific location at which the user's operation points; thus, when multiple users initiate operations at the same time, the electronic device can identify, from the different locations the operations act on, which of the users initiated each received operation.
  • For example, suppose the camera is an under-screen camera beneath the touch screen of the electronic device, and application icons of multiple applications are displayed on the touch screen. When the electronic device captures, through the camera, two users initiating touch operations on the touch screen, where the touch operation initiated by user 1 on the left side of the touch screen acts on the upper-left corner of the screen and the touch operation initiated by user 2 on the right side acts on the upper-right corner, the image collected by the electronic device contains a partial image of a hand pointing from the left toward the upper-left corner and a partial image of a hand pointing from the right toward the upper-right corner.
  • Based on the limb orientation of user 1 and the limb orientation of user 2 in the image, the electronic device can then determine whether to launch the application corresponding to the icon displayed in the upper-left corner of the touch screen clicked by user 1, or the application corresponding to the icon displayed in the upper-right corner clicked by user 2 (see the sketch below).
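  • A toy sketch of this attribution step (the limb detection itself, e.g. via a pose-estimation model, is assumed and not shown; all names are hypothetical): each touch is matched to the user whose detected limb points at the touched screen region, and the limb's entry side gives the operation source direction:

```python
from dataclasses import dataclass

@dataclass
class Limb:
    side: str        # edge of the frame the arm enters from: "left" or "right"
    points_to: str   # screen region the hand points toward, e.g. "upper_left"

def attribute_touches(limbs, touched_regions):
    """Match each touched screen region to the side the pointing limb
    entered from, yielding a per-touch operation source direction."""
    attribution = {}
    for region in touched_regions:
        for limb in limbs:
            if limb.points_to == region:
                attribution[region] = limb.side
    return attribution

limbs = [Limb("left", "upper_left"), Limb("right", "upper_right")]
print(attribute_touches(limbs, ["upper_left", "upper_right"]))
# {'upper_left': 'left', 'upper_right': 'right'} -> user 1's touch came from
# the left, user 2's from the right, so each can be accepted or refused
# independently.
```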
  • Figure 8 shows an image collected by the electronic device (i.e., the central control device) when two users initiate operations on the central control screen; the image includes the body parts of the two users. Gesture 1 comes from the left side of the electronic device and indicates that the finger of the user making the gesture points toward the upper-left corner; gesture 2 comes from the right side of the electronic device and indicates that the finger of the user making the gesture points toward the upper-right corner. It can therefore be determined that gesture 1 is a user operation initiated by the driver and gesture 2 is a user operation initiated by the front passenger.
  • When the electronic device determines the operation source direction from the user's limb orientation captured by the camera, the user operation may be a touch operation or a pressing operation on a button, and the electronic device can trigger the camera to collect images when it detects the user operation.
  • In some embodiments, the electronic device can also determine the operation source direction based on whether the driver's hands have left the steering wheel. For example, when the electronic device detects the user operation and detects, through a sensor on the steering wheel, that the driver's right hand has left the steering wheel, the electronic device determines that the operation source direction is the left. Alternatively, the electronic device can detect nearby objects through a proximity light sensor and, based on that detection, determine that the source direction of the operation is the left side.
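  • A toy combination of these cabin-sensor heuristics (a sketch under the text's left-hand-drive example; sensor names are hypothetical):

```python
from typing import Optional

def source_direction_from_cabin_sensors(right_hand_off_wheel: bool,
                                        object_near_left: bool) -> Optional[str]:
    """A steering-wheel sensor reporting that the driver's right hand has
    left the wheel, or a proximity light sensor detecting an object
    approaching from the left, both suggest the operation comes from the
    left (driver's) side."""
    if right_hand_off_wheel or object_near_left:
        return "left"
    return None  # no cabin-sensor evidence; fall back to touch or camera cues

print(source_direction_from_cabin_sensors(True, False))   # left
print(source_direction_from_cabin_sensors(False, False))  # None
```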
  • The electronic device can also combine the touch surface formed on the touch screen when the user initiates the operation with the user's limb orientation captured by the camera to determine the operation source direction more accurately. The embodiments of this application do not restrict the way in which the electronic device determines the operation source direction based on the user operation.
  • S103: The electronic device determines the identity role of the user based on the operation source direction.
  • The electronic device can determine the identity role of the user based on the user's position relative to the electronic device: users at different positions have different identity roles. In a vehicle scenario, the identity roles may include driver and front passenger.
  • When the user is in the driver's seat, that is, on the left side of the electronic device, the user's identity role is driver; when the user is in the front passenger seat, that is, on the right side of the electronic device, the user's identity role is front passenger.
  • The identity roles are not limited to the driver and front passenger mentioned above, and there is no absolute binding between the operation source direction and the identity role: when the operation source direction is the left, the user's identity role is not necessarily driver, and when it is the right, the role is not necessarily front passenger. The mapping should be determined from the relative positions of the driver and the front passenger in the car. For example, some countries or regions place the driver's seat on the right and others on the left; that is, when determining the user's identity role from the operation source direction, the driving rules of the country or region can be taken into account.
  • In some embodiments, the electronic device may not perform step S103; instead, it may determine whether the operation source direction of the first operation is a specified direction. The specified direction may be a direction within a certain range, for example, a finger direction whose angle with the horizontal plane is between 0° and 90°.
  • S104: The electronic device obtains the vehicle state.
  • In a vehicle scenario, the vehicle state can include two states, a driving state and a stationary state, and the electronic device may analyze whether to execute a response to the user operation based on the vehicle state. The electronic device may determine the vehicle state from information such as the vehicle's speed and position. For example, when the speed of the vehicle is greater than zero, the vehicle state is determined to be the driving state, and when the speed equals zero, the stationary state; for another example, when the position of the vehicle is changing in real time, the vehicle state is determined to be the driving state, and when the position does not change, the stationary state.
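  • A minimal sketch of these two heuristics (illustrative only; thresholds and signal names are assumptions):

```python
from typing import Optional, Tuple

def vehicle_state(speed_kmh: float,
                  prev_pos: Optional[Tuple[float, float]] = None,
                  cur_pos: Optional[Tuple[float, float]] = None) -> str:
    """Driving when the speed is greater than zero or the position is
    changing between samples; otherwise stationary."""
    if speed_kmh > 0.0:
        return "driving"
    if prev_pos is not None and cur_pos is not None and prev_pos != cur_pos:
        return "driving"
    return "stationary"

assert vehicle_state(42.0) == "driving"
assert vehicle_state(0.0, (31.23, 121.47), (31.23, 121.47)) == "stationary"
```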
  • The vehicle state is not limited to the driving state and the stationary state mentioned above; for example, it may also include an autonomous driving state, a non-autonomous driving state, and so on. The electronic device can also determine the vehicle state in other ways, such as from the working status of devices in the vehicle, whether the windows are open, the fuel level, the battery power, and other information. The embodiments of this application do not limit the way in which the electronic device obtains the vehicle state.
  • Step S104 is an optional step; the electronic device does not have to obtain the vehicle state.
  • S105: The electronic device determines the application type of the first application.
  • The electronic device may analyze whether to execute a response to the user operation according to the application type of the first application. In a vehicle scenario, application types can include driving assistance, vehicle control, life and entertainment, and so on. Driving assistance applications may include navigation applications, vehicle location display applications, reversing image applications, etc.; vehicle control applications may include air-conditioning setting applications, door control applications, window control applications, driving-mode switching applications, gear-switching applications, etc.; life and entertainment applications may include audio and video playback applications, game applications, phone applications, etc.
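  • Such a categorization could be kept as a simple registry, as in this sketch (the application names and type strings are hypothetical placeholders mirroring the categories above):

```python
# Hypothetical application-type registry matching the categories above.
APP_TYPES = {
    "navigation": "driving_assistance",
    "vehicle_location": "driving_assistance",
    "reversing_image": "driving_assistance",
    "air_conditioning": "vehicle_control",
    "door_control": "vehicle_control",
    "window_control": "vehicle_control",
    "driving_mode": "vehicle_control",
    "gear_switching": "vehicle_control",
    "music_player": "life_entertainment",
    "games": "life_entertainment",
    "phone": "life_entertainment",
}

def app_type(app_name: str) -> str:
    """Look up the application type; unknown apps fall into a catch-all."""
    return APP_TYPES.get(app_name, "unknown")

print(app_type("gear_switching"))  # vehicle_control
```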
  • In some embodiments, the electronic device may also determine whether the first application is a specified application. That is, in addition to considering whether the first application is of a specified type, the electronic device can analyze whether to execute a response to the user operation based on whether the first application is a specified application.
  • Step S105 is an optional step; the electronic device may also not determine the application type of the first application.
  • S106: The electronic device determines, according to an isolation policy, whether the first application executes a response to the user operation.
  • The isolation policy can include: the user's identity role is a specified identity role. Specifically, when the electronic device determines that the user's identity role is the specified identity role, it controls the first application to execute a response to the user operation; otherwise, it refuses to execute the response. In other words, since the user's identity role differs when the user's position relative to the electronic device differs, requiring that the identity role be a specified role can equally mean requiring that the user's position be a designated position, such as the first position rather than the second position.
  • In this way, the electronic device can detect user operations initiated by multiple users, such as a first operation initiated by a first user and a second operation initiated by a second user, where the operation source direction when the first user initiates the first operation differs from the operation source direction when the second user initiates the second operation. For example, assuming the first user is on the left side of the electronic device and the second user on the right side, the operation source direction of the first operation is the left side of the electronic device and that of the second operation is the right side.
  • The electronic device executes a response to the first operation and refuses to execute a response to the second operation. The first operation and the second operation may be initiated on the electronic device at the same time, or at different times.
  • The isolation policy may also include: the vehicle state is a specified state (for example, the first state), and/or the application type is a specified type. Specifically, when the electronic device determines that the user's identity role is the specified identity role and the vehicle state is the specified state, it controls the first application to execute a response to the user operation, and otherwise refuses; or, when the electronic device determines that the user's identity role is the specified identity role and the application type is the specified type, it controls the first application to execute the response, and otherwise refuses; or, when the electronic device determines that the user's identity role is the specified identity role, the vehicle state is the specified state, and the application type is the specified type, it controls the first application to execute the response, and otherwise refuses.
  • Conversely, the isolation policy may also include negative conditions, for example: the user's identity role is not a specified identity role, the vehicle state is not a specified state, or the application type is not a specified type. Taking the identity role as an example, when the electronic device determines that the user's identity role is not the specified identity role, it can control the first application to execute a response to the user operation, and otherwise refuse; the cases of the vehicle state and the application type are similar and are not described again here.
  • In other words, the electronic device can determine whether the user can use the first application based on whether one or more of the user's identity role, the vehicle state, and the application type meet preset requirements. For example, the electronic device may decide based only on the user's identity role: when the user's identity role is passenger, the electronic device determines that the first application does not execute a response to the user operation, so the user in the passenger seat cannot use the electronic device. As another example, the electronic device may decide based on the user's identity role and the application type: when the user's identity role is passenger and the first application is a vehicle control application, the electronic device determines that the first application does not respond to the user operation, so the front passenger cannot use vehicle control applications and is prevented from interfering with the driver's control of the vehicle through the electronic device; and when the user's identity role is driver and the first application is a driving assistance application, the electronic device determines that the user can use the first application, so the driver can better control the vehicle using driving assistance applications. The electronic device may also decide based on the user's identity role, the vehicle state, and the application type: for example, when the user's identity role is driver, the first application is a life and entertainment application, and the vehicle state is the driving state, the electronic device determines that the first application does not respond to the user operation, so the driver cannot use life and entertainment applications while the vehicle is moving, preventing distraction and driving accidents.
  • Note that when the electronic device determines whether the first application executes a response based only on the user's identity role, it does not perform steps S104 and S105; when it decides based only on the user's identity role and the vehicle state, it does not perform step S105; and when it decides based only on the user's identity role and the application type of the first application, it does not perform step S104.
  • The isolation policy can be a policy imported into the electronic device in advance by the developer, or a policy set by the user of the electronic device. Furthermore, when the electronic device uses the isolation policy to control application responses for different users, the user can also update or adjust the policy.
  • the embodiment of this application does not limit the source of the isolation policy.
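  • As an illustration (not from the patent; the rule structure and names are assumptions), an isolation policy of this kind can be modeled as an ordered list of match rules over identity role, vehicle state, and application type, where the first matching rule decides whether the response is executed:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IsolationRule:
    """One policy rule: a request matches when every set field is equal
    (None means don't care), and `allow` then decides the outcome."""
    role: Optional[str] = None           # e.g. "driver", "front_passenger"
    vehicle_state: Optional[str] = None  # e.g. "driving", "stationary"
    app_type: Optional[str] = None       # e.g. "vehicle_control"
    allow: bool = True

def evaluate(rules, role, vehicle_state, app_type, default=True):
    """Return whether to execute the response; the first matching rule wins."""
    for r in rules:
        if ((r.role is None or r.role == role) and
                (r.vehicle_state is None or r.vehicle_state == vehicle_state) and
                (r.app_type is None or r.app_type == app_type)):
            return r.allow
    return default

# Rules mirroring the examples in the text: only the driver may use vehicle
# control; the driver may not use life/entertainment apps while driving.
policy = [
    IsolationRule(role="front_passenger", app_type="vehicle_control", allow=False),
    IsolationRule(role="driver", vehicle_state="driving",
                  app_type="life_entertainment", allow=False),
]

print(evaluate(policy, "driver", "driving", "vehicle_control"))          # True
print(evaluate(policy, "front_passenger", "driving", "vehicle_control")) # False
print(evaluate(policy, "driver", "driving", "life_entertainment"))       # False
```

A flat first-match rule list like this keeps the policy easy for a developer or user to extend or adjust, which fits the adjustable policy described above.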
  • When the electronic device determines according to the isolation policy that the first application executes a response to the user operation, the electronic device performs step S107; otherwise, it performs step S108.
  • S107: The electronic device controls the first application to execute a response to the user operation.
  • The electronic device controlling the first application to execute a response to the user operation specifically means, for example, that the electronic device controls the first application to execute a first function of the first application, or that the electronic device displays a first user interface of the first application.
  • For example, the user operation may be an operation to start the navigation application, in which case executing a response means the electronic device displays the user interface provided by the navigation application; or the user operation may be an operation to activate the navigation function of the navigation application, in which case executing a response means the electronic device starts the navigation function.
  • When the electronic device controls the first application to execute a response to the user operation according to the isolation policy, it means the user operation meets the requirements and the electronic device can execute the corresponding response. In this way, the electronic device responds only to operations initiated by users at designated positions, which prevents users at other positions from interfering with those operations, thereby ensuring the driving safety of the vehicle and providing a more intelligent in-car service.
  • S108: The electronic device controls the first application to refuse to execute a response to the user operation.
  • The electronic device controlling the first application to refuse to respond to the user operation specifically means the electronic device does not execute a response to the user operation. When the user operation is an operation to launch the first application, refusing means the electronic device does not launch the first application; when the user operation is an operation to start a first function of the first application, refusing means the electronic device does not activate that function.
  • In this way, controlling the electronic device to intercept operations initiated by users outside the designated positions ensures that those users cannot control the electronic device and prevents them from interfering with the driving of the vehicle, thereby ensuring driving safety.
  • In some embodiments, the electronic device can detect user operations initiated by multiple users. The electronic device can separately determine the source direction of each user's operation, determine each user's identity role, and decide according to those identity roles whether to respond to each operation. Further, the electronic device can also determine the vehicle state and/or the application types of the applications on which the multiple user operations act, and combine the vehicle state and/or application type into the decision. For example, the electronic device may detect a first operation initiated by a first user acting on a first application and a second operation initiated by a second user acting on a second application, determine the identity roles of the first user and the second user from the source directions of the two operations, determine according to the first user's identity role that the first application executes the response to the first operation, and determine according to the second user's identity role that the second application does not execute the response to the second operation.
  • In this way, the electronic device can detect the operations of multiple users at the same time and separately decide whether to execute responses to them, for example, responding to only one user's operation, responding to multiple users' operations simultaneously or one after another, or refusing to respond to all of them; a minimal sketch of this per-operation decision follows.
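  • The following small Java sketch, offered only as an illustration, shows concurrent operations from two users being decided independently, so that one is executed while the other is refused. The role names and the decision rule are assumptions, not details from this application.

```java
// A minimal sketch: each concurrently detected operation is decided on its
// own, so the first user's operation can be executed while the second
// user's operation is refused.
import java.util.List;

public class MultiUserIsolationDemo {
    enum Role { DRIVER, FRONT_PASSENGER }

    record Operation(Role initiator, String targetApp) {}

    // Illustrative rule only: respond to operations initiated by the driver.
    static boolean shouldRespond(Operation op) {
        return op.initiator() == Role.DRIVER;
    }

    public static void main(String[] args) {
        List<Operation> concurrent = List.of(
                new Operation(Role.DRIVER, "first application"),
                new Operation(Role.FRONT_PASSENGER, "second application"));
        for (Operation op : concurrent) {
            System.out.println(op.targetApp() + ": "
                    + (shouldRespond(op) ? "execute response" : "refuse response"));
        }
    }
}
```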
  • In general, the method provided by the embodiments of this application can distinguish operations initiated by users at different positions based on where a user is located when operating the electronic device. Because the method identifies a user's identity role from the user's position, it is not affected by users changing seats, making the identification of identity roles more flexible and highly reliable.
  • In addition, the method provided by the embodiments of this application is not limited to vehicle scenarios. It should be understood that the method can be applied to any scenario in which users at multiple different positions operate an electronic device and operation permissions need to be divided according to user position; the embodiments of this application do not limit such scenarios.
  • FIG. 9 shows a schematic diagram of the hardware structure of the electronic device 100.
  • The electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, a vehicle-mounted device, a smart home device, and/or a smart city device; the embodiments of this application place no special restrictions on the specific type of the electronic device.
  • the electronic device 100 may be the aforementioned electronic device.
  • the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2 , mobile communication module 150, wireless communication module 160, audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone interface 170D, sensor module 180, button 190, motor 191, indicator 192, camera 193, display screen 194, and Subscriber identification module (SIM) card interface 195, etc.
  • The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, a vehicle speed sensor 180N, etc.
  • the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100 .
  • the electronic device 100 may include more or fewer components than shown in the figures, or some components may be combined, some components may be separated, or some components may be arranged differently.
  • the components illustrated may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices or may be integrated in one or more processors.
  • the controller can generate operation control signals based on the instruction operation code and timing signals to complete the control of fetching and executing instructions.
  • the processor 110 may also be provided with a memory for storing instructions and data.
  • the charging management module 140 is used to receive charging input from the charger.
  • the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
  • the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, the wireless communication module 160, and the like.
  • the power management module 141 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 100 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization. For example: Antenna 1 can be reused as a diversity antenna for a wireless LAN. In other embodiments, antennas may be used in conjunction with tuning switches.
  • the mobile communication module 150 can provide solutions for wireless communication including 2G/3G/4G/5G applied on the electronic device 100 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 150 can receive electromagnetic waves through the antenna 1, perform filtering, amplification and other processing on the received electromagnetic waves, and transmit them to the modem processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves through the antenna 1 for radiation.
  • at least part of the functional modules of the mobile communication module 150 may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 and at least part of the modules of the processor 110 may be provided in the same device.
  • a modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
  • the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the application processor outputs sound signals through audio devices (not limited to speaker 170A, receiver 170B, etc.), or displays images or videos through display screen 194.
  • the modem processor may be a stand-alone device.
  • the modem processor may be independent of the processor 110 and may be provided in the same device as the mobile communication module 150 or other functional modules.
  • The wireless communication module 160 can provide solutions for wireless communication applied on the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc.
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , demodulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110, frequency modulate it, amplify it, and convert it into electromagnetic waves through the antenna 2 for radiation.
  • the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • The GNSS may include global positioning system (GPS), global navigation satellite system (GLONASS), Beidou navigation satellite system (BDS), quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is an image processing microprocessor and is connected to the display screen 194 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 194 is used to display images, videos, etc.
  • Display 194 includes a display panel.
  • the electronic device may include 1 or N display screens 194, where N is a positive integer greater than 1.
  • the display screen 194 may be used to display a user interface, such as the user interface of the first application, etc.
  • the display screen 194 may refer to the central control screen of the central control device.
  • The central control screen may provide only a display function, or it may also detect the user's touch operations.
  • the central control screen is a touch screen.
  • the electronic device 100 can implement the shooting function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 193. For example, when taking a photo, the shutter is opened, the light is transmitted to the camera sensor through the lens, the optical signal is converted into an electrical signal, and the camera sensor passes the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on image noise and brightness. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • Camera 193 is used to capture still images or video.
  • the object passes through the lens to produce an optical image that is projected onto the photosensitive element.
  • the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other format image signals.
  • the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
  • the electronic device 100 can use the camera 193 to collect the user's body parts when the user initiates a user operation, and the electronic device 100 can determine the direction of the operation source according to the orientation of the user's body parts.
  • For the electronic device 100 determining the direction of the operation source based on the orientation of the user's limbs collected by the camera, please refer to the relevant content of the aforementioned step S102; it will not be described again here.
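  • As a geometric sketch only: assuming some pose detector (not specified in this application) yields two image keypoints along the user's arm, the arm's horizontal direction in the image can suggest which side the operation comes from. The keypoint names and the camera-placement assumption below are illustrative.

```java
// Hypothetical sketch: classify the operation source side from the
// wrist -> fingertip direction of an arm detected in a camera image.
public class LimbOrientationDemo {
    record Point(double x, double y) {}

    static String sourceSide(Point wrist, Point fingertip) {
        double dx = fingertip.x() - wrist.x();
        // Assumption tied to camera placement: an arm reaching rightward in
        // the image belongs to a user seated to the left of the screen.
        return dx > 0 ? "operation from the left side" : "operation from the right side";
    }

    public static void main(String[] args) {
        System.out.println(sourceSide(new Point(100, 300), new Point(240, 180)));
        System.out.println(sourceSide(new Point(500, 300), new Point(360, 180)));
    }
}
```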
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 100 may support one or more video codecs. In this way, the electronic device 100 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural network (NN) computing processor.
  • Intelligent cognitive applications of the electronic device 100 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, etc.
  • the internal memory 121 may include one or more random access memories (RAM) and one or more non-volatile memories (NVM).
  • the random access memory can be directly read and written by the processor 110, can be used to store executable programs (such as machine instructions) of the operating system or other running programs, and can also be used to store user and application data, etc.
  • the non-volatile memory can also store executable programs and user and application program data, etc., and can be loaded into the random access memory in advance for direct reading and writing by the processor 110.
  • the external memory interface 120 can be used to connect an external non-volatile memory to expand the storage capacity of the electronic device 100 .
  • the external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement the data storage function. For example, save music, video and other files in external non-volatile memory.
  • the electronic device 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playback, recording, etc.
  • the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals. Audio module 170 may also be used to encode and decode audio signals.
  • Speaker 170A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 100 can listen to music through the speaker 170A, or listen to hands-free calls.
  • Receiver 170B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • When the electronic device 100 answers a call or plays a voice message, the voice can be heard by bringing the receiver 170B close to the ear.
  • Microphone 170C, also called a "mic", is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak close to the microphone 170C to input the sound signal.
  • the electronic device 100 may be provided with at least one microphone 170C.
  • the headphone interface 170D is used to connect wired headphones.
  • the pressure sensor 180A is used to sense pressure signals and can convert the pressure signals into electrical signals.
  • pressure sensor 180A may be disposed on display screen 194 .
  • pressure sensors 180A such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, etc.
  • a capacitive pressure sensor may include at least two parallel plates of conductive material.
  • the electronic device 100 may determine, according to the pressure sensor 180A, changes in the pressure intensity of the touch surface on the electronic device 100 when the user initiates a touch operation. Specifically, the electronic device 100 may determine the direction in which the intensity of pressure gradually increases on the touch surface as the finger direction. For a specific description of the electronic device 100 determining the finger direction according to changes in pressure, please refer to the relevant content of the aforementioned step S102, which will not be described again here.
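  • Purely as a sketch of the pressure-based inference above: the direction of increasing pressure across sampled touch points is taken as the finger direction. The sample points and the 0-90/90-180 degree convention for the left/right split are assumptions for illustration.

```java
// Hypothetical sketch: the pressure-weighted centroid minus the plain
// centroid points toward the high-pressure end of the touch area, i.e.,
// along the direction of increasing pressure (the finger direction).
public class PressureDirectionDemo {
    record Sample(double x, double y, double pressure) {}

    /** Angle (degrees) of the direction in which pressure increases. */
    static double fingerAngleDegrees(Sample[] s) {
        double cx = 0, cy = 0, wx = 0, wy = 0, w = 0;
        for (Sample p : s) {
            cx += p.x() / s.length;
            cy += p.y() / s.length;
            wx += p.x() * p.pressure();
            wy += p.y() * p.pressure();
            w  += p.pressure();
        }
        return Math.toDegrees(Math.atan2(wy / w - cy, wx / w - cx));
    }

    public static void main(String[] args) {
        // Pressure grows toward the upper right: finger points up-right (~45 deg).
        Sample[] touch = {
            new Sample(0, 0, 0.2), new Sample(1, 1, 0.5), new Sample(2, 2, 0.9)
        };
        double angle = fingerAngleDegrees(touch);
        // Assumed convention: 0-90 deg -> user on the left, 90-180 deg -> right.
        System.out.println(angle >= 0 && angle <= 90
                ? "source: left of the screen" : "source: right of the screen");
    }
}
```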
  • the gyro sensor 180B may be used to determine the motion posture of the electronic device 100 .
  • In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
  • Air pressure sensor 180C is used to measure air pressure. In some embodiments, the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
  • Magnetic sensor 180D includes a Hall sensor.
  • the electronic device 100 may utilize the magnetic sensor 180D to detect opening and closing of the flip holster.
  • the acceleration sensor 180E can detect the acceleration of the electronic device 100 in various directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of electronic devices and be used in horizontal and vertical screen switching, pedometer and other applications.
  • Distance sensor 180F for measuring distance.
  • Electronic device 100 can measure distance via infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 may utilize the distance sensor 180F to measure distance to achieve fast focusing.
  • Proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
  • the light emitting diode may be an infrared light emitting diode.
  • the electronic device 100 emits infrared light outwardly through the light emitting diode.
  • Electronic device 100 uses photodiodes to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100 . When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100 .
  • the electronic device 100 can use the proximity light sensor 180G to detect when the user holds the electronic device 100 close to the ear for talking, so as to automatically turn off the screen to save power.
  • The proximity light sensor 180G can also be used in holster mode and pocket mode to automatically unlock and lock the screen.
  • In some embodiments of this application, the electronic device 100 can use the proximity light sensor 180G to determine the direction of the operation source when the user initiates a user operation. For example, when the user operation originates from the left side of the electronic device 100, the electronic device 100 can detect, through the proximity light sensor 180G, the presence of an object at the left front of the electronic device 100; when the user operation originates from the right side, the electronic device 100 can detect the presence of an object at the right front of the electronic device 100.
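  • The following minimal sketch mocks that proximity-based inference; real devices would query the proximity light sensor 180G, and the fallback behavior is an assumption.

```java
// Hypothetical sketch: infer the operation source side from which front
// region the proximity light sensor reports an object in.
public class ProximitySourceDemo {
    static String inferSource(boolean objectAtLeftFront, boolean objectAtRightFront) {
        if (objectAtLeftFront && !objectAtRightFront) return "left";
        if (objectAtRightFront && !objectAtLeftFront) return "right";
        return "unknown (fall back to other signals such as the touch surface)";
    }

    public static void main(String[] args) {
        System.out.println(inferSource(true, false));   // left
        System.out.println(inferSource(false, true));   // right
        System.out.println(inferSource(true, true));    // unknown
    }
}
```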
  • the ambient light sensor 180L is used to sense ambient light brightness.
  • the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
  • the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in the pocket to prevent accidental touching.
  • Fingerprint sensor 180H is used to collect fingerprints.
  • the electronic device 100 can use the collected fingerprint characteristics to achieve fingerprint unlocking, access to application locks, fingerprint photography, fingerprint answering of incoming calls, etc.
  • the electronic device 100 can collect the fingerprint pattern when the user initiates a user operation through the fingerprint sensor 180H, and then determine the finger direction based on the fingerprint pattern.
  • For the electronic device 100 determining the finger direction based on the fingerprint pattern, please refer to the relevant content of the aforementioned step S102, which will not be described again here.
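  • As a geometric sketch only: assuming the fingerprint pattern yields the midpoint of its near-straight base line and its center point (an assumption about what feature extraction provides), the base-to-center vector approximates the finger direction.

```java
// Hypothetical sketch: the vector from the fingerprint pattern's base line
// toward its center point is taken as the finger direction.
public class FingerprintDirectionDemo {
    record Point(double x, double y) {}

    static double fingerAngleDegrees(Point baseLineMidpoint, Point centerPoint) {
        return Math.toDegrees(Math.atan2(centerPoint.y() - baseLineMidpoint.y(),
                                         centerPoint.x() - baseLineMidpoint.x()));
    }

    public static void main(String[] args) {
        // Base line to the lower left of the center: finger points up-right.
        System.out.println(fingerAngleDegrees(new Point(0, 0), new Point(1, 1))); // 45.0
    }
}
```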
  • Temperature sensor 180J is used to detect temperature.
  • the electronic device 100 utilizes the temperature detected by the temperature sensor 180J to execute the temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J in order to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to prevent the low temperature from causing the electronic device 100 to shut down abnormally. In some other embodiments, when the temperature is lower than another threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
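  • A minimal sketch of such a temperature-processing strategy follows; both thresholds and the chosen actions are illustrative values only, not figures from this application.

```java
// Hypothetical sketch: map a reported temperature to a thermal action.
public class ThermalPolicyDemo {
    static String act(double celsius) {
        if (celsius > 45.0)  return "reduce performance of nearby processor";
        if (celsius < -10.0) return "heat battery 142 / boost its output voltage";
        return "no action";
    }

    public static void main(String[] args) {
        System.out.println(act(50.0));   // thermal protection
        System.out.println(act(-20.0));  // low-temperature handling
        System.out.println(act(25.0));   // normal
    }
}
```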
  • Touch sensor 180K, also known as a "touch device".
  • the touch sensor 180K can be disposed on the display screen 194.
  • the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near the touch sensor 180K.
  • the touch sensor can pass the detected touch operation to the application processor to determine the touch event type.
  • Visual output related to the touch operation may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from that of the display screen 194 .
  • the electronic device 100 may detect a touch operation on the first application through the touch sensor 180K.
  • Bone conduction sensor 180M can acquire vibration signals.
  • the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone mass of the human body's vocal part.
  • the bone conduction sensor 180M can also contact the human body's pulse and receive blood pressure beating signals.
  • the bone conduction sensor 180M can also be provided in an earphone and combined into a bone conduction earphone.
  • The audio module 170 can parse out a speech signal from the vibration signal of the vocal-part vibrating bone obtained by the bone conduction sensor 180M, to implement a speech function.
  • the application processor can analyze the heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M to implement the heart rate detection function.
  • the vehicle speed sensor 180N can obtain the vehicle's driving speed.
  • the electronic device 100 may determine the vehicle status based on the driving speed acquired by the vehicle speed sensor 180N.
  • For the vehicle state obtained by the electronic device 100, please refer to the relevant content of the aforementioned step S104, which will not be described again here.
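  • A minimal sketch, assuming the simple rule that any nonzero driving speed means the vehicle is in the driving state; the small epsilon guarding against sensor noise is an assumption.

```java
// Hypothetical sketch: classify the vehicle state from the driving speed
// reported by a speed sensor such as 180N.
public class VehicleStateDemo {
    enum VehicleState { STATIONARY, DRIVING }

    static VehicleState fromSpeed(double speedKmPerHour) {
        return speedKmPerHour > 0.1 ? VehicleState.DRIVING : VehicleState.STATIONARY;
    }

    public static void main(String[] args) {
        System.out.println(fromSpeed(0.0));   // STATIONARY
        System.out.println(fromSpeed(42.5));  // DRIVING
    }
}
```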
  • the buttons 190 include a power button, a volume button, etc.
  • Key 190 may be a mechanical key. It can also be a touch button.
  • the electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100 . In some embodiments, the electronic device 100 may detect a user operation on the first application through the key 190 .
  • the motor 191 can generate vibration prompts.
  • the motor 191 can be used for vibration prompts for incoming calls and can also be used for touch vibration feedback.
  • the indicator 192 may be an indicator light, which may be used to indicate charging status, power changes, or may be used to indicate messages, missed calls, notifications, etc.
  • the SIM card interface 195 is used to connect a SIM card.
  • the electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communications.
  • the electronic device 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100 .
  • In a vehicle scenario, the electronic device may be a central control device fixed in the vehicle, or may refer to a vehicle that includes a central control screen; the central control device or the vehicle may further include physical buttons.
  • The electronic device may also be a portable terminal device running iOS, Android, Microsoft, or another operating system, such as a mobile phone, a tablet, or a wearable device, a laptop (Laptop) with a touch-sensitive surface or touch panel, or a non-portable terminal device such as a desktop computer with a touch-sensitive surface or touch panel.
  • the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
  • FIG. 10 is a schematic diagram of the software structure of the electronic device 100 according to the embodiment of the present application.
  • the layered architecture divides the software into several layers, and each layer has clear roles and division of labor.
  • the layers communicate through software interfaces.
  • the mobile operating system is divided into four layers, from top to bottom: application layer, program framework layer/core service layer, underlying library and runtime, and kernel layer.
  • the application layer can include a series of application packages.
  • the application package may include driving assistance applications, vehicle control applications, life entertainment applications and other applications.
  • the program framework layer provides application programming interfaces (application programming interface, API) and programming frameworks for applications in the application layer.
  • the program framework layer includes some predefined functions.
  • the program framework layer can include a feature collection module, a role recognition module, an access control module, a scene recognition module, etc.
  • the feature collection module can be used to determine the operation source direction based on user operations, and send the operation source direction to the role recognition module.
  • the operation source direction indicates the direction of the user who initiated the user operation relative to the electronic device 100 .
  • For the electronic device 100 determining the operation source direction according to the user operation, please refer to the relevant content of the aforementioned step S102, which will not be described again here.
  • the role identification module can be used to determine the user's identity role based on the operation source direction and send the identity role to the access control module.
  • For the electronic device 100 determining the user's identity role based on the operation source direction, please refer to the relevant content of the aforementioned step S103, which will not be described again here.
  • the scene recognition module can be used to determine the vehicle status and send the vehicle status to the access control module.
  • the vehicle state includes a driving state and a stationary state.
  • For the electronic device 100 determining the vehicle state, please refer to the relevant content of the aforementioned step S104, which will not be described again here.
  • The access control module may be used to determine the application type of the application (for example, the first application) on which the user operation acts, and to determine, based on one or more of the application type, the user's identity role, and the vehicle state, whether the application should execute the response to the user operation. If so, the access control module sends the user operation to the application, where the user operation can be used to trigger starting the application, starting the first function of the application, and so on; for example, the access control module may send the indication information of the operation to a driving assistance application. Otherwise, the access control module does not send the user operation to the application. A minimal sketch of this module pipeline follows.
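  • The following Java sketch illustrates the pipeline just described (feature collection, role recognition, scene recognition, access control). Every class name, method name, and the example rule are assumptions for illustration, not details from this application.

```java
// Hypothetical sketch of the framework-layer module pipeline: the role
// recognition module maps the operation source direction to an identity
// role, and the access control module forwards the operation to the
// application only if an (illustrative) rule allows it.
import java.util.function.Consumer;

public class FrameworkPipelineDemo {
    enum SourceDirection { LEFT, RIGHT }
    enum IdentityRole { DRIVER, FRONT_PASSENGER }
    enum VehicleState { STATIONARY, DRIVING }

    record UserOperation(String targetApp, SourceDirection source) {}

    /** Role recognition module: source direction -> identity role
     *  (assumes a left-hand-drive layout with the driver on the left). */
    static IdentityRole recognizeRole(SourceDirection d) {
        return d == SourceDirection.LEFT ? IdentityRole.DRIVER
                                         : IdentityRole.FRONT_PASSENGER;
    }

    /** Access control module: forward the operation or intercept it. */
    static void accessControl(UserOperation op, VehicleState state,
                              Consumer<UserOperation> application) {
        IdentityRole role = recognizeRole(op.source());
        boolean allowed = role == IdentityRole.DRIVER || state == VehicleState.STATIONARY;
        if (allowed) {
            application.accept(op);                                // app responds
        } else {
            System.out.println("intercepted: " + op.targetApp());  // no response
        }
    }

    public static void main(String[] args) {
        Consumer<UserOperation> app = op -> System.out.println("started: " + op.targetApp());
        accessControl(new UserOperation("navigation", SourceDirection.LEFT),
                      VehicleState.DRIVING, app);   // driver: started
        accessControl(new UserOperation("video player", SourceDirection.RIGHT),
                      VehicleState.DRIVING, app);   // passenger while driving: intercepted
    }
}
```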
  • the program framework layer may not include an access control module.
  • In that case, after the role recognition module determines the user's identity role and the scene recognition module determines the vehicle state, the indication information of the user operation, the user's identity role, and the vehicle state may be sent to the application on which the user operation acts, and the application itself determines whether to execute the response to the user operation based on the user's identity role, the vehicle state, and the application's type.
  • In some embodiments, the program framework layer may not include the scene recognition module.
  • the functional modules included in the program framework layer are only illustrative examples and do not constitute a limitation on the embodiments of the present application. In other embodiments of the present application, the program framework layer may include more or fewer functional modules.
  • The above functional modules may also be split and combined arbitrarily, or may be provided at other software architecture layers.
  • Runtime can refer to all code libraries, frameworks, etc. required when the program is running.
  • the runtime includes a series of function libraries required to run C programs.
  • In addition to the core library, the runtime also includes the virtual machine required to run Java programs.
  • the above-mentioned core library may include functional functions that need to be called by the Java language.
  • the underlying library can include multiple functional modules. For example: surface manager (surface manager), media libraries (Media Libraries), 3D graphics processing libraries (for example: OpenGL ES), 2D graphics engines (for example: SGL), etc.
  • the surface manager is used to manage the display subsystem and provides the fusion of 2D and 3D layers for multiple applications.
  • the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
  • the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
  • the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, composition, and layer processing.
  • 2D Graphics Engine is a drawing engine for 2D drawing.
  • the kernel layer is the layer between hardware and software.
  • the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
  • The following illustrates the workflow of the electronic device's software and hardware by taking a touch operation in a photographing scenario as an example.
  • When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer.
  • the kernel layer processes touch operations into raw input events (including touch coordinates, timestamps of touch operations, and other information). Raw input events are stored in the kernel layer.
  • the application framework layer obtains the original input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation as a touch click operation and the control corresponding to the click operation as a camera application icon control as an example, the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer. Camera 193 captures still images or video.
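  • The sketch below illustrates that flow in simplified form: a kernel-layer raw input event (coordinates plus timestamp) is hit-tested against a control, which then starts the corresponding application. The structures are simplified stand-ins, not the real Android input stack.

```java
// Hypothetical sketch of the touch-event flow: kernel raw input event ->
// framework hit test -> application start.
public class InputEventFlowDemo {
    record RawInputEvent(float x, float y, long timestampMs) {}

    static void dispatch(RawInputEvent e) {
        // Assumed hit test: the camera application's icon occupies a fixed box.
        boolean hitsCameraIcon = e.x() >= 0 && e.x() <= 100
                              && e.y() >= 0 && e.y() <= 100;
        if (hitsCameraIcon) {
            System.out.println("start camera application; call kernel camera driver");
        } else {
            System.out.println("no control at (" + e.x() + ", " + e.y() + ")");
        }
    }

    public static void main(String[] args) {
        dispatch(new RawInputEvent(42f, 17f, System.currentTimeMillis()));
    }
}
```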
  • each step in the above method embodiment can be completed by an integrated logic circuit of hardware in the processor or instructions in the form of software.
  • the method steps disclosed in conjunction with the embodiments of this application can be directly implemented by a hardware processor, or executed by a combination of hardware and software modules in the processor.
  • This application also provides an electronic device, which may include a memory and a processor.
  • the memory can be used to store computer programs; the processor can be used to call the computer program in the memory, so that the electronic device executes the method executed by the electronic device 100 in any of the above embodiments.
  • the present application also provides a chip system, which includes at least one processor for implementing the functions involved in the method performed by the electronic device 100 in any of the above embodiments.
  • This application also provides a vehicle, which includes a central control screen, a memory, multiple processors, and one or more programs; the processors are configured to implement the method performed by the electronic device 100 in the above embodiments.
  • This application also provides another vehicle, which includes a central control screen and physical buttons, a memory, multiple processors, and one or more programs; the processors are configured to implement the method performed by the electronic device 100 in the above embodiments.
  • the chip system further includes a memory, the memory is used to store program instructions and data, and the memory is located within the processor or outside the processor.
  • the chip system can be composed of chips or include chips and other discrete devices.
  • There may be one or more processors in the chip system.
  • the processor can be implemented in hardware or software.
  • the processor may be a logic circuit, an integrated circuit, or the like.
  • the processor may be a general-purpose processor implemented by reading software code stored in memory.
  • the memory may be integrated with the processor or may be provided separately from the processor, which is not limited by the embodiments of the present application.
  • The memory may be a non-transitory memory, such as a read-only memory (ROM), which may be integrated with the processor on the same chip or may be separately provided on different chips.
  • The embodiments of this application do not specifically limit the type of the memory or the arrangement of the memory and the processor.
  • The chip system can be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), or a system on chip (SoC). It can also be a central processing unit (CPU), a network processor (NP), a digital signal processing circuit (DSP), a microcontroller (micro controller unit, MCU), a programmable logic device (PLD), or another integrated chip.
  • the present application also provides a computer program product.
  • The computer program product includes a computer program (which may also be called code or instructions); when the computer program is run, the computer is caused to execute the method performed by the electronic device 100 in any of the above embodiments.
  • This application also provides a computer-readable storage medium that stores a computer program (which may also be called code or instructions).
  • When the computer program is run, the computer is caused to perform the method performed by the electronic device 100 in any of the above embodiments.
  • the processor in the embodiment of the present application may be an integrated circuit chip with signal processing capabilities.
  • each step of the above method embodiment can be completed through an integrated logic circuit of hardware in the processor or instructions in the form of software.
  • The above-mentioned processor can be a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • a general-purpose processor may be a microprocessor or the processor may be any conventional processor, etc.
  • the steps of the method disclosed in conjunction with the embodiments of the present application can be directly implemented by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor.
  • the software module can be located in random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, registers and other mature storage media in this field.
  • the storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the above method in combination with its hardware.
  • the embodiment of the present application also provides a device.
  • the device may specifically be a component or module, and the device may include one or more connected processors and memories. Among them, memory is used to store computer programs. When the computer program is executed by one or more processors, the device is caused to execute the methods in each of the above method embodiments.
  • the devices, computer-readable storage media, computer program products or chips provided by the embodiments of the present application are all used to execute the corresponding methods provided above. Therefore, the beneficial effects it can achieve can be referred to the beneficial effects in the corresponding methods provided above, and will not be described again here.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
  • The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (such as coaxial cable, optical fiber, or digital subscriber line) or wireless (such as infrared, radio, or microwave) means.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains one or more available media integrated therein.
  • the available media may be magnetic media (eg, floppy disk, hard disk, magnetic tape), optical media (eg, DVD), or semiconductor media (eg, solid state disk (SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This application discloses an operation isolation method and a related apparatus. The method can determine a user's identity role from the source direction of a user operation when the user initiates the operation on an electronic device, and then determine from that identity role whether the electronic device responds to the operation. In this way, when multiple users operate the electronic device, the electronic device can intelligently distinguish and isolate operations initiated by different users according to each user's position relative to the device, ensuring that only a user at a designated position can operate the electronic device.

Description

操作隔离方法及相关装置
本申请要求于2022年06月30日提交中国专利局、申请号为202210764282.7、申请名称为“操作隔离方法及相关装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及车辆技术领域,尤其涉及操作隔离方法及相关装置。
背景技术
随着车辆的智能化发展,车辆中配备的设备越来越多。其中,中控设备是车辆上集生活娱乐、车载导航、辅助倒车等功能于一体的多媒体智能终端设备。车内的任意一个成员都可以操控中控设备的显示屏,即中控屏,启动中控设备提供的功能。
发明内容
本申请提供了操作隔离方法及相关装置,实现了根据用户操控电子设备时,发起操作的来源方向,来智能区分并隔离不同用户发起的操作,实现仅位于指定位置的用户能够操控电子设备。
第一方面,本申请实施例提供了一种操作隔离方向,该方法包括:电子设备检测到第一用户发起的第一操作,电子设备执行第一操作的响应;电子设备检测到第二用户发起的第二操作;电子设备拒绝执行第二操作的响应;其中,第一用户向电子设备发起第一操作时的操作来源方向,与,第二用户向电子设备发起第二操作时的操作来源方向,不同,操作来源方向指示了发起操作的用户相对于电子设备的位置。
实施本申请实施例提供的方法,电子设备可以根据用户操作的来源方向,来确定用户相对于电子设备的位置,进而判断是否执行该用户操作的响应,使得在存在多个用户操控电子设备时,电子设备能够根据用户相对于电子设备的位置,智能隔离部分用户发起的操作,避免其他用户干扰指定位置的用户操控电子设备。
结合第一方面,在一些实施方式中,方法应用于车辆中的电子设备,电子设备包括中控屏,中控屏位于车辆内驾驶位和副驾驶位前方的操作台的中间,电子设备检测到第一用户发起的第一操作,具体包括:电子设备检测到第一用户发起的作用于中控屏或物理按键的第一操作,物理按键位于操作台的中间。其中,该物理按键可以置于中控屏的下方,也可以置于中控屏的屏幕边缘,并且该物理按键可以包括一个或多个,物理按键和中控屏的位置,使得驾驶员和副驾乘客都可以通过该物理按键或中控屏操控中控设备。
也就是说,该方法可以运用在驾驶员和副驾乘客对车内中控设备的操控上。在车载场景中,该方法可以区分向车内中控设备发起的操作的用户为驾驶员还是副驾乘客,保证仅驾驶员或副驾乘客能够操控中控设备,且该方法不受人员换座的影响,仅根据向中控设备发起操作的来源方向来识别用户,对用户是驾驶员还是副驾乘客的身份的识别更具有灵活性。
结合第一方面,在一些实施方式中,电子设备检测到第一用户发起的作用于中控屏或物理按键的第一操作之后,该方法还包括:电子设备确定第一来源方向,第一来源方向为第一用户发起第一操作时指向中控屏或物理按键的方向,第一来源方向指示了第一用户相对于中控屏的位置;电子设备确定第一用户相对于中控屏的位置是第一位置并且不是第二位置。
可以看出,在车内仅指定身份的用户才能够操控中控设备,避免了副驾乘客干扰驾驶员对中控设备的操控,或者,车内指定身份的用户不能操控中控设备,使得驾驶员只能将注意力放在开车上,不能分心操控中控设备,保证车辆驾驶的安全。
第二方面,本申请实施例提供了另一种操作隔离方法,方法应用于车辆中的电子设备,电子设备包括中控屏,该方法包括:电子设备检测到第一用户作用于中控屏或物理按键的第一操作,中控屏和物理按键都位于所处车辆内驾驶位和副驾驶位前方的操作台的中间;电子设备确定第一来源方向,第一来源方向为第一用户发起第一操作时指向中控屏或物理按键的方向,第一来源方向指示了第一用户相对于中控屏的位置;电子设备确定第一用户相对于中控屏的位置是第一位置并且不是第二位置,电子设备执行第一操作的响应。
实施第二方面提供的方法,可以根据驾驶员和副驾乘客向中控屏设备发起用户操作时,其操作来源方 向的不同,来识别出向车内中控设备发起操作的用户为驾驶员还是副驾乘客,保证中控设备仅响应于驾驶员或副驾乘客发起的用户操作,且该方法不受人员换座的影响,仅根据向中控设备发起操作的来源方向来识别用户,对用户是驾驶员还是副驾乘客的身份的识别更具有灵活性。
结合第二方面,在一些实施方式中,该方法还包括:电子设备确定第一用户相对于中控屏的位置不是第一位置,电子设备拒绝执行第一操作的响应。
结合第一方面和第二方面,在一些实施方式中,电子设备确定第一来源方向,具体包括:
电子设备根据第一用户发起第一操作时,手指的指向方向和/或摄像头采集的包含第一用户的肢体部位的图像,确定第一来源方向。
当用户发起的操作为作用于中控屏的触摸操作时,可以根据用户发起触摸操作时的手指指向方向来确定操作的来源方向,或者,当用户发起的操作为作用于中控屏的触摸操作或作用于物理按键的按压操作时,可以控制摄像头拍摄用户发起操作时,用户的肢体部位,根据用户的肢体朝向来确定操作的来源方向,或者,电子设备可以结合手指指向方向以及摄像头采集的图像,来更加精准的确定操作的来源方向,实现更加可靠的用户身份识别。
其中,该摄像头可以为电子设备的摄像头,或,其他设备的摄像头,或,独立的摄像头,电子设备可以在检测到用户的操作时,向该摄像头发送拍摄指令,触发摄像头拍摄图像。
结合第一方面和第二方面,在一些实施方式中,手指的指向方向,由电子设备根据第一用户发起第一操作时,作用在中控屏上的触摸面的形状、触摸面下的指纹图案和触摸面下的按压压力变化中的一项或多项确定。
当用户发起的操作为作用于中控屏的触摸操作时,电子设备可以通过指纹传感器、触摸传感器、压力传感器等等传感器设备采集用户发起触摸操作时的一系列信息,例如触摸面的形状、指纹图案、按压压力大小等等,更加精准的确定用户发起操作时的手指指向方向。
结合第一方面和第二方面,在一些实施方式中,电子设备确定第一用户相对于中控屏的位置是第一位置并且不是第二位置之后,方法还包括:
电子设备确定车辆的车辆状态为第一状态,和/或,第一操作作用的第一应用为指定应用或指定类型的应用。
也就是说,电子设备除了判断用户所在的位置是否为指定位置之外,还可以结合车辆的状态,和/或,用户操作的应用来判断是否响应于用户的操作。例如,当驾驶员和副驾乘客都意图启动档位切换功能,中控设备可以根据操作的来源方向,仅响应于驾驶员启动档位切换功能的操作,避免其他乘客干扰驾驶员对车辆的控制,保证车辆的安全。又例如,在车辆运行过程中,中控设备禁止响应于驾驶员启动生活娱乐功能的操作,避免驾驶员在车辆行驶过程中分散注意力,保证车辆的驾驶安全。
结合第一方面和第二方面,在一些实施方式中,第一状态为静止状态或行驶状态,指定类型包括以下一项或多项:驾驶辅助类、车辆控制类、生活娱乐类。
结合第一方面和第二方面,在一些实施方式中,电子设备执行第一操作的响应,具体包括:电子设备启动第一应用或执行第一应用的第一功能。
也就是说,电子设备可以根据用户的身份,来决定用户是否能够使用电子设备中安装的应用,或应用提供的功能,实现电子设备仅为指定位置的用户提供应用服务。
结合第一方面,在一些实施方式中,电子设备拒绝执行第二操作的响应,具体包括:电子设备拒绝启动第一应用或拒绝执行第一功能,或者,拒绝启动第二应用或拒绝执行第二应用的第二功能。
其中,第一用户发起的第一操作和第二用户发起的第二操作可以为同时发起的操作,电子设备可以仅响应于一个用户发起的操作,或者,第一用户发起的第一操作和第二用户发起的第二操作也可以为不同时发起的操作,本申请实施例对此不做限制。
可以看出,当存在多个用户意图使用电子设备中的同一个应用时,电子设备仅允许指定位置的用户使用该应用,或者,当存在多个用户分别意图使用电子设备中的不同应用时,电子设备可以分别控制位于不同位置的用户对不同应用的使用权限,例如位于第一位置的用户能够使用第一应用,位于第二位置的用户不能使用第二应用。
结合第一方面和第二方面,在一些实施方式中,电子设备可以调整第一位置、第一状态、指定应用或指定类型的应用,从而实现更加灵活的操作隔离。
第三方面,本申请实施例提供了一种电子设备,包括存储器,一个或多个处理器,以及一个或多个程序;一个或多个处理器在执行一个或多个程序时,使得该电子设备执行如第一方面或第一方面的任意一种 实施方式,第二方面或第二方面的任意一种实施方式所描述的方法。
第四方面,本申请实施例提供了一种车辆,其特征在于,车辆包括如第一方面或第二方面的中控屏,存储器,一个或多个处理器,以及一个或多个程序;一个或多个处理器在执行一个或多个程序时,使得车辆实现如第一方面或第一方面的任意一种实施方式,第二方面或第二方面的任意一种实施方式所描述的方法。
第五方面,本申请实施例提供了一种车辆,其特征在于,车辆包括如第一方面或第二方面的中控屏和物理按键,存储器,一个或多个处理器,以及一个或多个程序;一个或多个处理器在执行一个或多个程序时,使得车辆实现如第一方面或第一方面的任意一种实施方式,第二方面或第二方面的任意一种实施方式所描述的方法。
第六方面,本申请实施例提供一种计算机可读存储介质,包括指令,当指令在电子设备上运行时,使得计算机执行如第一方面或第一方面的任意一种实施方式,第二方面或第二方面的任意一种实施方式所描述的方法。
本申请实施例提供的操作隔离方法能够根据用户发起对电子设备的操作的来源方向,来识别用户的身份角色,进而确定是否响应于该用户触发的操作,这样,将用户的身份角色与用户所在的位置相关联,当用户所在的位置发生变化时,用户的身份角色也相应发生改变,提供了一种更加灵活的用户身份识别方式,且实现了电子设备仅响应于指定位置的用户发起的操作。
附图说明
图1为本申请实施例提供的一种车内场景示意图;
图2为本申请实施例提供的操作隔离方法的流程示意图;
图3为本申请实施例提供的用户向电子设备发起触摸操作的示意图;
图4为本申请实施例提供的手指方向与水平面的夹角的划分图;
图5为本申请实施例提供的触摸面的形状示意图;
图6为本申请实施例提供的一种指纹图案的示意图;
图7为本申请实施例提供的一种用户发起触摸操作,按压显示屏的示意图;
图8为本申请实施例提供的电子设备采集的包含用户肢体部位的图像;
图9为本申请实施例提供的电子设备的硬件结构示意图;
图10为本申请实施例提供的电子设备的软件结构示意图。
具体实施方式
下面将结合附图对本申请实施例中的技术方案进行清楚、详尽地描述。其中,在本申请实施例的描述中,除非另有说明,“/”表示或的意思,例如,A/B可以表示A或B;文本中的“和/或”仅仅是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况,另外,在本申请实施例的描述中,“多个”是指两个或多于两个。
以下,术语“第一”、“第二”仅用于描述目的,而不能理解为暗示或暗示相对重要性或者隐含指明所指示的技术特征的数量。由此,限定有“第一”、“第二”的特征可以明示或者隐含地包括一个或者更多个该特征,在本申请实施例的描述中,除非另有说明,“多个”的含义是两个或两个以上。
中控设备作为车内功能强大的多媒体智能终端设备,车内多个成员,例如驾驶员和副驾乘客都可以通过直接操控该中控设备的中控屏,启动相应的功能。但是,对于中控设备的部分功能,可能需要区分能够使用该功能的成员。例如,档位切换功能应仅限于驾驶员使用,这是为了避免乘客使用该功能干扰驾驶员操控车辆。又例如,车辆行驶过程中,生活娱乐功能应仅限于乘客使用,这是为了避免驾驶员在开车时使用该功能造成注意力分散,进而引发驾驶事故。
因此,如何根据用户的身份角色,来确定是否响应于该用户触发的操作,是目前亟待解决的问题。
由于中控设备的中控屏位于车内中控台所在的位置,即驾驶位和副驾驶位前方操作台的中间的位置,驾驶员和副驾乘客都可以直接操控该中控设备,对于支持触摸操作的中控设备,驾驶员和副驾乘客都可以发起作用在中控屏的用户操作,通过该用户操作操控中控设备。其中,需要注意的是,驾驶员和副驾乘客 所在的位置分别位于中控屏的两侧,那么,驾驶员和副驾乘客向中控设备发起用户操作时,天然存在操作的来源方向不同的特征。
图1示例性示出了一种车内场景示意图。
如图1所示,车内的驾驶员和副驾乘客都可以操控中控设备,其中,驾驶员位于中控屏的左侧时,该用户操作的来源方向为左边,副驾乘客位于中控屏的右侧时,该用户操作的来源方向为右边。因此,可以根据发起用户操作时,该用户操作的来源方向,来识别用户的身份角色。
基于上述推导过程,本申请实施例提供了一种操作隔离方法,在该方法中,电子设备可以检测到作用于第一应用的用户操作,根据该用户操作确定该用户操作的来源方向,并根据该来源方向确定用户的身份角色,进而根据该身份角色判断第一应用是否执行该用户操作的响应,其中,该响应可以是指启动第一应用或执行第一应用的第一功能。
其中,该电子设备可以是指手机、平板、电脑等等设备,在车载场景中,该电子设备可以具体是指包含中控屏的中控设备或车辆。本申请实施例对该电子设备的具体类型不作限制。
该用户操作可以是指作用于电子设备的显示屏的触摸操作,也可以是指作用于电子设备的机械按键的按压操作,本申请实施例对该用户操作的表现形式不作限制。其中,当用户操作为触摸操作时,电子设备可以根据用户发起触摸操作时,作用在显示屏上的触摸面,来确定用户操作的来源方向。示例性地,可以根据触摸面下的指纹图案、压力大小、触摸形状等等信息,来确定用户的手指方向,进而确定用户操作的来源方向。当用户操作为按压操作时,电子设备可以通过摄像头采集的用户肢体朝向,来确定用户操作的来源方向。具体关于电子设备确定用户操作的来源方向的描述可以参见后续内容,这里先不展开。
在一些实施例中,电子设备还可以进一步结合第一应用的应用类型,和/或,当前的场景来判断第一应用是否执行该用户操作的响应。在车载场景中,该场景可以是指车辆状态,例如静止状态或行驶状态。具体关于应用类型、以及车辆状态的描述可以参见后续内容,这里先不展开。
可以看出,本申请实施例提供的方法能够根据用户操作的来源方向,来判断是否执行该用户操作的响应,使得在存在多个用户操控电子设备时,电子设备能够根据用户相对于电子设备的位置,智能隔离部分用户发起的操作。具体地,在车载场景中,本申请实施例提供的方法能够控制中控设备仅响应于指定用户发起的用户操作。例如,当驾驶员和副驾乘客都意图启动档位切换功能,中控设备可以根据操作的来源方向,仅响应于驾驶员启动档位切换功能的操作,避免其他乘客干扰驾驶员对车辆的控制,保证车辆的安全。又例如,在车辆运行过程中,中控设备禁止响应于驾驶员启动生活娱乐功能的操作,避免驾驶员在车辆行驶过程中分散注意力,保证车辆的驾驶安全。
图2示出了本申请实施例提供的操作隔离方法的流程示意图。
如图2所示,该方法包括:
S101.电子设备检测到作用于第一应用的用户操作。
该用户操作可以是指作用于电子设备的触摸屏的触摸操作,也可以是指作用于电子设备的物理按键的按压操作,本申请实施例对该用户操作的表现形式不作限制。
其中,当该用户操作为作用于电子设备的触摸屏的触摸操作时,该触摸屏固定于某一位置,相对于该触摸屏不同位置的用户,都可以从不同方向发起作用在该触摸屏的触摸操作。当该用户操作为作用于电子设备的物理按键的按压操作时,该物理按键固定于某一位置,相对于该物理按键不同位置的用户,都可以从不同方向发起作用在该物理按键的按压操作。示例性地,在车载场景下,该触摸屏可以是指中控设备的中控屏,该中控屏位于车内驾驶位和副驾驶位前方操作台的中间,即位于驾驶员和副驾乘客的中间,该物理按键同样也可以位于车内驾驶位和副驾驶位前方操作台的中间。
应理解,电子设备可以同时包含触摸屏和物理按键。示例性地,当电子设备同时包含触摸屏和物理按键时,该物理按键位于该触摸屏的附近。另外,本申请实施例对物理按键的数量不做限制,例如,电子设备可以包含多个物理按键,这些物理按键可以围绕在触摸屏的周围。本申请实施例对触摸屏和触摸屏的相对位置不做限制。
该用户操作可用于触发电子设备启动第一应用,显示第一应用提供的用户界面,或者,启动第一应用的功能。例如,电子设备的触摸屏中可以显示有桌面,桌面中可以包括多个应用程序的应用图标,该用户操作可以是指用户的手指作用于某一应用的应用图标的点击操作,该点击操作可用于触发启动该应用。又例如,电子设备的物理按键可用于启动音乐应用,该用户操作可以是指用户作用于该物理按键的按压操作,响应于该操作,电子设备可以启动音乐应用,播放音乐。
S102.电子设备根据该用户操作确定操作来源方向。
在本申请实施例中,实际物理空间中可以存在多个用户,例如第一用户,第二用户操控该电子设备,因此,该用户操作可以为第一用户发起的作用于第一应用的操作,也可以为第二用户发起的作用于第一应用的操作。其中,该第一用户和第二用户分别是指与电子设备的相对位置不同的两个用户,例如,第一用户位于电子设备的左边,第二用户位于电子设备的右边,第一用户发起对第一应用的用户操作时,该用户操作来源于电子设备的左边,第一用户发起对第一应用的用户操作时,该用户操作来源于电子设备的右边。
可以看出,该操作来源方向表示发起该用户操作的用户指向电子设备的方向,即指示了该用户相对于电子设备的位置。进一步地,当用户操作具体为作用于中控屏或物理按键的操作,该操作来源方向表示发起该用户操作的用户指向中控屏或物理按键的方向,即指示了该用户相对于中控屏的位置。
当存在多个所在位置不同的用户向电子设备发起用户操作时,该操作来源方向不同,例如存在第一用户发起第一操作时的操作来源方向为第一来源方向,第二用户发起第二操作是的操作来源方向为第二来源方向,第一来源方向指示了第一用户相对于电子设备的位置,第二来源方向指示了第二用户相对于电子设备的位置,当第一用户与第二用户相对于电子设备的位置不同时,例如,第一用户位于电子设备的左边,第二用户位于电子设备的右边,则第一来源方向和第二来源方向不同,第一来源方向表示用户操作来源于电子设备的左边,第二来源方向表示用户操作来源于电子设备的右边。
其中,电子设备确定操作来源方向的方式可以包括以下两种:
1)根据用户发起用户操作时,作用在触摸屏上的触摸面,来确定操作来源方向
这种情况下,该用户操作为作用于电子设备的触摸屏的触摸操作。具体地,电子设备可以根据该触摸面下的触摸形状、指纹图案、压力大小等等信息来确定用户的手指方向,进而确定该操作来源方向,该手指方向可以是指手指指尖指向的方向。
示例性地,图3示出了用户向电子设备发起触摸操作的示意图。
图3示出了用户位于电子设备的左前方时,作用在电子设备上的一个触摸面A,以及,用户位于电子设备的右前方时,作用在电子设备上的另一个触摸面B。其中,触摸面A指示的手指方向为右上方,触摸面B指示的手指方向为左上方。
可以看出,在用户不是位于该触摸屏的正前方,例如位于该触摸屏的左前方或右前方时,用户作用在电子设备上的触摸面就存在一定的偏转角,该偏转角可以看做是由于用户未正对电子设备时,导致的用户从侧边伸手操控电子设备,用户手指与水平面的夹角。
示例性地,结合图1所示的场景图,从图4可以看出,当手指方向与水平面的夹角位于0°-90°之间时,可以确定该触摸操作是位于电子设备的左侧的用户,即驾驶员发起的操作,当手指方向与水平面的夹角位于90°-180°之间时,可以确定该触摸操作是位于电子设备的右侧的用户,即副驾乘客发起的操作。
也就是说,电子设备可以根据手指方向与水平面的夹角来确定操作来源方向。示例性地,当手指方向与水平面的夹角位于第一范围内时,电子设备确定操作来源方向为第一方向,当手指方向与水平面的夹角为第二范围时,电子设备确定操作来源方向为第二方向。其中,第一范围不同于第二范围,第一方向不同于第二方向,例如,该第一范围可以为0°-90°,该第二范围可以为90°-180°,该第一方向为电子设备的右方,该第二方向为电子设备的左方。
另外,该第一范围和第二范围可以为固定的,也可以为变化的,当第一范围和第二范围固定时,第一范围和第二范围可以为预设的范围。当第一范围和第二范围为变化时,电子设备可以根据用户的操作不断的自学习,调整第一范围和第二范围。这是由于不同用户的身高体型以及操作习惯的不同,可能会导致用户发起用户操作时,手指方向与水平面的夹角会存在一定的变化,以预设的范围来确定操作来源方向可能会不够准确,因此需要根据一些因素,例如用户的身高、体重、位置、手臂长度等等,来动态调整第一范围和第二范围,使得电子设备对用户的身份角色的确定更加准确。
其中,当用户操作为触摸操作时,电子设备可以通过以下三种方式来确定用户的手指方向:
a)根据触摸面的形状来确定手指方向
图5示出了触摸面的形状示意图。从图5可以看出,手指作用在触摸屏上时,从手指指尖到手指指腹方向,触摸面的横向宽度呈现逐渐变宽的趋势。即触摸面的形状通常表现为如图4所示的上窄下宽的椭圆形。
示例性地,电子设备可以根据触摸面上窄下宽的特点,将宽的一侧指向窄的一侧的方向确定为手指方向。
可以理解的是,不限于根据触摸面上窄下宽的特点来确定手指方向。例如,可以根据触摸面的边界曲线的曲率大小来确定手指方向,其中,曲线的曲率越大,则说明曲线的弯曲程度越大,曲线的曲率越小,则说明曲线的弯曲程度越小,可以将曲率最小的点,指向曲率最大的点的方向,确定为手指方向。本申请实施例对根据触摸面的形状来确定手指方向时,所采用的具体实现方式不作限制。
b)根据触摸面的指纹图案来确定手指方向
由于手指指纹自带有一定的方向性,通过指纹图案也可以确定出手指方向。
图6为一种指纹图案的示意图。如图6所示,指纹图案中趋近于直线的纹路可以被称为根基线,指纹纹路的中心可以被称为中心点。示例性地,可以将根基线所在的一边指向中心点的方向确定为手指方向。
可以理解的是,利用指纹图案中的根基线和中心点来确定手指方向的方法只是一种示例,不构成对本申请实施例的限制,在本申请其他实施例中,还可以利用其他通过指纹图案来确定手指方向的方法,这里不再赘述。
c)根据触摸面的压力变化来确定手指方向
由于用户在触摸到显示屏上时,一般从指间到指腹方向,按压压力逐渐递减。
图7示出了一种用户发起触摸操作,按压显示屏的示意图。如图7所示,可以将压力递增的方向确定为手指方向。
可以理解的是,当用户操作为触摸操作时,不限于上述三种方式来确定手指方向,示例性地,可以结合上述多种方式来确定手指方向,例如,结合触摸面的形状以及指纹图案来确定手指方向。这样,可以确定出更加准确的手指方向。本申请实施例对确定手指方向的方式不作限制。
2)根据摄像头采集的用户肢体的朝向,来确定操作来源方向
当用户从不同方向发起对电子设备的操作时,摄像头可以采集到用户从不同方向伸向电子设备的肢体部分,该肢体部分可以为用户的手臂、手掌等等部分。这样,电子设备可以根据该摄像头采集的图像来确定操作来源方向。其中,该摄像头可以为电子设备自带的摄像头,该摄像头可以为挖孔摄像头、屏下摄像头等等,本申请实施例对该摄像头的类型以及位置不作限制。或者,该摄像头可以为其他电子设备的摄像头,电子设备可以在检测到用户操作时,通过其他电子设备的摄像头采集图像。或者,该摄像头可以为一个独立的摄像头,电子设备在检测到用户操作时,可以向该摄像头发送拍摄指令,控制该摄像头触发拍照。
具体地,电子设备可以在检测到用户操作时,控制摄像头触发拍照,获取包含用户肢体部分的图像,通过识别图像中肢体部分的方向,来判断用户发起用户操作时的操作来源方向。进一步地,电子设备还可以通过该图像判断用户操作指向的具体位置。这样,当同时存在多个用户发起操作时,电子设备可以根据不同操作作用的不同位置,来识别接收到的多个操作具体为多个用户中的哪一个用户发起的操作。
例如,假设该摄像头为电子设备的触摸屏下的屏下摄像头,电子设备的触摸屏上可以显示有多个应用的应用图标。当电子设备通过该摄像头采集到两个用户发起对触摸屏的触摸操作,其中,位于触摸屏左侧的用户1发起的触摸操作作用在触摸屏的左上角,位于触摸屏的右侧的用户2发起的触摸操作作用在触摸屏的右上角,则电子设备采集到的图像中,存在一个从左侧指向左上角的手部局部图,以及,一个从右侧指向右上角的手部局部图。电子设备可以根据图像中用户1的肢体朝向,以及用户2的肢体朝向,来确定是启动用户1点击的触摸屏中,左上角显示的应用图标对应的应用,还是启动用户2点击的触摸屏中,右上角显示的应用图标对应的应用。
示例性地,结合图1所示的场景图,图8示出了电子设备(即中控设备)采集的两个用户发起对中控屏的操作时,包含这两个用户的肢体部位的图像。从图7可以看出,该图像中采集有两个手势,手势1来源于电子设备的左侧,该手势1指示了做出该手势的用户的手指指向左上角,手势2来源于电子设备的右侧,该手势2指示了做出该手势的用户的手指指向右上角。结合图1,手势1为驾驶员发起的用户操作,手势2位副驾乘客发起的用户操作。
需要注意的是,当电子设备根据摄像头采集的用户肢体的朝向,来确定操作来源方向时,该用户操作可以为触摸操作,也可以是指作用于按键的按压操作。电子设备可以在检测到用户操作时,触发摄像头采集图像。
可以理解的是,不限于上述提及的根据触摸面或用户肢体的朝向来确定操作来源方向,示例性地,电子设备还可以根据驾驶员的手是否离开方向盘,来判断操作来源方向,例如,当电子设备在检测到用户操作时,通过方向盘上的传感器检测到驾驶员的右手离开了方向盘,则电子设备确定操作来源方向为左侧,或者,电子设备还可以根据接近光传感器对附近物体的检测,来判断操作来源方向,例如,当电子设备在 检测到用户操作时,通过接近光传感器检测到电子设备的左前方有物体,则电子设备可以确定该操作来源方向为左侧。另外,电子设备还可以结合用户发起用户操作时,作用在触摸屏上的触摸面,以及摄像头采集的用户肢体的朝向,来确定操作来源方向,使电子设备确定的操作来源方向更加准确,本申请实施例对电子设备根据用户操作确定操作来源方向的方式不作限制。
S103.电子设备根据操作来源方向确定该用户的身份角色。
由于操作来源方向指示了发起该用户操作的用户相对于电子设备的方向,因此,电子设备可以根据用户相对于电子设备的位置关系,来确定该用户的身份角色。
操作来源方向不同时,用户的身份角色不同。换句话说,用户所处的位置不同时,用户的身份角色不同。例如,在车载场景下,用户的身份角色可以包括:驾驶员、副驾乘客,当用户位于驾驶位,即位于电子设备的左边时,用户的身份角色为驾驶员,当用户位于副驾驶位,即位于电子设备的右边时,用户的身份角色为副驾乘客。
可以理解的是,用户的身份角色不限于上述提及的驾驶员和副驾乘客,操作来源方向与用户的身份角色也不存在绝对的绑定关系,例如,操作来源方向为左边时,用户的身份角色不一定是驾驶员,操作来源方向为右边时,用户的身份角色不一定是副驾乘客,具体应与车内驾驶员和副驾乘客的相对位置进行确定。比如,部分国家或地区规定驾驶员的位置位于右边,部分国家或地区规定驾驶员的位置位于左边,也就是说,在根据操作来源方向确定用户的身份角色时,可以结合不同国家或地区的道路行驶规则来进行确定。
另外,在一些实施例中,电子设备也可以不执行步骤S103,这样,在电子设备根据隔离策略确定是否执行第一操作的响应时,可以根据该第一操作的操作来源方向是否为指定方向,来确定是否执行第一操作的响应,该指定方向可以为某一范围内的方向,例如,用户手指方向与水平面成0°-90°之间时,手指的方向。
S104.电子设备获取车辆状态。
车辆状态可以包括行驶状态、静止状态这两种状态。电子设备可以根据该车辆状态来分析是否执行该用户操作的响应。
示例性地,电子设备可以根据车辆的速度、位置等等信息来确定车辆状态。例如,当车辆的速度大于零时,确定车辆状态为行驶状态,当车辆的速度等于零时,确定车辆状态为静止状态。又例如,当车辆的位置在实时变化时,则确定车辆状态为行驶状态,当车辆的位置未发生改变时,则确定车辆状态为静止状态。
可以理解的是,车辆状态不限于上述提及的行驶状态和静止状态,例如,该可以包括自动驾驶状态、非自动驾驶状态等等,电子设备还可以根据其他方式确定车辆状态,例如根据车内各设备的工作情况、车窗是否开启、油量、电量等等信息来确定车辆状态。本申请实施例对电子设备获取车辆状态的方式不做限制。
另外,步骤S104为可选的步骤,电子设备也可以不获取车辆状态。
S105.电子设备确定第一应用的应用类型。
电子设备可以根据第一应用的应用类型来分析是否执行该用户操作的响应。
在车载场景中,应用类型可以包括:驾驶辅助类、车辆控制类、生活娱乐类等等,其中,驾驶辅助类应用可以包括:导航应用、车辆位置显示应用、倒车影像应用等等,车辆控制类应用可以包括:空调设置应用、车门控制应用、车窗控制应用、驾驶模式切换应用、档位切换应用等等,生活娱乐类应用可以包括:音视频播放应用、游戏应用、电话应用等等。
在一些实施例中,电子设备除了确定第一应用的应用类型外,还可以确定第一应用是否为指定应用。也就是说,电子设备除了根据第一应用是否为指定类型的应用外,还可以根据第一应用是否为指定应用来分析是否执行该用户操作的响应。
需要注意的是,步骤S105为可选的步骤,电子设备也可以不确定第一应用的应用类型。
S106.电子设备根据隔离策略判断第一应用是否执行该用户操作的响应。
隔离策略可以包括用户的身份角色为指定身份角色。具体地,当电子设备确定用户的身份角色为指定身份角色时,电子设备可以控制第一应用执行该用户操作的响应,否则,拒绝执行该用户操作的响应。换 句话说,由于用户相对于电子设备的位置不同时,用户的身份角色不同,用户的身份角色为指定身份角色还可以是指用户所在的位置为指定位置,例如位于第一位置而不是第二位置。
具体地,电子设备可以检测到多个用户发起的用户操作,例如第一用户发起的第一操作,第二用户发起的第二操作,该第一用户向电子设备发起该第一操作时的操作来源方向,与,第二用户向电子设备发起该第二操作时的操作来源方向不同。例如,假设第一用户位于电子设备的左边、第二用户位于电子设备的右边,第一用户发起第一操作时的操作来源方向即为电子设备的左边,第二用户发起第二操作时的操作来源方向即为电子设备的右边。当操作来源方向为左边的用户为指定身份角色的用户,则电子设备执行该第一操作的响应,拒绝执行第二操作的响应。
可以理解的是,该第一操作和第二操作可以为同时向电子设备发起的操作,可以为不同时向电子设备发起的操作。
进一步地,该隔离策略还可以包括:车辆状态为指定状态(例如第一状态),和/或,应用类型为指定类型。具体地,当电子设备确定用户的身份角色为指定身份角色,且车辆状态为指定状态时,控制第一应用执行该用户操作的响应,否则,拒绝执行该用户操作的响应,或者,当电子设备确定用户的身份角色为指定身份角色,且应用类型为指定类型时,控制第一应用执行该用户操作的响应,否则,拒绝执行该用户操作的响应,或者,当电子设备确定用户的身份角色为指定身份角色,且车辆状态为指定状态,且应用类型为指定类型时,控制第一应用执行该用户操作的响应,否则,拒绝执行该用户操作的响应。
应理解,隔离策略可以包括用户的身份角色不是指定身份角色,车辆状态不是指定状态,应用类型不是指定类型。例如,电子设备也可以在确定用户的身份角色不是指定身份角色时,控制第一应用执行该用户操作的响应,否则,拒绝执行该用户操作的响应,车辆状态与应用类型类似,这里不再赘述。
也即是说,电子设备可以根据用户的身份角色、车辆状态、应用类型中的一项或多项是否达到预设要求来确定用户是否能够使用该第一应用。
By way of example, the electronic device may decide whether the first application executes the response to the user operation based on the user's identity role alone. For instance, when the identity role is front-passenger, the electronic device determines that the first application does not execute the response, with the effect that a user in the front passenger seat cannot use the electronic device.
By way of example, the electronic device may decide based on the user's identity role together with the application type. For instance, when the identity role is front-passenger and the first application is a vehicle-control application, the electronic device determines that the first application does not execute the response, with the effect that a user in the front passenger seat cannot use vehicle-control applications and cannot interfere, through the electronic device, with the driver's control of the vehicle. As another instance, when the identity role is driver and the first application is a driving-assistance application, the electronic device determines that the user may use the first application, so that the driver can better control the vehicle through the driving-assistance applications on the electronic device.
By way of example, the electronic device may decide based on the user's identity role, the vehicle state, and the application type. For instance, when the identity role is driver, the first application is an entertainment application, and the vehicle state is the driving state, the electronic device determines that the first application does not execute the response, with the effect that the user in the driver's seat cannot use entertainment applications while the vehicle is moving, avoiding driving accidents caused by driver distraction.
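The three examples above could be expressed as a deny-rule table, sketched minimally below in Python. The rule tuples and role/type/state strings are illustrative assumptions of this description, not the policy the application prescribes.

DENY_RULES = [
    # (role, app_type, vehicle_state); None matches anything.
    ("front_passenger", "vehicle_control", None),
    ("driver", "entertainment", "driving"),
]

def allow_response(role: str, app_type: str, state: str) -> bool:
    """Return True if the first application may execute the response."""
    for r_role, r_type, r_state in DENY_RULES:
        if ((r_role is None or r_role == role)
                and (r_type is None or r_type == app_type)
                and (r_state is None or r_state == state)):
            return False
    return True

print(allow_response("front_passenger", "vehicle_control", "stationary"))  # False
print(allow_response("driver", "driving_assistance", "driving"))           # True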
It should be understood that when the electronic device decides whether the first application executes the response based on the identity role alone, it does not execute steps S104 and S105. Likewise, when it decides based only on the identity role and the vehicle state, it does not execute step S105, and when it decides based only on the identity role and the application type of the first application, it does not execute step S104.
In addition, the isolation policy may be a policy imported into the electronic device in advance by developers, or a policy set by the user of the electronic device. Further, while the electronic device uses the isolation policy to control how applications respond to different users, the user may update or adjust the policy. The embodiments of this application place no restriction on the source of the isolation policy.
When the electronic device determines, according to the isolation policy, that the first application executes the response to the user operation, the electronic device performs step S107; otherwise, it performs step S108.
S107. The electronic device controls the first application to execute the response to the user operation.
Controlling the first application to execute the response specifically means that the electronic device controls the first application to execute a first function of the first application, or that the electronic device displays a first user interface of the first application, and so on. For example, when the first application is a navigation application, the user operation may be an operation that launches the navigation application, in which case executing the response means displaying the user interface provided by the navigation application; or the user operation may be an operation that starts the navigation function of the navigation application, in which case executing the response means starting the navigation function.
It can be seen that when the electronic device, according to the isolation policy, controls the first application to execute the response, the user operation meets the requirements and was initiated by a user permitted to operate the electronic device, so the electronic device may respond to the user operation accordingly. In an in-vehicle scenario, making the electronic device respond only to operations initiated by the user at a specified position prevents users at other positions from interfering with the operations of the user at the specified position, thereby ensuring driving safety and providing more intelligent in-vehicle services.
S108. The electronic device controls the first application to refuse to respond to the user operation.
Controlling the first application to refuse to respond specifically means that the electronic device does not execute the response to the user operation. For example, when the user operation is an operation that launches the first application, refusing to respond means not launching the first application; when the user operation is an operation that starts a first function of the first application, refusing to respond means not starting the first function.
It can be seen that when the electronic device, according to the isolation policy, controls the first application not to execute the response, the user operation does not meet the requirements, or was initiated by a user prohibited from operating the electronic device, so the electronic device may intercept the user operation and refuse to respond to it. In an in-vehicle scenario, intercepting operations initiated by the user at a specified position ensures that that user cannot operate the electronic device, preventing the user at the specified position from interfering with the vehicle's driving and thereby safeguarding driving safety.
In some embodiments, the electronic device may at the same moment detect user operations initiated by multiple users. The electronic device may then determine the operation source direction of each user's operation separately, determine the identity roles of the multiple users, and decide per user, based on the identity role, whether to respond to that user's operation. Further, the electronic device may also determine the vehicle state and/or the application types of the applications the multiple operations act on, and factor the vehicle state and/or application type into the decision. For example, the electronic device may detect a first operation, initiated by a first user, acting on a first application, and a second operation, initiated by a second user, acting on a second application; determine the identity roles of the first and second users from the operation source directions of the first and second operations respectively; determine from the first user's identity role that the first application executes the response to the first operation; and determine from the second user's identity role that the second application does not execute the response to the second operation. In this way, the electronic device can detect multiple users' operations simultaneously and judge each independently, for example responding to only one user's operation, responding to the multiple users' operations simultaneously or in sequence, or refusing to respond to all of them.
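A minimal sketch of this per-operation dispatch follows; the event tuple layout and the two callables are assumptions standing in for the direction-to-role mapping and the policy check described above.

def dispatch(events, role_of, allow):
    """Decide independently, for each detected operation, whether the
    targeted application executes the response."""
    return [(app, allow(role_of(direction), app_type))
            for direction, app, app_type in events]

events = [("left", "Navigation", "driving_assistance"),
          ("right", "DoorControl", "vehicle_control")]
result = dispatch(
    events,
    role_of=lambda d: "driver" if d == "left" else "front_passenger",
    allow=lambda role, t: not (role == "front_passenger" and t == "vehicle_control"),
)
print(result)  # [('Navigation', True), ('DoorControl', False)]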
In summary, the method provided in the embodiments of this application can distinguish operations initiated by users at different positions based on where a user is when operating the electronic device. Because the method identifies a user's identity role from the user's position, it is unaffected by users swapping seats, making the identification of identity roles more flexible and highly reliable.
It should also be noted that the method provided in the embodiments of this application is not limited to in-vehicle scenarios. It should be understood that the method can be applied to any scenario in which users at multiple different positions operate an electronic device and the permissions to operate the device need to be divided by user position; the embodiments of this application place no restriction on this.
FIG. 9 shows a schematic diagram of the hardware structure of the electronic device 100.
The electronic device 100 may be a mobile phone, a tablet computer, a desktop computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a cellular phone, a personal digital assistant (PDA), an augmented reality (AR) device, a virtual reality (VR) device, an artificial intelligence (AI) device, a wearable device, an in-vehicle device, a smart home device, and/or a smart city device; the embodiments of this application place no particular restriction on the specific type of the electronic device. The electronic device 100 may be the electronic device mentioned above.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a headset jack 170D, a sensor module 180, buttons 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identification module (SIM) card interface 195, among others. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, a barometric pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, a vehicle speed sensor 180N, and so on.
It can be understood that the structure illustrated in this embodiment of the invention does not constitute a specific limitation on the electronic device 100. In other embodiments of this application, the electronic device 100 may include more or fewer components than shown, combine certain components, split certain components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units. For example, the processor 110 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. Different processing units may be separate devices or may be integrated into one or more processors.
The controller may generate operation control signals according to instruction opcodes and timing signals, completing the control of instruction fetching and execution.
A memory may also be provided in the processor 110 for storing instructions and data.
The charging management module 140 is configured to receive charging input from a charger.
The power management module 141 is configured to connect the battery 142 and the charging management module 140 to the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140 and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and so on. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance).
The wireless communication function of the electronic device 100 may be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and so on.
The antenna 1 and the antenna 2 are configured to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single communication band or multiple communication bands. Different antennas may also be multiplexed to improve antenna utilization; for example, the antenna 1 may be multiplexed as a diversity antenna for a wireless local area network. In other embodiments, the antennas may be used in combination with a tuning switch.
The mobile communication module 150 may provide solutions for wireless communication, including 2G/3G/4G/5G, applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and so on. The mobile communication module 150 may receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, which is then converted into electromagnetic waves and radiated through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the processor 110. In some embodiments, at least some functional modules of the mobile communication module 150 may be provided in the same device as at least some modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is configured to modulate a low-frequency baseband signal to be transmitted into a medium-to-high-frequency signal. The demodulator is configured to demodulate a received electromagnetic wave signal into a low-frequency baseband signal, which it then passes to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A and the receiver 170B) or displays an image or video through the display 194. In some embodiments, the modem processor may be a separate device. In other embodiments, the modem processor may be independent of the processor 110 and provided in the same device as the mobile communication module 150 or other functional modules.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and so on. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, demodulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110. The wireless communication module 160 may also receive signals to be transmitted from the processor 110, frequency-modulate and amplify them, and radiate them as electromagnetic waves through the antenna 2.
In some embodiments, the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The electronic device 100 implements the display function through the GPU, the display 194, the application processor, and so on. The GPU is a microprocessor for image processing, connecting the display 194 and the application processor. The GPU performs mathematical and geometric computations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display 194 is configured to display images, videos, and the like. The display 194 includes a display panel. In some embodiments, the electronic device may include 1 or N displays 194, where N is a positive integer greater than 1.
In some embodiments, the display 194 may be used to display user interfaces, such as the user interface of the first application. By way of example, in the in-vehicle scenario, the display 194 may be the central control screen of the central control device; the central control screen may provide display only, or it may also detect the user's touch operations, in which case the central control screen is a touchscreen.
The electronic device 100 may implement the photographing function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and so on.
The ISP is configured to process data fed back by the camera 193. For example, when photographing, the shutter opens, light is transmitted through the lens onto the camera's photosensitive element, the light signal is converted into an electrical signal, and the photosensitive element passes the electrical signal to the ISP for processing, converting it into an image visible to the naked eye. The ISP may also perform algorithmic optimization on image noise and brightness, and may optimize parameters such as the exposure and color temperature of the shooting scene.
The camera 193 is configured to capture still images or videos. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element converts the optical signal into an electrical signal and passes it to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV. In some embodiments, the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
In some embodiments, the electronic device 100 may use the camera 193 to capture the user's body parts when the user initiates a user operation, and may determine the operation source direction from the orientation of the user's body parts. For details on how the electronic device 100 determines the operation source direction from the orientation of the user's body captured by the camera, see the description of step S102 above, which is not repeated here.
The digital signal processor is configured to process digital signals; besides digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor performs a Fourier transform on the frequency-point energy.
The video codec is configured to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that it can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG) 1, MPEG2, MPEG3, and MPEG4.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer pattern between human brain neurons, it rapidly processes input information and can continuously self-learn. The NPU enables intelligent cognition applications on the electronic device 100, such as image recognition, face recognition, speech recognition, and text understanding.
The internal memory 121 may include one or more random access memories (RAM) and one or more non-volatile memories (NVM).
The random access memory can be read and written directly by the processor 110; it may be used to store executable programs (such as machine instructions) of the operating system or other running programs, and may also be used to store data of users and applications.
The non-volatile memory may also store executable programs and data of users and applications, which may be loaded into the random access memory in advance for the processor 110 to read and write directly.
The external memory interface 120 may be used to connect an external non-volatile memory to expand the storage capacity of the electronic device 100. The external non-volatile memory communicates with the processor 110 through the external memory interface 120 to implement data storage, for example saving files such as music and videos in the external non-volatile memory.
The electronic device 100 may implement audio functions, such as music playback and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headset jack 170D, the application processor, and so on.
The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and to convert analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals.
The speaker 170A, also called the "loudspeaker", is configured to convert audio electrical signals into sound signals. The electronic device 100 can play music or hands-free calls through the speaker 170A.
The receiver 170B, also called the "earpiece", is configured to convert audio electrical signals into sound signals. When the electronic device 100 answers a call or a voice message, the voice can be heard by placing the receiver 170B close to the ear.
The microphone 170C, also called the "mic" or "mouthpiece", is configured to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C.
The headset jack 170D is configured to connect wired headsets.
The pressure sensor 180A is configured to sense pressure signals and can convert pressure signals into electrical signals. In some embodiments, the pressure sensor 180A may be provided on the display 194. There are many types of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may include at least two parallel plates of conductive material; when a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display 194, the electronic device 100 detects the strength of the touch operation through the pressure sensor 180A, and may also calculate the touch position from the detection signal of the pressure sensor 180A.
In some embodiments, the electronic device 100 may use the pressure sensor 180A to determine the change in pressure strength over the touch surface acting on the electronic device 100 when the user initiates a touch operation. Specifically, the electronic device 100 may take the direction along the touch surface in which the pressure strength gradually increases as the finger direction. For details on how the electronic device 100 determines the finger direction from the pressure change, see the description of step S102 above, which is not repeated here.
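A minimal sketch of one way to compute such an increasing-pressure direction: take the vector from the unweighted centroid of the contact region to its pressure-weighted centroid. The grid encoding of the pressure map is an assumption of this description.

def finger_direction(pressure_grid):
    """Estimate the direction of gradually increasing pressure over the
    touch surface as the vector from the unweighted centroid of the
    contact region to its pressure-weighted centroid."""
    pts = [(x, y, p)
           for y, row in enumerate(pressure_grid)
           for x, p in enumerate(row) if p > 0]
    n = len(pts)
    total = sum(p for _, _, p in pts)
    gx = sum(x for x, _, _ in pts) / n
    gy = sum(y for _, y, _ in pts) / n
    wx = sum(x * p for x, _, p in pts) / total
    wy = sum(y * p for _, y, p in pts) / total
    return (wx - gx, wy - gy)

# Pressure rising from left to right: the direction points toward +x.
grid = [[1, 2, 4],
        [1, 3, 5]]
print(finger_direction(grid))  # (0.4375, 0.0625)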
The gyroscope sensor 180B may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined through the gyroscope sensor 180B. The gyroscope sensor 180B may be used for image stabilization when shooting, and may also be used in navigation and somatosensory gaming scenarios.
The barometric pressure sensor 180C is configured to measure barometric pressure. In some embodiments, the electronic device 100 calculates altitude from the pressure value measured by the barometric pressure sensor 180C to assist positioning and navigation.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may use the magnetic sensor 180D to detect the opening and closing of a flip cover.
The acceleration sensor 180E can detect the magnitude of acceleration of the electronic device 100 in all directions (generally three axes). When the electronic device 100 is stationary, it can detect the magnitude and direction of gravity. It may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and similar applications.
The distance sensor 180F is configured to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The light-emitting diode may be an infrared LED. The electronic device 100 emits infrared light outward through the LED and uses the photodiode to detect infrared light reflected from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 can determine that there is no object nearby. The electronic device 100 may use the proximity light sensor 180G to detect that the user is holding the device close to the ear during a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used for automatic unlocking and locking in leather-case mode and pocket mode.
In some embodiments, the electronic device 100 may use the proximity light sensor 180G to determine the operation source direction when the user initiates a user operation. For example, when the user operation originates from the left side of the electronic device 100, the proximity light sensor 180G may detect an object to the front-left of the electronic device 100; when the operation originates from the right side, the proximity light sensor 180G may detect an object to the front-right.
The ambient light sensor 180L is configured to sense the brightness of ambient light. The electronic device 100 may adaptively adjust the brightness of the display 194 according to the perceived ambient light brightness. The ambient light sensor 180L may also be used to automatically adjust the white balance when photographing, and may cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket, so as to prevent accidental touches.
The fingerprint sensor 180H is configured to collect fingerprints. The electronic device 100 may use the collected fingerprint characteristics to implement fingerprint unlocking, application-lock access, fingerprint photographing, fingerprint call answering, and so on.
In some embodiments, the electronic device 100 may use the fingerprint sensor 180H to collect the fingerprint pattern when the user initiates a user operation, and then determine the finger direction from the fingerprint pattern. For details on how the electronic device 100 determines the finger direction from the fingerprint pattern, see the description of step S102 above, which is not repeated here.
The temperature sensor 180J is configured to detect temperature. In some embodiments, the electronic device 100 executes a temperature handling policy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to lower power consumption and implement thermal protection. In other embodiments, when the temperature falls below another threshold, the electronic device 100 heats the battery 142 to avoid an abnormal shutdown caused by low temperature. In still other embodiments, when the temperature falls below yet another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be provided on the display 194; together, the touch sensor 180K and the display 194 form a touchscreen, also called a "touch-controlled screen". The touch sensor 180K is configured to detect touch operations acting on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display 194. In other embodiments, the touch sensor 180K may also be placed on the surface of the electronic device 100 at a position different from that of the display 194.
In some embodiments, the electronic device 100 may detect a touch operation acting on the first application through the touch sensor 180K.
The bone conduction sensor 180M can acquire vibration signals. In some embodiments, the bone conduction sensor 180M can acquire the vibration signal of the vibrating bone of the human vocal part, and can also contact the human pulse to receive blood-pressure pulse signals. In some embodiments, the bone conduction sensor 180M may also be provided in a headset, combined into a bone conduction headset. The audio module 170 can parse a voice signal from the vibration signal of the vocal-part vibrating bone acquired by the bone conduction sensor 180M, implementing a voice function. The application processor can parse heart-rate information from the blood-pressure pulse signal acquired by the bone conduction sensor 180M, implementing a heart-rate detection function.
The vehicle speed sensor 180N can acquire the driving speed of the vehicle. In some embodiments, the electronic device 100 may determine the vehicle state from the driving speed acquired by the vehicle speed sensor 180N. For details on how the electronic device 100 acquires the vehicle state, see the description of step S104 above, which is not repeated here.
The buttons 190 include a power button, volume buttons, and so on. The buttons 190 may be mechanical buttons or touch-sensitive buttons. The electronic device 100 may receive button input and generate key signal input related to user settings and function control of the electronic device 100. In some embodiments, the electronic device 100 may detect a user operation acting on the first application through the buttons 190.
The motor 191 can generate vibration prompts. The motor 191 may be used for incoming-call vibration prompts as well as touch vibration feedback.
The indicator 192 may be an indicator light and may be used to indicate charging status and battery changes, as well as messages, missed calls, notifications, and so on.
The SIM card interface 195 is configured to connect a SIM card. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 uses an eSIM, i.e., an embedded SIM card; the eSIM card may be embedded in the electronic device 100 and cannot be separated from it.
The electronic device may be a central control device fixed in a vehicle, or may refer to a vehicle that includes a central control screen and, further, physical buttons. The electronic device may be a portable terminal device running iOS, Android, Microsoft, or another operating system, such as a mobile phone, tablet computer, or wearable device, and may also be a non-portable terminal device such as a laptop with a touch-sensitive surface or touch panel, or a desktop computer with a touch-sensitive surface or touch panel. The software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
FIG. 10 is a schematic diagram of the software structure of the electronic device 100 according to an embodiment of this application.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with one another through software interfaces. In some embodiments, the mobile operating system is divided into four layers, from top to bottom: the application layer, the application framework layer/core services layer, the native libraries and runtime, and the kernel layer.
The application layer may include a series of application packages.
As shown in FIG. 10, the application packages may include driving-assistance, vehicle-control, and entertainment applications, among others.
The framework layer provides an application programming interface (API) and a programming framework for applications in the application layer, and includes some predefined functions.
As shown in FIG. 10, the framework layer may include a feature collection module, a role identification module, an access control module, a scenario identification module, and so on.
The feature collection module may be configured to determine the operation source direction from the user operation and send the operation source direction to the role identification module. The operation source direction indicates the direction, relative to the electronic device 100, of the user who initiated the user operation. For details on how the electronic device 100 determines the operation source direction from the user operation, see the description of step S102 above, which is not repeated here.
The role identification module may be configured to determine the user's identity role from the operation source direction and send the identity role to the access control module. For details on how the electronic device 100 determines the identity role from the operation source direction, see the description of step S103 above, which is not repeated here.
The scenario identification module may be configured to determine the vehicle state and send the vehicle state to the access control module. By way of example, the vehicle state includes the driving state and the stationary state. For details on how the electronic device 100 determines the vehicle state, see the description of step S104 above, which is not repeated here.
The access control module may be configured to determine the application type of the application the user operation acts on (for example, the first application) and decide, based on one or more of the application type, the user's identity role, and the vehicle state, whether the application executes the response to the user operation. If so, the access control module forwards the user operation to the application, where the user operation may trigger launching the application or starting the application's first function, and so on; otherwise, the access control module does not forward the user operation to the application. By way of example, when the user's operation acts on a driving-assistance application and the access control module determines that this driving-assistance application is permitted to execute the response, the access control module sends indication information of the operation to that driving-assistance application.
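For illustration, the following minimal Python sketch shows one way these framework-layer modules could be wired together. The class, method, and callback names are assumptions of this description; the embodiments define the modules' responsibilities, not their interfaces.

class AccessControlModule:
    """Gates user operations before they reach the target application."""

    def __init__(self, policy, app_types):
        self.policy = policy        # callable(role, app_type, state) -> bool
        self.app_types = app_types  # application name -> application type
        self.role = None            # updated by the role identification module
        self.state = None           # updated by the scenario identification module

    def on_role(self, role):
        self.role = role

    def on_state(self, state):
        self.state = state

    def on_operation(self, app, operation, deliver):
        """Forward the operation to the application only if the policy allows it."""
        app_type = self.app_types.get(app, "entertainment")
        if self.policy(self.role, app_type, self.state):
            deliver(app, operation)  # e.g. launch the app or start its first function

acm = AccessControlModule(
    policy=lambda role, t, s: not (role == "driver"
                                   and t == "entertainment" and s == "driving"),
    app_types={"Navigation": "driving_assistance", "Music": "entertainment"},
)
acm.on_role("driver")       # from the role identification module
acm.on_state("driving")     # from the scenario identification module
acm.on_operation("Navigation", "tap", lambda a, op: print("deliver", op, "to", a))
acm.on_operation("Music", "tap", lambda a, op: print("deliver", op, "to", a))  # suppressed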
In some embodiments, the framework layer may omit the access control module. After the role identification module determines the user's identity role and the scenario identification module determines the vehicle state, the indication information of the user operation, the user's identity role, and the vehicle state may be sent to the application the user operation acts on, and that application decides whether to execute the response to the user operation based on the user's identity role, the vehicle state, and the application's own application type.
It should be understood that when the electronic device 100 decides whether to execute the response to the user operation based on the identity role alone, the framework layer may omit the scenario identification module. The functional modules contained in the framework layer are only illustrative examples and do not constitute a limitation on the embodiments of this application; in other embodiments of this application, the framework layer may contain more or fewer functional modules, the modules above may be split and combined arbitrarily, and they may also be placed in other layers of the software architecture.
The runtime may refer to all the code libraries, frameworks, and so on that a program needs while running. For the C language, for example, the runtime includes the function libraries needed by C programs; for the Java language, besides the core libraries, the runtime also includes the virtual machine needed to run Java programs. The core libraries may include the functions the Java language needs to call.
The native libraries may include multiple functional modules, for example: a surface manager, media libraries, a 3D graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The surface manager is configured to manage the display subsystem and provide the fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording in many common audio and video formats, as well as still image files. The media libraries can support multiple audio/video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The 3D graphics processing library is configured to implement 3D graphics drawing, image rendering, compositing, layer processing, and so on.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is the layer between hardware and software. The kernel layer contains at least the display driver, the camera driver, the audio driver, and the sensor driver.
The following exemplarily describes the workflow of the electronic device's software and hardware with reference to a photo-capturing scenario.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the timestamp of the touch operation), and the raw input event is stored in the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. Taking the touch operation being a touch tap and the control corresponding to the tap being the camera application icon as an example, the camera application calls the interface of the application framework layer to launch the camera application, then starts the camera driver by calling the kernel layer, and captures a still image or video through the camera 193.
It should be understood that the steps in the foregoing method embodiments may be completed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The method steps disclosed in the embodiments of this application may be embodied directly as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor.
This application further provides an electronic device, which may include a memory and a processor. The memory may be used to store a computer program, and the processor may be used to call the computer program in the memory so that the electronic device performs the method performed by the electronic device 100 in any of the embodiments above.
This application further provides a chip system, which includes at least one processor configured to implement the functions involved in the method performed by the electronic device 100 in any of the embodiments above.
This application further provides a vehicle, which includes a central control screen, a memory, one or more processors, and one or more programs; the one or more processors are configured to implement the method performed by the electronic device 100 in the embodiments above.
This application further provides another vehicle, which includes a central control screen and physical buttons, a memory, one or more processors, and one or more programs; the one or more processors are configured to implement the method performed by the electronic device 100 in the embodiments above.
In one possible design, the chip system further includes a memory configured to store program instructions and data; the memory may be located inside or outside the processor.
The chip system may consist of a chip, or may include a chip and other discrete devices.
Optionally, there may be one or more processors in the chip system. The processor may be implemented in hardware or in software. When implemented in hardware, the processor may be a logic circuit, an integrated circuit, or the like. When implemented in software, the processor may be a general-purpose processor implemented by reading software code stored in a memory.
Optionally, there may also be one or more memories in the chip system. The memory may be integrated with the processor or provided separately from the processor; the embodiments of this application impose no limitation on this. By way of example, the memory may be a non-transitory memory, such as a read-only memory (ROM), which may be integrated with the processor on the same chip or provided on different chips; the embodiments of this application impose no specific limitation on the type of the memory or the arrangement of the memory and the processor.
By way of example, the chip system may be a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a system on chip (SoC), a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), a micro controller unit (MCU), or a programmable logic device (PLD) or other integrated chip.
This application further provides a computer program product, which includes a computer program (which may also be called code or instructions). When the computer program is run, it causes a computer to perform the method performed by the electronic device 100 in any of the embodiments above.
This application further provides a computer-readable storage medium, which stores a computer program (which may also be called code or instructions). When the computer program is run, it causes a computer to perform the method performed by the electronic device 100 in any of the embodiments above.
It should be understood that the processor in the embodiments of this application may be an integrated circuit chip with signal processing capability. In implementation, the steps of the foregoing method embodiments may be completed by integrated logic circuits of hardware in the processor or by instructions in the form of software. The processor may be a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of this application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the methods disclosed in the embodiments of this application may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or register. The storage medium is located in the memory; the processor reads the information in the memory and completes the steps of the above methods in combination with its hardware.
In addition, an embodiment of this application further provides an apparatus. The apparatus may specifically be a component or module, and may include one or more processors and a memory connected to each other. The memory is used to store a computer program; when the computer program is executed by the one or more processors, the apparatus performs the methods in the foregoing method embodiments.
The apparatus, computer-readable storage medium, computer program product, and chip provided in the embodiments of this application are all used to perform the corresponding methods provided above. Therefore, for the beneficial effects they can achieve, refer to the beneficial effects of the corresponding methods provided above, which are not repeated here.
The implementations of this application may be combined arbitrarily to achieve different technical effects.
In the above embodiments, implementation may be wholly or partly by software, hardware, firmware, or any combination thereof. When implemented in software, it may be wholly or partly implemented in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in this application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired means (such as coaxial cable, optical fiber, or digital subscriber line) or wireless means (such as infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center integrating one or more available media. The available media may be magnetic media (e.g., floppy disks, hard disks, magnetic tapes), optical media (e.g., DVDs), or semiconductor media (e.g., solid state disks (SSD)), and so on.
A person of ordinary skill in the art can understand that all or part of the processes in the methods of the above embodiments may be completed by a computer program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The aforementioned storage medium includes various media capable of storing program code, such as a ROM, a random access memory (RAM), a magnetic disk, or an optical disc.
In summary, the above are merely embodiments of the technical solutions of the present invention and are not intended to limit the protection scope of the present invention. Any modification, equivalent replacement, improvement, and the like made according to the disclosure of the present invention shall fall within the protection scope of the present invention.

Claims (13)

  1. An operation isolation method, characterized in that the method comprises:
    an electronic device detecting a first operation initiated by a first user;
    the electronic device executing a response to the first operation;
    the electronic device detecting a second operation initiated by a second user;
    the electronic device refusing to execute a response to the second operation;
    wherein an operation source direction when the first user initiates the first operation toward the electronic device differs from an operation source direction when the second user initiates the second operation toward the electronic device, the operation source direction indicating the position, relative to the electronic device, of the user initiating the operation.
  2. The method according to claim 1, characterized in that the method is applied to an electronic device in a vehicle, the electronic device comprises a central control screen, the central control screen is located in the middle of the console in front of the driver's seat and the front passenger seat in the vehicle, and the electronic device detecting the first operation initiated by the first user specifically comprises:
    the electronic device detecting a first operation, initiated by the first user, acting on the central control screen or on a physical button, the physical button being located in the middle of the console.
  3. The method according to claim 2, characterized in that after the electronic device detects the first operation initiated by the first user acting on the central control screen or the physical button, the method further comprises:
    the electronic device determining a first source direction, the first source direction being the direction pointing toward the central control screen or the physical button when the first user initiates the first operation, and the first source direction indicating the position of the first user relative to the central control screen;
    the electronic device determining that the position of the first user relative to the central control screen is a first position and is not a second position.
  4. The method according to claim 3, characterized in that the electronic device determining the first source direction specifically comprises:
    the electronic device determining the first source direction according to the pointing direction of a finger when the first user initiates the first operation and/or an image, captured by a camera, containing a body part of the first user.
  5. The method according to claim 4, characterized in that the pointing direction of the finger is determined by the electronic device according to one or more of the shape of the touch surface acting on the central control screen, the fingerprint pattern under the touch surface, and the change in pressing pressure under the touch surface when the first user initiates the first operation.
  6. The method according to any one of claims 3 to 5, characterized in that after the electronic device determines that the position of the first user relative to the central control screen is the first position and is not the second position, the method further comprises:
    the electronic device determining that the vehicle state of the vehicle is a first state, and/or that a first application acted on by the first operation is a specified application or an application of a specified type.
  7. The method according to claim 6, characterized in that the first state is a stationary state or a driving state, and the specified type includes one or more of the following: driving-assistance, vehicle-control, and entertainment.
  8. The method according to any one of claims 1 to 7, characterized in that
    the electronic device executing the response to the first operation specifically comprises:
    the electronic device launching a first application or executing a first function of the first application.
  9. The method according to claim 8, characterized in that
    the electronic device refusing to execute the response to the second operation specifically comprises:
    the electronic device refusing to launch the first application or refusing to execute the first function, or refusing to launch a second application or refusing to execute a second function of the second application.
  10. An electronic device, characterized by comprising a memory, one or more processors, and one or more programs; when executing the one or more programs, the one or more processors cause the electronic device to implement the method according to any one of claims 1 to 9.
  11. A vehicle, characterized in that the vehicle comprises the central control screen according to any one of claims 2 to 9, a memory, one or more processors, and one or more programs; when executing the one or more programs, the one or more processors cause the vehicle to implement the method according to any one of claims 2 to 9.
  12. A vehicle, characterized in that the vehicle comprises the central control screen and the physical button according to any one of claims 2 to 9, a memory, one or more processors, and one or more programs; when executing the one or more programs, the one or more processors cause the vehicle to implement the method according to any one of claims 2 to 9.
  13. A computer-readable storage medium comprising instructions, characterized in that when the instructions are run on an electronic device, the instructions cause the electronic device to perform the method according to any one of claims 1 to 9.
PCT/CN2023/103260 2022-06-30 2023-06-28 Operation isolation method and related apparatus WO2024002174A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210764282.7 2022-06-30
CN202210764282.7A CN117369698A (zh) Operation isolation method and related apparatus

Publications (1)

Publication Number Publication Date
WO2024002174A1 true WO2024002174A1 (zh) 2024-01-04

Family

ID=89383114

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/103260 WO2024002174A1 (zh) 2022-06-30 2023-06-28 操作隔离方法及相关装置

Country Status (2)

Country Link
CN (1) CN117369698A (zh)
WO (1) WO2024002174A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11312053A (ja) * 1998-04-30 1999-11-09 Toyota Motor Corp Screen touch type input device
JP2013120434A (ja) * 2011-12-06 2013-06-17 Denso It Laboratory Inc Operator identification apparatus and method, and in-vehicle navigation apparatus
CN104769601A (zh) * 2014-05-27 2015-07-08 华为技术有限公司 Method for identifying user identity and electronic device
CN109298830A (zh) * 2018-11-05 2019-02-01 广州小鹏汽车科技有限公司 Touch control method and apparatus for an in-vehicle central control large screen, and computer-readable storage medium


Also Published As

Publication number Publication date
CN117369698A (zh) 2024-01-09


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23830345

Country of ref document: EP

Kind code of ref document: A1