WO2024066977A1 - Palm-based human-computer interaction method, apparatus, device, medium and program product

Info

Publication number
WO2024066977A1
Authority
WO
WIPO (PCT)
Prior art keywords
palm
action
interaction
proximity sensors
time period
Application number
PCT/CN2023/117199
Other languages
English (en)
French (fr)
Inventor
吕中方
郭润增
黄家宇
李胤恺
Original Assignee
腾讯科技(深圳)有限公司
Application filed by 腾讯科技(深圳)有限公司
Publication of WO2024066977A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition

Definitions

  • the embodiments of the present application relate to the field of computer technology, and in particular to a palm-based human-computer interaction method, device, equipment, medium and program product.
  • Some terminals are provided with a touch screen, and the terminal collects the user's triggering operation through the touch screen and transmits it to the server through the network, thereby realizing human-computer interaction.
  • the present application provides a palm-based human-computer interaction method, device, equipment, medium and program product, and the technical solution is as follows:
  • a palm-based human-computer interaction method is provided, the method being applied to a palm interaction device, the method comprising:
  • Acquire sensor data of the palm collected by at least two proximity sensors wherein the at least two proximity sensors are arranged in an array on the palm interaction device;
  • a palm interaction device comprising:
  • An acquisition module used to acquire sensor data of the palm collected by at least two proximity sensors, wherein the at least two proximity sensors are arranged in an array on the palm interaction device;
  • an identification module configured to identify a suspended interaction action of the palm through the sensor data of the palm collected by the at least two proximity sensors
  • the control module is used to execute the response operation of the suspended interaction action.
  • a palm interaction device which includes: a processor, a memory and at least two proximity sensors, the at least two proximity sensors collect sensor data of the palm, and store the above-mentioned palm sensor data in the memory; at least one computer program is stored in the memory, and the at least one computer program is loaded and executed by the processor to implement the palm-based human-computer interaction method described above.
  • a computer storage medium in which at least one computer program is stored.
  • the at least one computer program is loaded and executed by a processor to implement the palm-based human-computer interaction method as described above.
  • a computer program product which includes a computer program stored in a computer-readable storage medium; the computer program is read and executed from the computer-readable storage medium by a processor of a palm interaction device, so that the palm interaction device performs the palm-based human-computer interaction method described above.
  • At least two proximity sensors are arranged in an array on the palm interaction device.
  • the palm sensor data is collected by the at least two proximity sensors.
  • the palm's suspended interaction action can be identified based on the sensor data, and a response operation of the suspended interaction action is performed.
  • the user can control the palm interaction device without touching it or making any other physical contact, which provides a new way for users to interact with devices and can improve the efficiency of the interaction between users and devices, as illustrated by the sketch below.
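  • The following is a minimal, non-authoritative sketch of the flow summarized above (acquire sensor data, identify the mid-air action, execute the response). It is written in Python purely for illustration: the function names, the response table and the sampling parameters are assumptions and are not specified by this application.

```python
# A minimal sketch of the acquire -> identify -> respond loop summarized above.
# All names here (read_proximity_sensors, classify_action, RESPONSES) are
# illustrative assumptions; the patent does not prescribe an implementation.
import time

RESPONSES = {
    "sweep_left_to_right": lambda: print("switch to the next page"),
    "sweep_right_to_left": lambda: print("switch to the previous page"),
    "tap": lambda: print("confirm the current selection"),
    "back_off": lambda: print("return to the previous interface"),
}

def read_proximity_sensors():
    """Return one distance reading (cm) per proximity sensor in the array.
    Placeholder values; a real device would read its hardware driver here."""
    return {"upper_left": 20.0, "lower_left": 20.0, "upper_right": 20.0, "lower_right": 20.0}

def classify_action(window):
    """Map a short window of readings to an action label or None.
    Placeholder; concrete rules are sketched later in this description."""
    return None

def interaction_loop(sample_period_s=0.02, window_len=25):
    window = []
    while True:
        window.append(read_proximity_sensors())
        window = window[-window_len:]
        action = classify_action(window)
        if action in RESPONSES:
            RESPONSES[action]()   # execute the response operation of the action
            window = []           # avoid re-triggering on the same gesture
        time.sleep(sample_period_s)
```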
  • FIG1 is a schematic diagram of a palm-based human-computer interaction method provided by an exemplary embodiment of the present application.
  • FIG2 is a schematic diagram of the architecture of a computer system provided by an exemplary embodiment of the present application.
  • FIG3 is a flow chart of a palm-based human-computer interaction method provided by an exemplary embodiment of the present application.
  • FIG4 is a flow chart of a palm-based human-computer interaction method provided by an exemplary embodiment of the present application.
  • FIG5 is a schematic diagram of a palm interaction device provided by an exemplary embodiment of the present application.
  • FIG6 is a schematic diagram of a switching operation corresponding to an offset sweeping action provided by an exemplary embodiment of the present application.
  • FIG7 is a schematic diagram of a translation operation corresponding to an offset sweeping action provided by an exemplary embodiment of the present application.
  • FIG8 is a schematic diagram of an operation of adjusting a display ratio corresponding to an offset sweeping action provided by an exemplary embodiment of the present application
  • FIG9 is a schematic diagram of a determination operation corresponding to a long-short tapping action provided by an exemplary embodiment of the present application.
  • FIG10 is a schematic diagram of an exit operation corresponding to a long-short tapping action provided by an exemplary embodiment of the present application.
  • FIG11 is a schematic diagram of the quantity increase and decrease operations corresponding to the long and short tapping actions provided by an exemplary embodiment of the present application.
  • FIG12 is a schematic diagram of a palm interaction device entering an interaction mode provided by an exemplary embodiment of the present application.
  • FIG13 is a schematic diagram of finger gap points in the palm provided by an exemplary embodiment of the present application.
  • FIG14 is a schematic diagram of a cross-device mid-air interaction of a palm-based human-computer interaction method provided by an exemplary embodiment of the present application.
  • FIG15 is a block diagram of a palm interaction device provided by an exemplary embodiment of the present application.
  • FIG16 is a schematic diagram of the structure of a palm interaction device provided by an exemplary embodiment of the present application.
  • FIG17 is a schematic diagram of the arrangement of proximity sensors provided by an exemplary embodiment of the present application.
  • FIG18 is a schematic diagram of the arrangement of proximity sensors provided by an exemplary embodiment of the present application.
  • Artificial Intelligence is the theory, method, technology and application system that uses digital computers or machines controlled by digital computers to simulate, extend and expand human intelligence, perceive the environment, acquire knowledge and use knowledge to obtain the best results.
  • artificial intelligence is a comprehensive technology in computer science that attempts to understand the essence of intelligence and produce a new intelligent machine that can respond in a similar way to human intelligence.
  • Artificial intelligence is to study the design principles and implementation methods of various intelligent machines, so that machines have the functions of perception, reasoning and decision-making.
  • Artificial intelligence technology is a comprehensive discipline that covers a wide range of fields, including both hardware-level and software-level technologies.
  • Basic artificial intelligence technologies generally include sensors, dedicated artificial intelligence chips, cloud computing, distributed storage, big data processing technology, operating/interactive systems, mechatronics, and other technologies.
  • Artificial intelligence software technologies mainly include computer vision technology, speech processing technology, natural language processing technology, and machine learning/deep learning.
  • Cloud technology refers to a hosting technology that unifies hardware, software, network and other resources within a wide area network or local area network to achieve data computing, storage, processing and sharing.
  • Cloud technology is a general term for network technology, information technology, integration technology, management platform technology, application technology and the like based on the cloud computing business model. It can form a resource pool that is used on demand and is flexible and convenient, and cloud computing technology will become an important supporting technology.
  • The background services of technical network systems, such as video websites, picture websites and other portal websites, require a large amount of computing and storage resources.
  • Each item may have its own identification mark, which needs to be transmitted to a background system for logical processing; data of different levels are processed separately, and all kinds of industry data require strong backing system support, which can only be achieved through cloud computing.
  • Cloud computing is a computing model that distributes computing tasks on a resource pool composed of a large number of computers, allowing various application systems to obtain computing power, storage space and information services as needed.
  • The network that provides resources is called a "cloud". From the user's perspective, the resources in the "cloud" are infinitely scalable and can be obtained at any time, used on demand, expanded at any time, and paid for by use.
  • a cloud computing resource pool will be established, referred to as a cloud platform, generally referred to as an IaaS (Infrastructure as a Service) platform, in which various types of virtual resources are deployed for external customers to choose to use.
  • the cloud computing resource pool mainly includes: computing devices (virtualized machines, including operating systems), storage devices, and network devices.
  • the PaaS (Platform as a Service) layer can be deployed on the IaaS layer, and the SaaS (Software as a Service) layer can be deployed on the PaaS layer.
  • SaaS can also be deployed directly on IaaS.
  • PaaS is a platform for software operation, such as databases, Web (World Wide Web) containers, etc.
  • SaaS is a variety of business software, such as web portals, SMS mass senders, etc.
  • SaaS and PaaS are upper layers relative to IaaS.
  • Computer vision technology is a science that studies how to make machines "see". More specifically, it refers to machine vision in which cameras and computers replace human eyes to identify and measure targets, with further graphic processing so that the result becomes an image more suitable for human observation or for transmission to an instrument for detection.
  • Computer vision technology usually includes image processing, image recognition, image semantic understanding, image retrieval, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D technology, virtual reality, augmented reality, simultaneous positioning and map construction, and also includes common biometric recognition technology.
  • An embodiment of the present application provides a schematic diagram of a palm-based human-computer interaction method, as shown in FIG1 .
  • the method is applied to a palm interaction device 5 provided with a proximity sensor, and the method can be executed by the palm interaction device 5 .
  • the palm interaction device 5 includes a proximity sensor 1, a camera 2, a display screen 3, and an aperture 4.
  • the palm interaction device 5 obtains sensor data of the palm collected by at least two proximity sensors arranged in an array; identifies the suspended interaction action of the palm through the sensor data of the palm of at least two proximity sensors; and the palm interaction device 5 performs a response operation of the suspended interaction action.
  • the proximity sensors 1 include four proximity sensors 101 , 102 , 103 , and 104 .
  • the four proximity sensors are arranged in a rectangular shape at the upper left, lower left, upper right, and lower right positions; or, the four proximity sensors are arranged in a diamond shape at the upper, lower, left, and right positions, but are not limited to this and the embodiments of the present application do not make specific limitations on this.
  • the aperture 4 is ring-shaped, and surrounds the proximity sensor 1, the camera 2 and the display screen 3.
  • the proximity sensor 1 is arranged in a rectangular shape at the upper left, lower left, upper right and lower right positions, and the display screen 3 is rectangular.
  • Aperture 4 is used to assist in implementing a palm-based human-computer interaction method.
  • the range of the interaction is determined by the range of aperture 4, and the operator performs a suspended interaction action within the range of aperture 4 or within a first range outside aperture 4, with the center of the circle of aperture 4 as the center; or, during suspended interaction, the interaction result between the palm and the palm interaction device 5 is fed back through the color or brightness change of aperture 4.
  • For example, after the interaction between the palm and the palm interaction device 5 succeeds, aperture 4 flashes; after the interaction between the palm and the palm interaction device 5 fails, aperture 4 flashes continuously.
  • the mid-air interactive action includes an offset sweeping action and/or a long and short slapping action.
  • the suspended interaction action refers to an interaction action triggered in the suspended area 6 above the palm interaction device 5. That is, the suspended interaction action realizes the control operation of the palm interaction device 5 without touching the screen or buttons.
  • the offset sweeping action includes at least one of sweeping from left to right, sweeping from right to left, sweeping from top to bottom, sweeping from bottom to top, and oblique sweeping, but is not limited thereto.
  • the embodiments of the present application do not specifically limit this.
  • Where two proximity sensors are distributed at upper and lower positions, the offset sweeping action includes sweeping from top to bottom and/or sweeping from bottom to top; where two proximity sensors are distributed at left and right positions, the offset sweeping action includes sweeping from left to right and/or sweeping from right to left; where two proximity sensors are distributed at upper-left and lower-right positions, the offset sweeping action includes at least one of sweeping from left to right, sweeping from right to left, sweeping from top to bottom, sweeping from bottom to top, sweeping from upper left to lower right, and sweeping from lower right to upper left; where two proximity sensors are distributed at lower-left and upper-right positions, the offset sweeping action includes at least one of sweeping from left to right, sweeping from right to left, sweeping from top to bottom, sweeping from bottom to top, sweeping from lower left to upper right, and sweeping from upper right to lower left.
  • the three proximity sensors are distributed in a V-shaped position, and the offset sweeping action includes at least one of sweeping from left to right, sweeping from right to left, sweeping from top to bottom, and sweeping from bottom to top; the three proximity sensors are distributed in the upper left, lower left, and upper right positions, or in the lower left, upper right, and lower right positions, and the offset sweeping action includes at least one of sweeping from left to right, sweeping from right to left, sweeping from top to bottom, sweeping from bottom to top, sweeping from bottom left to upper right, and sweeping from upper right to lower left; the three proximity sensors are distributed in the upper left, lower left, and lower right positions, or in the upper left, upper right, and lower right positions, and the offset sweeping action includes at least one of sweeping from left to right, sweeping from right to left, sweeping from top to bottom, sweeping from bottom to top, sweeping from upper left to lower right, and sweeping from lower right to upper left.
  • four proximity sensors are arranged in a rectangle at the upper-left, lower-left, upper-right, and lower-right positions, and the offset sweeping action includes at least one of sweeping from left to right, sweeping from right to left, sweeping from top to bottom, sweeping from bottom to top, sweeping from upper left to lower right, sweeping from lower right to upper left, sweeping from lower left to upper right, and sweeping from upper right to lower left; or, four proximity sensors are arranged in a diamond shape at the upper, lower, left, and right positions, and the offset sweeping action includes at least one of sweeping from top to bottom, sweeping from bottom to top, sweeping from left to right, and sweeping from right to left.
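  • Purely as an illustration of the enumeration above, a lookup table could record which offset sweeping directions each sensor arrangement can resolve. The arrangement and direction names below are assumptions, not terminology from this application.

```python
# Illustrative lookup of which sweep directions each sensor arrangement described
# above can resolve; the arrangement names are assumptions, not patent terminology.
SUPPORTED_SWEEPS = {
    "two_top_bottom": {"top_to_bottom", "bottom_to_top"},
    "two_left_right": {"left_to_right", "right_to_left"},
    "two_upper_left_lower_right": {
        "left_to_right", "right_to_left", "top_to_bottom", "bottom_to_top",
        "upper_left_to_lower_right", "lower_right_to_upper_left",
    },
    "two_lower_left_upper_right": {
        "left_to_right", "right_to_left", "top_to_bottom", "bottom_to_top",
        "lower_left_to_upper_right", "upper_right_to_lower_left",
    },
    "four_rectangle": {
        "left_to_right", "right_to_left", "top_to_bottom", "bottom_to_top",
        "upper_left_to_lower_right", "lower_right_to_upper_left",
        "lower_left_to_upper_right", "upper_right_to_lower_left",
    },
    "four_diamond": {"left_to_right", "right_to_left", "top_to_bottom", "bottom_to_top"},
}

def is_supported(layout, direction):
    """Check whether a given arrangement can report a given offset sweeping direction."""
    return direction in SUPPORTED_SWEEPS.get(layout, set())

print(is_supported("four_diamond", "upper_left_to_lower_right"))   # -> False
```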
  • the near-far slapping action refers to a mid-air action of moving the palm toward or away from the palm interaction device 5 .
  • the palm interaction device 5 detects the offset sweeping action and performs a first response operation of the offset sweeping action.
  • the palm interaction device 5 detects the far-near slapping action and performs a second response operation of the far-near slapping action.
  • the palm interaction device 5 determines the sweeping direction of the offset sweeping action through the action parameter value of the offset sweeping action in the sensor data; and performs a first response operation based on the sweeping direction.
  • the first response operation includes a switching operation; the palm interaction device 5 determines the sweeping direction of the offset sweeping action based on the action parameter value of the offset sweeping action; and the palm interaction device 5 performs a switching operation based on the sweeping direction.
  • the step of determining the sweeping direction includes: in response to the palm entering a first position area, determining the position of the palm in the first position area as the starting point; in response to the palm moving from the first position area into the second position area, determining the position of the palm in the second position area as the end point; when the time for the palm to move from the starting point to the end point is less than a first time threshold, determining the direction from the starting point to the end point as the sweeping direction.
  • the position of the palm in the first position area is determined as the starting point; when the palm enters the second position area, the position of the palm in the second position area is determined as the end point; when the time for the palm to move from the starting point to the end point is less than the first time threshold, the direction pointing from the starting point to the end point is determined as the sweeping direction corresponding to the offset sweeping action.
  • a range whose distance from the plane where the aperture is located is greater than the first distance and less than the second distance is determined as the first measurement area, and the first measurement area is further divided into first measurement areas of proximity sensors on different sides, such as the first measurement area of the proximity sensor 101 on the upper left side, the first measurement area of the proximity sensor 102 on the lower left side, the first measurement area of the proximity sensor 103 on the upper right side, and the first measurement area of the proximity sensor 104 on the lower right side.
  • the above-mentioned first distance is less than the second distance.
  • the first position area includes the first measurement area of the proximity sensor 1 on the first side, but does not include the first measurement area of the proximity sensor 1 on the second side;
  • the second position area includes the first measurement area of the proximity sensor 1 on the second side, but does not include the first measurement area of the proximity sensor 1 on the first side.
  • the first side here is the opposite side to the second side.
  • For example, when the palm first appears in the first measurement area of the left-side proximity sensors, the position of the palm on the left side is determined as the starting point; when the palm then appears in the first measurement area of the right-side proximity sensors, the position of the palm on the right side is determined as the end point; the sweeping direction corresponding to the offset sweeping action is therefore from left to right, that is, the switching operation is switching from left to right.
  • the first measurement area refers to an effective measurement area in which the proximity sensor 1 can measure a target object.
  • the second measurement area refers to a close-range measurement area where the proximity sensor 1 can measure a target object.
  • For example, the close-range measurement area is set to H1, e.g. 0-3 cm, and the effective measurement area is set to H2, e.g. 3-15 cm; that is, the first distance is 3 cm and the second distance is 15 cm.
  • When the palm is located in the left measurement area of the upper-left proximity sensor 101 and the lower-left proximity sensor 102, the difference between the distance measurement values measured by the upper-left proximity sensor 101 and the lower-left proximity sensor 102 is less than a preset value; when the palm is located in the right measurement area of the upper-right proximity sensor 103 and the lower-right proximity sensor 104, the difference between the distance measurement values measured by the upper-right proximity sensor 103 and the lower-right proximity sensor 104 is less than the preset value. The sweep-direction rule built on these measurement areas is sketched below.
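  • The following sketch illustrates the sweep-direction rule described above: the palm's starting point is taken where it first appears in the first measurement area of one side, its end point where it next appears in the first measurement area of the opposite side, and a direction is reported only if the move completes within a time threshold. The threshold value, data layout and helper names are assumptions.

```python
# Sketch of the sweep-direction rule described above. Distances are in cm; the
# effective measurement area H2 spans the first to second distance (3-15 cm in the
# example), and a sweep counts only if it completes within a time threshold.
FIRST_DISTANCE_CM, SECOND_DISTANCE_CM = 3.0, 15.0
FIRST_TIME_THRESHOLD_S = 0.8   # assumed value, not from the patent

def in_effective_area(distance_cm):
    return FIRST_DISTANCE_CM < distance_cm < SECOND_DISTANCE_CM

def side_of_palm(readings):
    """readings: dict of side -> distance, e.g. {'left': 4.2, 'right': 20.0}.
    Returns the single side whose sensors see the palm in their effective area."""
    sides = [side for side, d in readings.items() if in_effective_area(d)]
    return sides[0] if len(sides) == 1 else None

def sweep_direction(samples):
    """samples: list of (timestamp_s, readings). Returns e.g. 'left_to_right' or None."""
    start = end = None
    for t, readings in samples:
        side = side_of_palm(readings)
        if side is None:
            continue
        if start is None:
            start = (t, side)        # palm enters the first position area
        elif side != start[1]:
            end = (t, side)          # palm enters the second position area
            break
    if start and end and (end[0] - start[0]) < FIRST_TIME_THRESHOLD_S:
        return f"{start[1]}_to_{end[1]}"   # direction from starting point to end point
    return None

# Example: palm seen on the left at t = 0.0 s, then on the right at t = 0.4 s
samples = [(0.0, {"left": 5.0, "right": 20.0}), (0.4, {"left": 20.0, "right": 5.0})]
print(sweep_direction(samples))   # -> "left_to_right"
```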
  • the palm interaction device determines the operation category of the far and near tapping action through the action parameter value of the far and near tapping action in the sensor data; and performs the second response operation based on the operation category.
  • the second response operation includes a tapping operation, such as a single-click operation, a double-click operation, etc., which can be used as at least one of the following: a selection operation, a confirmation operation, a return operation, an expansion (such as page expansion, list expansion, detail expansion, etc.) operation.
  • the palm interaction device determines the time point when the palm enters the first measurement area as the first starting time point; in response to the distance measurement values measured by at least two first proximity sensors decreasing at the same time in the first time period and increasing or remaining unchanged at the same time in the second time period, the operation category of the long and short tapping action is determined to be a tapping operation, wherein the first proximity sensor is included in the at least two proximity sensors, the first time period is a time period starting from the first starting time point, the second time period is a time period starting from the end time point of the first time period, and the first time period is greater than the second time period, that is, the duration of the first time period is greater than the duration of the second time period.
  • the second response operation may also include a back-off operation, such as back-off to the desktop, back-off to the previous page, etc.
  • Optionally, the time point when the palm enters the first measurement area is determined as the second starting time point; in response to the distance measurement values measured by at least two second proximity sensors increasing simultaneously within the first time period, or increasing simultaneously within the first time period and decreasing or remaining unchanged simultaneously within the second time period, the operation category of the far-near slapping action is determined to be a back-off operation, wherein the second proximity sensors are included in the at least two proximity sensors, the first time period is a time period starting from the second starting time point, the second time period is a time period starting from the end time point of the first time period, and the duration of the first time period is greater than the duration of the second time period.
  • the second response operation includes a selection operation; the palm interaction device 5 determines the category of the selection operation to which the far and near tapping action belongs based on the action parameter value of the far and near tapping action; the palm interaction device 5 performs the selection operation based on the category of the selection operation.
  • the selection operation may include two types of selection confirmation operation and exit operation.
  • the operator can implement the confirmation operation or exit operation of the system interface selection through the far and near tapping action.
  • the time point when the palm enters the first measurement area is determined as the first starting time point T1; in the first time period T2 after the first starting time point T1, the distance measurement value measured by the proximity sensor 1 decreases at the same time, and in the second time period T3 after the first time period T2, the distance measurement value measured by the proximity sensor 1 increases or remains unchanged at the same time, then it is determined that the selection operation corresponding to the far and near tapping action is a confirmation operation, the first time period is greater than the second time period, and T1, T2, and T3 are positive numbers.
  • the operator can implement selection operations or exit operations on the system interface through the long and short tapping action.
  • the second starting time point t1 is determined; within the first time period t2 after the second starting time point t1, when the distance measurement values measured by the at least two proximity sensors 1 increase at the same time, it is determined that the selection operation corresponding to the long and short tapping action is an exit operation.
  • Alternatively, the second starting time point t1 is determined; within the first time period t2 after the second starting time point t1, the distance measurement values measured by the proximity sensor 1 increase at the same time, and in a second time period t3 after the first time period t2, the distance measurement values measured by the proximity sensor 1 decrease or remain unchanged at the same time; it is then determined that the selection operation corresponding to the far-near tapping action is an exit operation, where t1, t2 and t3 are positive numbers. This confirmation/exit classification is sketched below.
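  • The following sketch illustrates the confirmation/exit (tapping/back-off) classification described above: the per-sensor distance values fall together during the first time period and then rise or hold during the shorter second period for a confirmation, or rise together during the first period for an exit. The window lengths, tolerance and function names are assumptions.

```python
# Sketch of the confirmation / exit classification described above. T2 (the first
# time period) and T3 (the shorter second period) are assumed values; the eps
# tolerance and the function names are also assumptions.
T2_S, T3_S = 0.6, 0.3

def trend(values, eps=0.2):
    """Classify a per-sensor distance sequence (cm) as 'down', 'up' or 'flat'."""
    delta = values[-1] - values[0]
    if delta < -eps:
        return "down"
    if delta > eps:
        return "up"
    return "flat"

def classify_far_near_tap(first_window, second_window):
    """first_window / second_window: one distance sequence per proximity sensor,
    covering the first time period (T2) and the second time period (T3) after the
    palm enters the effective measurement area."""
    first = {trend(seq) for seq in first_window}
    second = {trend(seq) for seq in second_window}
    if first == {"down"} and second <= {"up", "flat"}:
        return "confirmation"   # distances fall together, then rise or hold
    if first == {"up"}:
        return "exit"           # distances rise together (optionally then fall or hold)
    return None

# Example: all four sensors see the distance fall during T2 and hold during T3
first = [[12.0, 9.0, 6.0]] * 4
second = [[6.0, 6.1, 6.0]] * 4
print(classify_far_near_tap(first, second))   # -> "confirmation"
```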
  • the palm interaction device 5 obtains a palm image of the palm through the camera 2; the palm interaction device 5 determines the object identifier corresponding to the palm image based on the palm image; when the object identifier is determined and the palm stays in the second measurement area of the proximity sensor 1 for a time greater than a stay time threshold, the interaction mode is entered.
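  • The following sketch illustrates the interaction-mode entry condition described above, under the assumption that a hypothetical recognition call returns the object identifier and that the dwell threshold is one second; neither of those values is specified by this application.

```python
# Sketch of the interaction-mode entry condition described above: an object
# identifier is resolved from the palm image AND the palm dwells in the close-range
# measurement area (H1, 0-3 cm in the earlier example) longer than a stay-time
# threshold. The recognition call and the threshold value are assumptions.
H1_MAX_CM = 3.0
STAY_TIME_THRESHOLD_S = 1.0   # assumed value

def recognize_object_id(palm_image):
    """Hypothetical palm-image recognition; a real device would call its own model."""
    return None

def should_enter_interaction_mode(palm_image, distance_samples):
    """distance_samples: list of (timestamp_s, nearest_distance_cm) readings."""
    if recognize_object_id(palm_image) is None:
        return False
    in_close_range = [t for t, d in distance_samples if d <= H1_MAX_CM]
    if not in_close_range:
        return False
    dwell = in_close_range[-1] - in_close_range[0]   # time spent inside H1
    return dwell > STAY_TIME_THRESHOLD_S
```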
  • the palm interaction device 5 responds to the offset sweeping action by displaying a first response operation of the offset sweeping action on the display screen 3; the palm interaction device 5 responds to the far-near tapping action by displaying a second response operation of the far-near tapping action on the display screen 3.
  • At least two proximity sensors are arranged in an array on the palm interaction device.
  • the sensor data of the palm is collected by the at least two proximity sensors.
  • the suspended interaction action of the palm can be identified based on the sensor data, and the response operation of the suspended interaction action can be performed.
  • the user can control the palm interaction device without touching or any physical contact, which provides a new interaction mode between the user and the device and can improve the interaction efficiency between the user and the device.
  • Fig. 2 shows a schematic diagram of the architecture of a computer system provided by an embodiment of the present application.
  • the computer system may include: a palm interaction device 100 and a server 200.
  • the palm interaction device 100 can be an electronic device such as a mobile phone, a tablet computer, a vehicle terminal (vehicle machine), a wearable device, a personal computer (PC), an intelligent voice interaction device, an intelligent home appliance, an aircraft, an unmanned vending terminal, etc.
  • the palm interaction device 100 can be installed with a client that runs a target application.
  • the target application can be a reference application based on palm interaction, or it can be other applications that provide palm interaction functions, and this application does not limit this.
  • this application does not limit the form of the target application, including but not limited to applications (Application, App) installed in the palm interaction device 100, applets, etc., and can also be in the form of a web page.
  • Server 200 may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server that provides basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, content delivery networks (CDN), big data, and artificial intelligence platforms.
  • Server 200 may be a backend server of the above-mentioned target application, used to provide backend services for the client of the target application.
  • the above-mentioned server can also be implemented as a node in a blockchain system.
  • Blockchain is a new application mode of computer technologies such as distributed data storage, peer-to-peer transmission, consensus mechanism, encryption algorithm, etc.
  • Blockchain is essentially a decentralized database, a string of data blocks generated by cryptographic methods. Each data block contains a batch of network transaction information, which is used to verify the validity of its information (anti-counterfeiting) and generate the next block.
  • Blockchain can include the blockchain underlying platform, platform product service layer and application service layer.
  • the palm interaction device 100 and the server 200 may communicate via a network, such as a wired or wireless network.
  • the execution subject of each step can be a palm-based interaction device, and the palm-based interaction device refers to an electronic device with data calculation, processing and storage capabilities.
  • the palm-based human-computer interaction method can be executed by the palm-based interaction device 100 (such as the client of the target application installed and running in the palm-based interaction device 100 executes the palm-based human-computer interaction method), or the palm-based human-computer interaction method can be executed by the server 200, or the palm-based interaction device 100 and the server 200 interact and cooperate to execute, and this application does not limit this.
  • FIG3 is a flow chart of a palm-based human-computer interaction method provided by an exemplary embodiment of the present application.
  • the method is used for a palm interaction device having a proximity sensor, that is, the method can be executed by the palm interaction device.
  • the method includes:
  • Step 302 Obtain palm sensor data collected by at least two proximity sensors.
  • At least two proximity sensors are arranged in an array on the palm interaction device.
  • the two proximity sensors are arranged at the upper-left and lower-right positions; alternatively, the two proximity sensors are arranged at the lower-left and upper-right positions; alternatively, the two proximity sensors are arranged at the upper and lower positions; alternatively, the two proximity sensors are arranged at the left and right positions.
  • the three proximity sensors are arranged in a triangular shape; the three proximity sensors are arranged at the upper left, lower left and upper right positions; or, the three proximity sensors are arranged at the upper left, upper right and lower right positions; or, the three proximity sensors are arranged at the upper left, lower left and lower right positions; or, the three proximity sensors are arranged at the lower left, upper right and lower right positions.
  • the four proximity sensors are arranged in a rectangular shape at the upper left, lower left, upper right, and lower right positions; or, the four proximity sensors are arranged in a diamond shape at the upper, lower, left, and right positions, but are not limited to this and the embodiments of the present application do not make specific limitations on this.
  • the two proximity sensors are symmetrically arranged.
  • a proximity sensor is a non-contact sensor that can sense the proximity of an object and/or measure the distance of an object.
  • Step 304 Identify the hovering interaction action of the palm through the sensor data of the palm of at least two proximity sensors.
  • the suspended interaction action refers to the interaction action triggered in the suspended area above the palm interaction device.
  • the palm interaction device detects that the distance between the position of the palm and the plane where at least two proximity sensors are located is less than the interaction distance threshold, and the time the palm stays at the above position is greater than the time threshold, and determines that the suspended interaction action is triggered.
  • the above-mentioned suspended interaction action realizes the control operation of the palm interaction device without touching the screen or buttons.
  • the mid-air interactive action includes at least one of an offset sweeping action and a long and short slapping action.
  • the offset sweeping action refers to a sweeping action performed in the same plane.
  • the offset sweeping action includes at least one of sweeping from left to right, sweeping from right to left, sweeping from top to bottom, sweeping from bottom to top, and sweeping in an oblique direction (such as sweeping from the upper left to the lower right, and/or sweeping from the lower left to the upper right, and/or sweeping from the upper right to the lower left, and/or sweeping from the lower right to the upper left), but is not limited thereto, and the embodiments of the present application do not specifically limit this.
  • the near-far tapping action refers to a mid-air action of moving the palm toward or away from the palm interaction device.
  • the palm interaction device determines, based on sensor data of at least two proximity sensors, that the palm moves in a direction away from the first side proximity sensor and in a direction close to the second side proximity sensor, and that the change in the distance between the palm and the plane where the at least two proximity sensors are located during the movement is less than a change threshold, then identifies the palm's suspended interaction action as an offset sweep action.
  • the second side is the side opposite to the first side; for example, the first side is the left side and the second side is the right side, or the first side is the right side and the second side is the left side; another example, the first side is the upper side and the second side is the lower side, or the first side is the lower side and the second side is the upper side; another example, the first side is the upper left side and the second side is the lower right side, or the first side is the lower right side and the second side is the upper left side; another example, the first side is the lower left side and the second side is the upper right side, or the first side is the upper right side and the second side is the lower left side.
  • the first distance measurement value of the upper left and/or lower left proximity sensor gradually increases, it is determined that the palm moves in a direction away from the left side; when the second distance measurement value of the upper right and/or lower right proximity sensor gradually decreases, it is determined that the palm moves in a direction close to the right side; the change value between the first distance measurement value at the initial moment of the action and the second distance measurement value at the end moment of the action is calculated; when the palm moves in a direction away from the left side or in a direction close to the right side, and the change value is less than a change threshold, it is determined that the suspended interaction action is an offset swiping action from left to right.
  • the first distance measurement value of the upper left and/or lower left proximity sensor gradually decreases, it is determined that the palm moves in a direction closer to the left side; when the second distance measurement value of the upper right and/or lower right proximity sensor gradually increases, it is determined that the palm moves in a direction away from the right side; the change value between the second distance measurement value at the initial moment of the action and the first distance measurement value at the end moment is calculated; when the palm moves in a direction closer to the left side or moves away from the right side, and the change value is less than a change threshold, it is determined that the suspended interaction action is an offset swiping action from right to left.
  • the third distance measurement value of the upper left and/or upper right proximity sensor gradually increases, it is determined that the palm moves in a direction away from the upper side; when the fourth distance measurement value of the lower left and/or lower right proximity sensor gradually decreases, it is determined that the palm moves in a direction close to the lower side; the change value between the third distance measurement value at the initial moment of the action and the fourth distance measurement value at the end moment is calculated; when the palm moves in a direction away from the upper side or in a direction close to the lower side, and the change value is less than a change threshold, it is determined that the suspended interaction action is an offset sweeping action from top to bottom.
  • the third distance measurement value of the upper left and/or upper right proximity sensor gradually decreases, it is determined that the palm moves in a direction closer to the upper side; when the fourth distance measurement value of the lower left and/or lower right proximity sensor gradually increases, it is determined that the palm moves in a direction away from the lower side; the change value between the fourth distance measurement value at the initial moment of the action and the third distance measurement value at the end moment is calculated; when the palm moves in a direction closer to the upper side or in a direction away from the lower side, and the change value is less than a change threshold, it is determined that the suspended interaction action is an offset sweeping action from bottom to top.
  • the fifth distance measurement value of the upper left proximity sensor gradually increases, it is determined that the palm moves in a direction away from the upper left side; when the sixth distance measurement value of the lower right proximity sensor gradually decreases, it is determined that the palm moves in a direction close to the lower right side; the change value between the fifth distance measurement value at the initial moment of the action and the sixth distance measurement value at the end moment is calculated; when the palm moves in a direction away from the upper left side and in a direction close to the lower right side, and the change value is less than the change threshold, it is determined that the suspended interaction action is an offset sweeping action from the upper left to the lower right.
  • the fifth distance measurement value of the upper left proximity sensor gradually decreases, it is determined that the palm moves in a direction close to the upper left side; when the sixth distance measurement value of the lower right proximity sensor gradually increases, it is determined that the palm moves in a direction away from the lower right side; the change value between the sixth distance measurement value at the initial moment of the action and the fifth distance measurement value at the end moment is calculated; when the palm moves in a direction close to the upper left side and in a direction away from the lower right side, and the change value is less than the change threshold, it is determined that the suspended interaction action is an offset swiping action from the lower right to the upper left.
  • the seventh distance measurement value of the upper right proximity sensor gradually increases, it is determined that the palm moves in a direction away from the upper right side; when the eighth distance measurement value of the lower left proximity sensor gradually decreases, it is determined that the palm moves in a direction close to the lower left side; the change value between the seventh distance measurement value at the initial moment of the action and the eighth distance measurement value at the end moment is calculated; when the palm moves in a direction away from the upper right side and in a direction close to the lower left side, and the change value is less than the change threshold, it is determined that the suspended interaction action is an offset swiping action from the upper right to the lower left.
  • the seventh distance measurement value of the upper right proximity sensor gradually decreases, it is determined that the palm moves in a direction close to the upper right side; when the eighth distance measurement value of the lower left proximity sensor gradually increases, it is determined that the palm moves in a direction away from the lower left side; the change value between the eighth distance measurement value at the initial moment of the action and the seventh distance measurement value at the end moment is calculated; when the palm moves in a direction close to the upper right side and in a direction away from the lower left side, and the change value is less than the change threshold, it is determined that the suspended interaction action is an offset swiping action from the lower left to the upper right.
  • the first distance measurement value of the upper left and/or lower left proximity sensor gradually increases, it is determined that the palm moves in a direction away from the left side; when the second distance measurement value of the upper right proximity sensor gradually decreases, it is determined that the palm moves in a direction close to the right side; the change value between the first distance measurement value at the initial moment of the action and the second distance measurement value at the end moment of the action is calculated; when the palm moves in a direction away from the left side or in a direction close to the right side, and the change value is less than a change threshold, it is determined that the suspended interaction action is an offset swiping action from left to right.
  • the first distance measurement value of the upper left proximity sensor gradually increases, it is determined that the palm moves in a direction away from the left side; when the second distance measurement value of the lower right proximity sensor gradually decreases, it is determined that the palm moves in a direction close to the right side; the change value between the first distance measurement value at the initial moment of the action and the second distance measurement value at the end moment of the action is calculated; when the palm moves in a direction away from the left side and in a direction close to the right side, and the change value is less than the change threshold, it is determined that the suspended interaction action is an offset swiping action from left to right.
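  • Taking the left-to-right case above as an example, the following sketch tests whether the left-side readings gradually increase, the right-side readings gradually decrease, and the change value stays below a change threshold. The threshold value and function names are assumptions.

```python
# Sketch of the left-to-right case described above: the left-side distance readings
# rise (palm moving away from the left sensors), the right-side readings fall
# (palm approaching the right sensors), and the change between the reading at the
# start of the action and the reading at its end stays below a change threshold.
CHANGE_THRESHOLD_CM = 2.0   # assumed value

def increasing(seq):
    return all(b >= a for a, b in zip(seq, seq[1:])) and seq[-1] > seq[0]

def decreasing(seq):
    return all(b <= a for a, b in zip(seq, seq[1:])) and seq[-1] < seq[0]

def is_left_to_right_sweep(left_readings, right_readings):
    """left_readings / right_readings: distance sequences (cm) from the left-side and
    right-side proximity sensors over the duration of the gesture."""
    moving_off_left = increasing(left_readings)     # first distance value gradually increases
    moving_onto_right = decreasing(right_readings)  # second distance value gradually decreases
    change = abs(right_readings[-1] - left_readings[0])
    return (moving_off_left or moving_onto_right) and change < CHANGE_THRESHOLD_CM

print(is_left_to_right_sweep([5.0, 8.0, 12.0], [14.0, 9.0, 5.0]))   # -> True
```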
  • the palm interaction device determines that the suspended interaction action is a far-near tapping action.
  • In response to the distance measurement values measured by at least two first proximity sensors decreasing simultaneously within the first time period and increasing or remaining unchanged simultaneously within the second time period, the far-near tapping action is determined to be a tapping operation, and the first proximity sensors are included in the at least two proximity sensors; in response to the distance measurement values measured by at least two second proximity sensors increasing simultaneously within the first time period, and decreasing simultaneously or remaining unchanged within the second time period, the far-near tapping action is determined to be a back-off operation, and the second proximity sensors are included in the at least two proximity sensors; wherein the first time period is a time period starting from a first starting time point, the second time period is a time period starting from the end time point of the first time period, and the duration of the first time period is greater than the duration of the second time period.
  • Step 306 Execute a response operation of the hovering interaction action.
  • the palm interaction device executes the response operation of the suspended interaction action.
  • the response operation of the suspended interaction action refers to the operation of the device in response to the suspended interaction action.
  • different suspended interaction actions correspond to different response operations.
  • the suspended interaction operation is an offset swiping action
  • the response operation can be a page switching operation, a page sliding operation, or a sliding deletion/selection operation, etc.
  • the response operation can be a confirmation operation, or a return operation, etc.
  • the correspondence between the palm interaction device and the response operation can be a system default setting or a user-defined one.
  • For example, when the palm interaction device is set up so that the proximity sensors in the palm interaction device recognize the palm swiping from left to right, the palm interaction device performs a page-turning operation according to the left-to-right swiping action, that is, it switches to displaying the interface on the left.
  • the method provided in this embodiment collects sensor data through at least two proximity sensors arranged in an array on the palm interaction device, and identifies the suspended interaction action of the palm based on the sensor data, and controls the palm interaction device to perform the response operation corresponding to the suspended interaction action according to the suspended interaction action.
  • This application provides a new interaction method, which identifies the suspended interaction operation of the palm through the proximity sensor, so that the user can control the palm interaction device without touching or any physical contact, thereby improving the interaction efficiency.
  • FIG4 is a flow chart of a palm-based human-computer interaction method provided by an exemplary embodiment of the present application.
  • the method is used for a palm interaction device having a proximity sensor, and the method can be executed by the palm interaction device.
  • the method includes:
  • Step 402 Acquire palm sensor data collected by at least two proximity sensors.
  • At least two proximity sensors are arranged in an array on the palm interaction device.
  • Step 404 Identify the mid-air interactive action of the palm through the sensor data of the palm collected by at least two proximity sensors.
  • the suspended interaction action refers to an interaction action triggered in the suspended area above the palm interaction device.
  • the above-mentioned suspended interaction action realizes the control operation of the palm interaction device without touching the screen or buttons.
  • the palm interaction device displays a first response operation of an offset sweeping action on the display screen; and the palm interaction device displays a second response operation of a near or far tapping action on the display screen.
  • A schematic diagram of a palm interaction device is shown in FIG5.
  • the palm interaction device includes four proximity sensors 501, a camera 502, a display screen 503, and an aperture 504.
  • the display screen 503 is disposed on the left side of the palm interaction device, but is not limited thereto.
  • the embodiment of the present application does not specifically limit the position and size of the display screen 503.
  • the palm interaction device recognizes the hovering interaction action of the palm through the proximity sensor 501 and performs a response operation corresponding to the hovering interaction action.
  • Aperture 504 is used to assist in implementing a palm-based human-computer interaction method.
  • the range of the interaction is determined by the range of aperture 504, and the operator performs a suspended interaction action within the range of aperture 504 or within a first range outside aperture 504, with the center of the circle of aperture 504 as the center; or, during suspended interaction, the interaction result between the palm and the palm interaction device is fed back through the color or brightness change of aperture 504.
  • For example, after the interaction between the palm and the palm interaction device succeeds, aperture 504 flashes; after the interaction between the palm and the palm interaction device fails, aperture 504 flashes continuously.
  • Step 406 Execute a first response operation of the offset sweep action.
  • An offset sweeping motion refers to a sweeping motion performed in the same plane.
  • the offset sweeping action includes at least one of sweeping from left to right, sweeping from right to left, sweeping from top to bottom, sweeping from bottom to top, and sweeping diagonally, but is not limited thereto.
  • the embodiments of the present application do not make specific limitations on this.
  • the palm interaction device performs a first response operation of the offset sweep action.
  • the first response operation includes at least one of a switching operation, a page turning operation, a translation operation, and a display ratio adjustment operation, but is not limited thereto and is not specifically limited in the embodiments of the present application.
  • the first response operation includes a switching operation; the palm interaction device determines a sweeping direction corresponding to the offset sweeping action based on an action parameter value of the offset sweeping action; and the palm interaction device performs the switching operation based on the sweeping direction.
  • When the palm enters the first position area, the position of the palm in the first position area is determined as the starting point; when the palm enters the second position area, the position of the palm in the second position area is determined as the end point; when the time for the palm to move from the starting point to the end point is less than a first time threshold, the direction from the starting point to the end point is determined as the sweeping direction of the offset sweeping action.
  • the first position area includes the first measurement area of the proximity sensor on the first side, but does not include the first measurement area of the proximity sensor on the second side;
  • the second position area includes the first measurement area of the proximity sensor on the second side, but does not include the first measurement area of the proximity sensor on the first side.
  • the first measurement area refers to an effective measurement area in which the proximity sensor can measure the target object.
  • the second measurement area refers to a close-range measurement area where the proximity sensor can measure the target object.
  • the close-range measurement area is set to H1, for example, H1 is an area within 0-3 cm away from the proximity sensor; the effective measurement area is set to H2, for example, H2 is an area within 3-15 cm away from the proximity sensor.
  • For example, when the palm is located in the first measurement area of the left-side proximity sensors, the position of the palm on the left is determined as the starting point; when the palm then moves into the first measurement area of the right-side proximity sensors, the position of the palm on the right is determined as the end point; the sweeping direction of the offset sweeping action is therefore from left to right, that is, the switching operation is switching from left to right.
  • the palm interaction device performs a switching operation from left to right based on the switching direction of the offset sweeping action.
  • When the palm is located in the first measurement area of the proximity sensors on the first side, the difference between the distance measurement values measured by the at least two proximity sensors on the first side is less than a preset value; when the palm is located in the first measurement area of the proximity sensors on the second side, the difference between the distance measurement values measured by the at least two proximity sensors on the second side is less than the preset value.
  • A schematic diagram of the switching operation corresponding to the offset sweeping action is shown in FIG6.
  • Four proximity sensors 601 and a display screen 602 are provided in the palm interaction device, and the four proximity sensors 601 are arranged in a rectangular shape in the upper left, lower left, upper right, and lower right positions.
  • the position of the palm at the current moment is determined as the starting point.
  • the position of the palm at the current moment is determined as the end point.
  • the switching direction corresponding to the offset sweeping action is from right to left, that is, the switching operation is switching from right to left.
  • the palm interaction device displays the system interface switching from right to left on the display screen 602 based on the switching direction of the offset swiping action, that is, switching from the "Hello! Administrator" interface to the "Settings! Application Settings" interface.
  • the first response operation includes a translation operation.
  • a schematic diagram of a translation operation corresponding to an offset sweeping action is shown in FIG7 .
  • the palm interaction device includes four proximity sensors 701, a camera 702, a display screen 703, and an aperture 704.
  • the display screen 703 is disposed on the left side of the palm interaction device, but is not limited thereto.
  • the embodiment of the present application does not specifically limit the position and size of the display screen 703.
  • the palm interaction device displays a palm identifier 705 corresponding to the palm on the display screen 703.
  • the movement trajectory of the palm identifier 705 in the display screen 703 is used to indicate an offset sweeping action of the palm in a suspended area outside the palm interaction device.
  • For example, when the palm swipes from left to right in the suspended area outside the palm interaction device, the palm identifier 705 also moves from left to right on the display screen 703, so that the user can know the position of the palm relative to the palm interaction device in real time.
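  • As a rough sketch of how the palm identifier 705 could track the palm, the fragment below estimates a horizontal screen position from the left-side and right-side distance readings; the weighting scheme and the screen width are assumptions, since the embodiments do not specify how the identifier position is computed.

    # Illustrative mapping from left/right sensor distances to the on-screen
    # x-coordinate of the palm identifier; not a method fixed by the embodiments.
    def palm_identifier_x(left_cm, right_cm, screen_width_px):
        total = left_cm + right_cm
        if total == 0:
            return screen_width_px // 2
        # A larger left-side distance means the palm sits further to the right.
        return int((left_cm / total) * screen_width_px)

    # Palm nearer the left sensors -> identifier drawn toward the left edge.
    print(palm_identifier_x(left_cm=4.0, right_cm=12.0, screen_width_px=480))  # 120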
  • the first response operation includes adjusting the display ratio.
  • Exemplarily, a schematic diagram of the display ratio adjustment operation corresponding to the offset sweeping action is shown in FIG. 8.
  • the display screen includes a sliding control, which includes a slide rail 802 and a slider 801.
  • After the palm interaction device obtains the palm image through the camera and determines the object identifier corresponding to the palm image, the palm identifier 803 corresponding to the palm is displayed on the display screen.
  • When the position of the palm relative to the palm interaction device changes, the palm identifier 803 on the display screen moves accordingly.
  • When the palm identifier 803 moves to the position of the slider 801, the display ratio of the local area where the slider 801 is located is enlarged from a first ratio to a second ratio, and the second ratio is greater than the first ratio.
  • By moving the palm identifier 803, the slider 801 is driven along the slide rail 802, thereby realizing the display ratio adjustment operation for the environmental parameter.
  • For example, when the palm identifier 803 moves to the position of the slider 801, the display ratio of the local area where the slider 801 is located is enlarged by 1.5 times; by moving the palm identifier 803, the slider 801 is driven along the slide rail 802; when the palm sweeps from left to right in the suspended area outside the palm interaction device, the slider 801 on the slide rail 802 corresponding to the environmental parameter is likewise adjusted from left to right.
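  • The slider behaviour above can be sketched as follows; the 1.5x factor comes from the example in the text, while the rail geometry and the hover test are assumptions.

    # Sketch of the display-ratio adjustment: the local area around slider 801 is
    # magnified while the palm identifier 803 hovers over it, and sweeping drags
    # the slider along slide rail 802.
    class SliderControl:
        def __init__(self, rail_length_px, magnification=1.5):
            self.rail_length_px = rail_length_px
            self.magnification = magnification   # second ratio / first ratio
            self.slider_pos = 0                  # slider 801 position on rail 802

        def update(self, palm_identifier_x, hovering_over_slider):
            display_ratio = self.magnification if hovering_over_slider else 1.0
            if hovering_over_slider:
                # A left-to-right sweep moves the slider left to right as well.
                self.slider_pos = max(0, min(self.rail_length_px, palm_identifier_x))
            return display_ratio, self.slider_pos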
  • Step 408: Execute a second response operation of the far and near tapping action.
  • the second response operation includes at least one of a confirmation operation, an exit operation, a quantity increase operation, and a quantity decrease operation, but is not limited thereto and the embodiments of the present application do not make specific limitations on this.
  • the second response operation includes a selection operation.
  • the palm interaction device determines the category of the selection operation to which the far and near tapping action belongs based on the action parameter value of the far and near tapping action; and performs the selection operation based on the category of the selection operation.
  • the selection operation includes at least one of a confirmation operation and an exit operation, but is not limited thereto.
  • When the palm simultaneously enters the first measurement area of at least two proximity sensors, the time point when the palm enters the first measurement area is determined as the first starting time point; if, within a first time period after the first starting time point, the distance measurement values measured by the at least two proximity sensors decrease at the same time, and within a second time period after the first time period, the distance measurement values measured by the at least two proximity sensors increase at the same time or remain unchanged, it is determined that the selection operation corresponding to the far and near tapping action is a confirmation operation, and the first time period is greater than the second time period.
  • For example, in a scenario of performing system settings on the palm interaction device, the operator can implement the selection operation on the system interface through the far and near tapping action.
  • When the palm simultaneously enters the first measurement area of at least two proximity sensors, the first starting time point T1 is determined; if, within the first time period T2 after the first starting time point T1, the distance measurement values measured by the at least two proximity sensors decrease at the same time, and within the second time period T3 after the first time period T2, the distance measurement values measured by the at least two proximity sensors increase at the same time or remain unchanged, it is determined that the selection operation corresponding to the far and near tapping action is a confirmation operation.
  • Exemplarily, a schematic diagram of the confirmation operation corresponding to the far and near tapping action is shown in FIG. 9.
  • the palm interaction device 903 is provided with four proximity sensors 901 and a display screen 902, and the four proximity sensors 901 are arranged in a rectangular manner at the upper left, lower left, upper right, and lower right positions.
  • In a scenario of performing system settings on the palm interaction device 903, when the palm simultaneously enters the first measurement area of at least three proximity sensors 901, the first starting time point T1 is determined; if, within the first time period T2 after the first starting time point T1, the distance measurement values measured by the at least three proximity sensors 901 decrease at the same time, and within the second time period T3 after the first time period T2, the distance measurement values measured by the at least three proximity sensors 901 increase at the same time or remain unchanged, that is, within the time period T1+T2 the palm moves toward the palm interaction device 903, and within the time period (T1+T2) to (T1+T2+T3) the palm moves away from the palm interaction device 903 or remains motionless, then the palm interaction device determines that the selection operation corresponding to the far and near tapping action is a confirmation operation, which can also be called an air tap operation.
  • When the palm simultaneously enters the first measurement area of at least two proximity sensors, the time when the palm enters the first measurement area is determined as the second starting time point; if, within the first time period after the second starting time point, the distance measurement values measured by the at least two proximity sensors increase at the same time, it is determined that the selection operation corresponding to the far and near tapping action is the exit operation.
  • Optionally, when the palm simultaneously enters the first measurement area of at least two proximity sensors, the time when the palm enters the first measurement area is determined as the second starting time point; if, within a first time period after the second starting time point, the distance measurement values measured by the at least two proximity sensors increase at the same time, and within a second time period after the first time period, the distance measurement values measured by the at least two proximity sensors decrease at the same time or remain unchanged, the selection operation corresponding to the far and near tapping action is determined as an exit operation.
  • For example, in a scenario of performing system settings on the palm interaction device, the operator can exit the system interface through the far and near tapping action.
  • When the palm simultaneously enters the first measurement area of at least two proximity sensors, the second starting time point t1 is determined; if, within the first time period t2 after the second starting time point t1, the distance measurement values measured by the at least two proximity sensors increase at the same time, it is determined that the selection operation corresponding to the far and near tapping action is the exit operation.
  • Optionally, when the palm simultaneously enters the first measurement area of at least two proximity sensors, the second starting time point t1 is determined; if, within the first time period t2 after the second starting time point t1, the distance measurement values measured by the at least two proximity sensors increase at the same time, and within the second time period t3 after the first time period, the distance measurement values measured by the at least two proximity sensors decrease at the same time or remain unchanged, it is determined that the selection operation corresponding to the far and near tapping action is an exit operation.
  • Exemplarily, a schematic diagram of the exit operation corresponding to the far and near tapping action is shown in FIG. 10.
  • Four proximity sensors 1001 and a display screen 1002 are provided in the palm interaction device, and the four proximity sensors 1001 are arranged in a rectangular shape in the upper left, lower left, upper right, and lower right positions.
  • In a scenario of performing system settings on the palm interaction device 1003, when the palm is simultaneously located in the front half (the part close to the palm interaction device) of the first measurement area of at least three proximity sensors 1001, the second starting time point t1 is determined; if, within the first time period t2 after the second starting time point t1, the distance measurement values measured by the at least three proximity sensors 1001 increase at the same time, that is, within the time period t1+t2 the palm moves away from the palm interaction device 1003, then the palm interaction device determines that the selection operation corresponding to the far and near tapping action is an exit operation, which can also be called an air back-off operation.
  • Further, when the palm is simultaneously located in the front half of the first measurement area of at least three proximity sensors 1001, the second starting time point t1 is determined; if, within the first time period t2 after the second starting time point t1, the distance measurement values measured by the at least three proximity sensors 1001 increase at the same time, and within the second time period t3 after the first time period t2, the distance measurement values measured by the at least three proximity sensors 1001 decrease at the same time or remain unchanged, that is, within the time period t1+t2 the palm moves away from the palm interaction device 1003, and within the time period (t1+t2) to (t1+t2+t3) the palm moves toward the palm interaction device or remains motionless, then the palm interaction device determines that the selection operation corresponding to the far and near tapping action is an exit operation, which can also be called an air back-off operation.
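  • The far and near tapping classification described above (confirmation / air tap versus exit / air back-off) can be sketched as follows; treating the first and second time periods as pre-segmented sample windows and using a simple monotonicity test are assumptions made for illustration only.

    # Sketch only: classify a far-and-near tap from distance samples grouped into
    # the first time period (T2 or t2) and the second time period (T3 or t3).
    def _trend(values, eps=0.2):
        """'down' if every step decreases, 'up' if every step increases, else 'flat'."""
        diffs = [b - a for a, b in zip(values, values[1:])]
        if diffs and all(d < -eps for d in diffs):
            return "down"
        if diffs and all(d > eps for d in diffs):
            return "up"
        return "flat"

    def classify_tap(first_period, second_period, min_sensors=2):
        """Each argument maps sensor name -> list of distance values (cm) in that
        period. Returns 'confirmation', 'exit', or None."""
        first = [_trend(v) for v in first_period.values()]
        second = [_trend(v) for v in second_period.values()]
        approaching = sum(t == "down" for t in first) >= min_sensors
        receding = sum(t == "up" for t in first) >= min_sensors
        then_up_or_flat = sum(t in ("up", "flat") for t in second) >= min_sensors
        then_down_or_flat = sum(t in ("down", "flat") for t in second) >= min_sensors
        if approaching and then_up_or_flat:
            return "confirmation"   # move toward the device, then withdraw or hold
        if receding and then_down_or_flat:
            return "exit"           # move away from the device, then return or hold
        return None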
  • the second response operation includes quantity increase and decrease operations.
  • Exemplarily, a schematic diagram of the quantity increase and decrease operations corresponding to the far and near tapping action is shown in FIG. 11.
  • In a dish-ordering scenario, a dish identifier 1102 is displayed on the display screen 1103.
  • After the palm interaction device 1104 obtains the palm image through the camera and determines the object identifier corresponding to the palm image, the palm identifier 1101 corresponding to the palm is displayed on the display screen 1103.
  • When the position of the palm relative to the palm interaction device 1104 changes, the palm identifier 1101 on the display screen moves accordingly.
  • When the palm identifier 1101 moves to the position of the dish identifier 1102 and stays there for 3 s, the dish identifier 1102 is selected, and the number of portions of the selected dish is increased or decreased through the far and near tapping action.
  • For example, when the palm simultaneously enters the first measurement area of at least two proximity sensors, the first starting time point T1 is determined; if, within the first time period T2 after the first starting time point T1, the distance measurement values measured by the at least two proximity sensors decrease at the same time, and within the second time period T3 after the first time period T2, the distance measurement values measured by the at least two proximity sensors increase at the same time or remain unchanged, it is determined that the selection operation corresponding to the far and near tapping action is a quantity increase operation.
  • When the palm simultaneously enters the first measurement area of at least two proximity sensors, the second starting time point t1 is determined; if, within the first time period t2 after the second starting time point t1, the distance measurement values measured by the at least two proximity sensors increase at the same time, and within the second time period t3 after the first time period, the distance measurement values measured by the at least two proximity sensors decrease at the same time or remain unchanged, it is determined that the selection operation corresponding to the far and near tapping action is a quantity decrease operation.
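  • Reusing the same gesture labels, the portion count of the selected dish can be driven as sketched below; the mapping of 'confirmation' to an increase and 'exit' to a decrease follows the example above, and the function name is illustrative only.

    # Sketch: map the tap classification onto the portion count of the selected dish.
    def adjust_portions(portions, gesture):
        if gesture == "confirmation":   # approach then withdraw: quantity increase
            return portions + 1
        if gesture == "exit":           # withdraw then approach: quantity decrease
            return max(0, portions - 1)
        return portions

    print(adjust_portions(2, "confirmation"))  # 3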
  • In a possible implementation, before obtaining the sensor data collected by the at least two proximity sensors, the method further includes: the palm interaction device obtains a palm image of the palm through a camera; determines the object identifier of the palm based on the palm image; and enters the interaction mode when the object identifier is determined and the time the palm stays in the second measurement area of the proximity sensor is greater than a stay time threshold.
  • Exemplarily, FIG. 12 is a schematic diagram of the palm interaction device entering the interaction mode.
  • the palm interaction device is provided with four proximity sensors 1201, a display screen 1202, and a camera 1203.
  • The four proximity sensors 1201 are arranged in a rectangular shape at the upper left, lower left, upper right, and lower right positions.
  • The palm interaction device obtains a palm image of the palm through the camera 1203; the palm interaction device determines the object identifier of the palm based on the palm image; when the object identifier is determined and the time the palm stays in the second measurement area of the proximity sensor 1201 is greater than the stay time threshold, the interaction mode is entered and the setting interface is displayed on the display screen 1202.
  • For example, the stay time threshold is set to 15 seconds; when the palm interaction device determines the object identifier through the camera 1203 and the palm stays within 3 cm of the proximity sensor 1201 for more than 15 seconds, the palm interaction device enters the interaction mode and displays the setting interface on the display screen 1202.
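  • The gate for entering the interaction mode can be sketched as below; the 3 cm close-range area and the 15-second stay time threshold follow the example above, while the sampling interface is an assumption.

    # Sketch: enter the interaction mode only when the object identifier has been
    # determined AND the palm has stayed in the second (close-range) measurement
    # area longer than the stay time threshold.
    CLOSE_RANGE_CM = 3.0              # second measurement area H1 (example value)
    STAY_TIME_THRESHOLD_S = 15.0      # example value from this description

    def should_enter_interaction_mode(object_id, dwell_samples):
        """dwell_samples: list of (timestamp_s, nearest_sensor_distance_cm)."""
        if object_id is None:
            return False
        dwell_start = None
        for t, d in dwell_samples:
            if d <= CLOSE_RANGE_CM:
                if dwell_start is None:
                    dwell_start = t
                if t - dwell_start > STAY_TIME_THRESHOLD_S:
                    return True
            else:
                dwell_start = None    # the palm left the close-range area
        return False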
  • the palm image is a palm image of the target object whose identity is to be determined.
  • the palm image contains a palm, which is the palm of the target object whose identity is to be verified.
  • the palm image may also contain other information, such as the fingers of the target object, the scene when the target object's palm is photographed, etc.
  • Exemplarily, as shown in the schematic diagram of finger gap points in FIG. 13, the finger gap point is the first finger gap point 1301 between the index finger and the middle finger, or the second finger gap point 1302 between the middle finger and the ring finger, or the third finger gap point 1303 between the ring finger and the little finger.
  • Since the palm area may appear anywhere in the palm image, finger gap point detection is performed on the palm image to obtain at least one finger gap point, so that the palm area can subsequently be determined based on the at least one finger gap point.
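  • As an illustration of how a palm area might be derived from the detected finger gap points, the sketch below builds a square region under the gap line; the detector itself and the sizing heuristic are assumptions, since the text only states that the palm area is determined from at least one finger gap point.

    # Sketch: derive a square palm region from finger gap points (e.g. 1301-1303).
    def palm_region_from_gaps(gap_points, scale=2.0):
        """gap_points: list of (x, y) pixel coordinates; returns (x0, y0, x1, y1)."""
        xs = [p[0] for p in gap_points]
        ys = [p[1] for p in gap_points]
        center_x, gap_y = sum(xs) / len(xs), sum(ys) / len(ys)
        width = max(max(xs) - min(xs), 1) * scale
        # The palm lies below the finger gaps in image coordinates (assumption).
        return (int(center_x - width / 2), int(gap_y),
                int(center_x + width / 2), int(gap_y + width))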
  • the palm interaction device captures the subject's palm to obtain a palm image.
  • the palm image includes the palm, which may be the subject's left palm or the subject's right palm.
  • the palm interaction device is an Internet of Things device, which captures the subject's left palm through a camera to obtain a palm image.
  • the camera includes a color camera and an infrared camera.
  • the palm interaction device obtains a color image of the palm through the color camera; obtains an infrared image of the same palm through the infrared camera; the palm interaction device performs palm recognition processing based on the color image and the infrared image, and determines the object identifier corresponding to the palm image, that is, determines the object identifier of the palm.
  • the color image refers to the image obtained by imaging the palm with a color camera based on natural light.
  • Infrared images refer to images obtained by an infrared camera imaging the palm based on infrared light.
  • the palm interaction device further performs palm comparison and recognition processing based on the color image and the infrared image to determine the object identification of the palm, wherein the palm comparison and recognition processing refers to comparing and recognizing the features of the palm area with the preset palm features in the database.
  • the preset palm feature is the palm feature of the palm corresponding to the stored object identifier.
  • Each preset palm feature has a corresponding object identifier, indicating that the preset palm feature belongs to the object identifier and is the palm feature of the palm of the object.
  • the object identifier can be any object identifier, such as the object identifier registered in the payment application, or the object identifier registered in the enterprise.
  • the palm interaction device includes a database, which includes a plurality of preset palm features and an object identifier corresponding to each preset palm feature.
  • the preset palm features and the object identifiers may correspond one to one, or one object identifier may correspond to at least two preset palm features.
  • the object identifier is determined by performing palm comparison and recognition processing on the infrared image, the color image and the preset palm features in the database, thereby realizing identity authentication of the object.
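  • The palm comparison and recognition step can be sketched as a nearest-template lookup; cosine similarity and the 0.8 threshold are assumptions, since the embodiments do not fix the comparison metric.

    # Sketch: compare an extracted palm feature vector against the preset palm
    # features stored in the database and return the matching object identifier.
    import math

    def cosine_similarity(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(y * y for y in b))
        return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

    def match_object_id(palm_feature, preset_features, threshold=0.8):
        """preset_features: dict of object_id -> list of preset feature vectors
        (one object identifier may correspond to at least two preset features)."""
        best_id, best_score = None, threshold
        for object_id, vectors in preset_features.items():
            for vector in vectors:
                score = cosine_similarity(palm_feature, vector)
                if score > best_score:
                    best_id, best_score = object_id, score
        return best_id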
  • In the method provided in this embodiment, sensor data is collected through at least two proximity sensors arranged in an array on the palm interaction device, the offset sweeping action and the far and near tapping action of the palm are recognized based on the sensor data, and the palm interaction device is controlled to perform switching, confirmation, and exit operations according to the offset sweeping action and the far and near tapping action.
  • This application provides a new interaction method, which identifies the suspended interaction operation of the palm through the proximity sensor, so that the user can control the palm interaction device without touching or any physical contact, thereby improving the interaction efficiency.
  • The palm, as a type of biometric feature, is biologically unique and distinguishable. Compared with facial recognition, which is currently widely used in the fields of identity verification, payment, access control, and ride-hailing, the palm is not affected by makeup, masks, sunglasses, and the like, which can improve the accuracy of object identity verification. In some scenarios, such as epidemic prevention and control, a mask must be worn to cover the mouth and nose; in this case, using a palm image for identity verification can be a better choice.
  • FIG. 14 is a schematic diagram of cross-device hovering interaction of a palm-based human-computer interaction method provided by an exemplary embodiment of the present application.
  • The method involves a first palm interaction device 1401, a second palm interaction device 1403, and an application server 1402.
  • the first palm interaction device 1401 is installed with an application, for example, a payment application.
  • the first palm interaction device 1401 logs in to the payment application based on the object identifier and establishes a communication connection with the application server 1402.
  • the first palm interaction device 1401 and the application server 1402 can interact, and the object can control the first palm interaction device 1401 through a suspended interaction action, for example, to set up the payment application;
  • the second palm interaction device 1403 is installed with a payment application.
  • the second palm interaction device 1403 logs in to the payment application based on the merchant identifier and establishes a communication connection with the application server 1402.
  • the second palm interaction device 1403 and the application server 1402 can interact, and the object can control the second palm interaction device 1403 through a suspended interaction action, for example, to set up the payment application.
  • the cross-device suspended interaction process includes:
  • The object holds the first palm interaction device 1401 at home, uses it to photograph the object's own palm to obtain a palm image, logs in to the payment application based on the object identifier, and sends a palm image registration request to the application server 1402, where the request carries the object identifier and the palm image.
  • the application server 1402 receives the palm image registration request sent by the first palm interaction device 1401, processes the palm image, obtains the palm features of the palm image, stores the palm features corresponding to the object identifier, and sends a palm image binding success notification to the first palm interaction device 1401.
  • the palm feature is used as a preset palm feature, and the corresponding object identifier can be determined later by using the stored preset palm feature.
  • the first palm interaction device 1401 receives the palm image binding success notification, displays the palm image binding success notification, and prompts that the object palm image is bound to the object identifier.
  • the object completes palm image registration through the interaction between its first palm interaction device 1401 and the application server 1402, and can subsequently quickly realize recognition through the palm image, thereby controlling the first palm interaction device 1401 to enter the human-computer interaction mode, and controlling the first palm interaction device 1401 through the suspended interaction action.
  • the second palm interaction device 1403 captures the palm of the subject to obtain a palm image, and based on the payment application logged in to the second palm interaction device 1403, sends a payment application setting request to the application server 1402, and the payment application setting request carries the identifier of the second palm interaction device 1403, the suspended interaction action and the palm image.
  • After receiving the payment application setting request, the application server 1402 performs palm comparison and recognition processing on the palm image, determines the object identifier of the palm image, completes the payment application setting based on the suspended interaction action, and after the setting is completed, sends a payment application setting completion notification to the second palm interaction device 1403.
  • the object can directly set the payment application on the second palm interaction device 1403 without the need for the object to register the palm image on the second palm interaction device 1403, thereby achieving the effect of cross-device palm-based human-computer interaction and improving convenience.
  • the second palm interaction device 1403 displays the payment application setup completion notification to prompt the object that the payment application setup is complete, so that the object knows that the second palm interaction device 1403 has been setup complete.
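  • The two requests exchanged in the cross-device flow above can be sketched as simple payloads; the field names are illustrative assumptions, and only the carried contents (object identifier, device identifier, suspended interaction action, palm image) come from the description.

    # Sketch of the payloads in the cross-device flow; field names are illustrative.
    import json

    def build_registration_request(object_id, palm_image_bytes):
        # Sent by the first palm interaction device 1401 to the application server 1402.
        return {"type": "palm_image_registration",
                "object_id": object_id,
                "palm_image": palm_image_bytes.hex()}

    def build_setting_request(device_id, hover_action, palm_image_bytes):
        # Sent by the second palm interaction device 1403; the server resolves the
        # object identifier from the palm image and applies the hovering action.
        return {"type": "payment_application_setting",
                "device_id": device_id,
                "hover_action": hover_action,     # e.g. "sweep_left_to_right"
                "palm_image": palm_image_bytes.hex()}

    print(json.dumps(build_setting_request("device-1403", "air_tap", b"\x01\x02")))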
  • the palm-based human-computer interaction function provided in the embodiment of the present application is only one function of the palm interaction device.
  • the application scenarios of the palm interaction device include but are not limited to the following scenarios:
  • the merchant's palm interaction device obtains a palm image of the object by photographing the palm of the object, and uses the palm-based human-computer interaction method provided in the embodiment of the present application to determine the target object identifier of the palm image, and transfers part of the resources in the resource account corresponding to the target object identifier to the merchant's resource account, thereby realizing automatic payment through the palm.
  • The object can complete identity registration using a personal mobile phone at home or in another private space and bind the object's account with the object's palm image; after the binding, the palm image of the object can be recognized on the in-store device, the object's account can be determined, and payment can be made directly through that account.
  • the palm interaction device obtains a palm image of the object by photographing the palm of the object, and uses the palm-based human-computer interaction method provided in the embodiment of the present application to determine the target object identifier of the palm image, establish a punch-in mark for the target object identifier, and determine that the target object identifier has completed the work punch-in at the current time.
  • the method provided in the embodiments of the present application can also be applied to other scenarios that require palm-based human-computer interaction, and the embodiments of the present application are not limited to specific application scenarios.
  • FIG15 shows a schematic diagram of the structure of a palm interaction device provided by an exemplary embodiment of the present application.
  • the device can be implemented as all or part of a palm interaction device through software, hardware, or a combination of both.
  • the device includes:
  • An acquisition module 1501 is used to acquire palm sensor data collected by at least two proximity sensors, where the at least two proximity sensors are arranged in an array on the palm interaction device;
  • the recognition module 1502 is used to recognize the hovering interaction action of the palm through the sensor data of the palm collected by at least two proximity sensors;
  • the control module 1503 is used to execute the response operation of the hovering interaction action.
  • The hovering interaction action includes at least one of an offset sweeping action and a far and near tapping action;
  • the control module 1503 is also used to execute a first response operation of the offset sweeping action;
  • the control module 1503 is also used to execute a second response operation of the far and near tapping action.
  • control module 1503 is further used to determine the sweeping direction of the offset sweeping action through the action parameter value of the offset sweeping action in the sensor data; and based on the sweeping direction, control the palm interaction device to perform a first response operation.
  • control module 1503 is also used to determine the position of the palm in the first position area as the starting point in response to the palm entering the first position area; determine the position of the palm in the second position area as the end point in response to the palm moving from the first position area to the second position area; and determine the direction from the starting point to the end point as the sweeping direction when the time for the palm to move from the starting point to the end point is less than a first time threshold.
  • control module 1503 is further configured to determine the operation category of the far and near tapping action through the action parameter values of the far and near tapping action in the sensor data; and control the palm interaction device to perform a second response operation based on the operation category.
  • The control module 1503 is further used to determine the time point when the palm enters the first measurement area as the first starting time point in response to the palm entering the first measurement area of at least two proximity sensors; and, in response to the distance measurement values measured by at least two first proximity sensors decreasing simultaneously within the first time period and increasing simultaneously or remaining unchanged within the second time period, determine that the operation category of the far and near tapping action is a tapping operation, wherein the first proximity sensors are included in the at least two proximity sensors, the first time period is a time period starting from the first starting time point, the second time period is a time period starting from the end time point of the first time period, and the first time period is greater than the second time period.
  • The control module 1503 is further used to determine the time point when the palm enters the first measurement area as the second starting time point in response to the palm entering the first measurement area of at least two proximity sensors; and, in response to the distance measurement values measured by at least two second proximity sensors increasing simultaneously within the first time period, determine that the operation category of the far and near tapping action is a back-off operation, wherein the second proximity sensors are included in the at least two proximity sensors, and the first time period is a time period starting from the second starting time point.
  • The control module 1503 is also used to determine that the operation category of the far and near tapping action is a back-off operation in response to the distance measurement values measured by at least two second proximity sensors increasing simultaneously within a first time period and decreasing simultaneously or remaining unchanged within a second time period, wherein the second time period is a time period starting from the end time point of the first time period, and the first time period is greater than the second time period.
  • the acquisition module 1501 is further configured to acquire a palm image of the palm through a camera;
  • the recognition module 1502 is further configured to determine an object identifier of the palm based on the palm image;
  • the control module 1503 is further configured to enter the interactive mode when the object identifier is determined and the time the palm stays in the second measurement area of the proximity sensor is greater than a stay time threshold.
  • the acquisition module 1501 is further configured to acquire a color image of the palm through a color camera, where the color image refers to an image obtained by imaging the palm with a color camera based on natural light;
  • the acquisition module 1501 is further used to acquire an infrared image of the same palm through an infrared camera.
  • the infrared image refers to an image obtained by imaging the palm with infrared light using an infrared camera.
  • the recognition module 1502 is further used to perform palm recognition processing based on the color image and the infrared image to determine the object identifier of the palm.
  • The display module 1504 is used to display the first response operation of the offset sweeping action on the display screen;
  • the display module 1504 is used to display the second response operation of the far and near tapping action on the display screen.
  • FIG16 shows a block diagram of a palm interaction device 1600 according to an exemplary embodiment of the present application.
  • the palm interaction device 1600 may be a smart phone, a tablet computer, a desktop computer, a smart watch, a robot, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, etc.
  • the palm interaction device 1600 includes: a processor 1601 and a memory 1602 .
  • the processor 1601 may include one or more processing cores, such as a 4-core processor, an 8-core processor, etc.
  • the processor 1601 may be implemented in at least one hardware form of DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array).
  • the processor 1601 may also include a main processor and a coprocessor.
  • the main processor is a processor for processing data in the awake state, also known as a CPU (Central Processing Unit); the coprocessor is a low-power processor for processing data in the standby state.
  • the processor 1601 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen.
  • the processor 1601 may also include an AI (Artificial Intelligence) processor, which is used to process computing operations related to machine learning.
  • the memory 1602 may include one or more computer-readable storage media, which may be tangible and non-transitory.
  • the memory 1602 may also include a high-speed random access memory, and a non-volatile memory, such as one or more disk storage devices, flash memory storage devices.
  • the non-transitory computer-readable storage medium in the memory 1602 is used to store at least one instruction, which is used to be executed by the processor 1601 to implement the palm-based human-computer interaction method provided in the present application.
  • the palm interaction device 1600 may further include: a peripheral device interface 1603 and at least one peripheral device.
  • the peripheral device includes: at least one of a display screen 1604 and a camera assembly 1605 .
  • the peripheral device interface 1603 may be used to connect at least one peripheral device related to I/O (Input/Output) to the processor 1601 and the memory 1602.
  • the processor 1601, the memory 1602, and the peripheral device interface 1603 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1601, the memory 1602, and the peripheral device interface 1603 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
  • Display screen 1604 is used to display UI (User Interface).
  • the UI may include graphics, text, icons, videos and any combination thereof.
  • display screen 1604 may be a touch screen, which also has the ability to collect touch signals on the surface of the touch screen or above the surface.
  • the touch signal may be input as a control signal to processor 1601 for processing.
  • the touch screen is used to provide virtual buttons and/or virtual keyboards, also known as soft buttons and/or soft keyboards.
  • display screen 1604 may be one, disposed on the front panel of palm interaction device 1600; in other embodiments, display screen 1604 may be at least two, disposed on different surfaces of palm interaction device 1600 or in a folding design; in other embodiments, display screen 1604 may be a flexible display disposed on the bend of palm interaction device 1600.
  • the display screen 1604 can be arranged on a curved surface or a folded surface. Even more, the display screen 1604 can be arranged in a non-rectangular irregular shape, that is, a special-shaped screen.
  • the display screen 1604 can be made of materials such as LCD (Liquid Crystal Display) and OLED (Organic Light-Emitting Diode).
  • the camera assembly 1605 is used to capture images or videos.
  • the camera assembly 1605 includes a front camera and a rear camera.
  • the front camera is used to realize video calls or selfies
  • the rear camera is used to realize the shooting of photos or videos.
  • In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, and a wide-angle camera, so that the background blur function can be realized by fusing the main camera and the depth-of-field camera, and panoramic shooting and VR (Virtual Reality) shooting can be realized by fusing the main camera and the wide-angle camera.
  • the camera assembly 1605 may also include a flash.
  • the flash can be a single-color temperature flash or a dual-color temperature flash.
  • the dual-color temperature flash refers to a combination of a warm light flash and a cold light flash, which can be used for light compensation at different color temperatures.
  • the camera may include a color camera and an infrared camera.
  • the palm interaction device 1600 further includes one or more sensors 1606 .
  • the one or more sensors 1606 include but are not limited to: a proximity sensor 1607 .
  • the proximity sensor 1607 also known as a distance sensor, is usually arranged on the front of the palm interaction device 1600.
  • the proximity sensor 1607 is used to collect the distance between the user and the front of the palm interaction device 1600.
  • at least two proximity sensors 1607 are arranged in an array on the palm interaction device 1600.
  • the two proximity sensors are arranged at the upper left and lower right positions; or, the two proximity sensors are arranged at the lower left and upper right positions; or, the two proximity sensors are arranged at the upper and lower positions; or, the two proximity sensors are arranged at the left and right positions.
  • As shown in FIG. 17, the two proximity sensors are a left proximity sensor 1701 and a right proximity sensor 1702.
  • In the case of three proximity sensors, the three proximity sensors may be arranged in a triangular pattern with one sensor above and two below; or in the upper left, lower left, and upper right positions; or in the upper left, upper right, and lower right positions; or in the upper left, lower left, and lower right positions; or in the lower left, upper right, and lower right positions.
  • As shown in FIG. 18, the three proximity sensors are respectively an upper sensor 1801, a lower left sensor 1802, and a lower right sensor 1803.
  • the offset swiping action in the left-right direction is recognized by the lower left proximity sensor and the lower right proximity sensor
  • the offset swiping action in the upper-lower direction is recognized by the upper proximity sensor, the lower left proximity sensor, and the lower right proximity sensor.
  • the four proximity sensors are arranged in a rectangular shape at the upper left, lower left, upper right, and lower right positions; or, the four proximity sensors are arranged in a diamond shape at the upper, lower, left, and right positions.
  • the four proximity sensors are respectively an upper left proximity sensor 101, a lower left proximity sensor 102, an upper right proximity sensor 103, and a lower right proximity sensor 104.
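  • For reference, the arrangements above can be written down as normalized sensor coordinates (unit square, origin at the lower left); the exact spacing is an assumption, and only the relative positions follow the description and FIGs. 17-18.

    # Illustrative sensor layouts; coordinates are normalized and assumed.
    SENSOR_LAYOUTS = {
        2: {"left": (0.0, 0.5), "right": (1.0, 0.5)},                      # FIG. 17
        3: {"upper": (0.5, 1.0), "lower_left": (0.0, 0.0),
            "lower_right": (1.0, 0.0)},                                    # FIG. 18
        4: {"upper_left": (0.0, 1.0), "upper_right": (1.0, 1.0),
            "lower_left": (0.0, 0.0), "lower_right": (1.0, 0.0)},          # rectangle
    }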
  • The structure shown in FIG. 16 does not constitute a limitation on the computer device 1600, which may include more or fewer components than shown in the figure, combine certain components, or adopt a different component arrangement.
  • An embodiment of the present application also provides a palm interaction device, which includes a processor, a memory and at least two proximity sensors; the at least two proximity sensors collect sensor data of the palm, and store the above-mentioned palm sensor data in the memory; the memory stores at least one program, and the at least one program is loaded and executed by the processor to implement the palm-based human-computer interaction method provided by the above-mentioned method embodiments.
  • An embodiment of the present application also provides a computer-readable storage medium, in which at least one computer program is stored.
  • the at least one computer program is loaded and executed by a processor to implement the palm-based human-computer interaction method provided by the above-mentioned method embodiments.
  • An embodiment of the present application also provides a computer program product, which includes a computer program, and the computer program is stored in a computer-readable storage medium; the computer program is read from the computer-readable storage medium and executed by a processor of a palm interaction device, so that the palm interaction device implements the palm-based human-computer interaction method provided by the above-mentioned method embodiments.

Abstract

A palm-based human-computer interaction method, apparatus, device, medium and program product, belonging to the field of computer technology. The palm-based human-computer interaction method includes: a palm interaction device (5, 100, 903, 1003, 1104, 1401, 1403) acquires sensor data of a palm collected by at least two proximity sensors (1, 101, 102, 103, 104, 501, 601, 701, 901, 1001, 1201), the at least two proximity sensors (1, 101, 102, 103, 104, 501, 601, 701, 901, 1001, 1201) being arranged in an array on the palm interaction device (5, 100, 903, 1003, 1104, 1401, 1402) (302); recognizes a hovering interaction action of the palm through the sensor data of the palm collected by the at least two proximity sensors (1, 101, 102, 103, 104, 501, 601, 701, 901, 1001, 1201) (304); and executes a response operation of the hovering interaction action (306). A new interaction manner is provided, in which the hovering interaction operation of the palm is recognized through the proximity sensors (1, 101, 102, 103, 104, 501, 601, 701, 901, 1001, 1201), so that the user can control the palm interaction device (5, 100, 903, 1003, 1104, 1401, 1402) without touching it or any physical contact, thereby improving interaction efficiency.

Description

基于掌部的人机交互方法、装置、设备、介质及程序产品
本申请要求于2022年09月29日提交的申请号为202211196351.5、发明名称为“基于掌部的人机交互方法、装置、设备、介质及程序产品”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请实施例涉及计算机技术领域,特别涉及一种基于掌部的人机交互方法、装置、设备、介质及程序产品。
背景技术
随着计算机技术的发展,人机交互的方式越来越多样。
在一些终端中设置有触摸屏,终端通过触摸屏采集用户的触发操作并通过网络传输到服务器,从而实现人机交互。
在用户与一些具有小型屏幕的终端进行交互时,如何保证用户和终端实现准确的交互,是亟待解决的重要问题。
发明内容
本申请提供了一种基于掌部的人机交互方法、装置、设备、介质及程序产品,所述技术方案如下:
根据本申请的一方面,提供了一种基于掌部的人机交互方法,所述方法应用于掌部交互设备,所述方法包括:
获取至少两个接近传感器(Proximity sensor)采集的所述掌部的传感器数据,所述至少两个接近传感器在所述掌部交互设备上阵列设置;
通过所述至少两个接近传感器采集的所述掌部的传感器数据,识别悬空交互动作;
执行所述悬空交互动作的响应操作。
根据本申请的一方面,提供了一种掌部交互装置,所述装置包括:
获取模块,用于获取至少两个接近传感器采集的掌部的传感器数据,所述至少两个接近传感器在掌部交互设备上阵列设置;
识别模块,用于通过所述至少两个接近传感器采集的所述掌部的传感器数据,识别掌部的悬空交互动作;
控制模块,用于执行所述悬空交互动作的响应操作。
根据本申请的另一方面,提供了一种掌部交互设备,该掌部交互设备包括:处理器、存储器和至少两个接近传感器,至少两个接近传感器采集掌部的传感器数据,且将上述掌部的传感器数据存储于存储器中;存储器中存储有至少一条计算机程序,至少一条计算机程序由处理器加载并执行以实现如上方面所述的基于掌部的人机交互方法。
根据本申请的另一方面,提供了一种计算机存储介质,计算机可读存储介质中存储有至少一条计算机程序,至少一条计算机程序由处理器加载并执行以实现如上方面所述的基于掌部的人机交互方法。
根据本申请的另一方面,提供了一种计算机程序产品,上述计算机程序产品包括计算机程序,所述计算机程序存储在计算机可读存储介质中;所述计算机程序由掌部交互设备的处理器从所述计算机可读存储介质读取并执行,使得所述掌部交互设备执行如上方面所述的基于掌部的人机交互方法。
本申请提供的技术方案带来的有益效果至少包括:
掌部交互设备上阵列设置有至少两个接近传感器,通过上述至少两个接近传感器采集掌部的传感器数据,可以基于传感器数据识别出掌部的悬空交互动作,执行悬空交互动作的响应操作,无需触摸或任何物理性地接触便可实现用户对掌部交互设备的控制,提供了一种用 户与设备之间的新交互方式,可以提高用户与设备之间的交互效率。
附图说明
图1是本申请一个示例性实施例提供的一种基于掌部的人机交互方法的示意图;
图2是本申请一个示例性实施例提供的计算机系统的架构示意图;
图3是本申请一个示例性实施例提供的基于掌部的人机交互方法的流程图;
图4是本申请一个示例性实施例提供的基于掌部的人机交互方法的流程图;
图5是本申请一个示例性实施例提供的掌部交互设备的示意图;
图6是本申请一个示例性实施例提供的偏移挥扫动作对应的切换操作的示意图;
图7是本申请一个示例性实施例提供的偏移挥扫动作对应的平移操作的示意图;
图8是本申请一个示例性实施例提供的偏移挥扫动作对应的调节显示比例操作的示意图;
图9是本申请一个示例性实施例提供的远近拍击动作对应的确定操作的示意图;
图10是本申请一个示例性实施例提供的远近拍击动作对应的退出操作的示意图;
图11是本申请一个示例性实施例提供的远近拍击动作对应的数量增、减操作的示意图;
图12是本申请一个示例性实施例提供的掌部交互设备进入交互模式的示意图;
图13是本申请一个示例性实施例提供的手掌中手指缝点的示意图;
图14是本申请一个示例性实施例提供的基于掌部的人机交互方法的跨设备悬空交互的示意图;
图15是本申请一个示例性实施例提供的掌部交互装置的框图;
图16是本申请一个示例性实施例提供的掌部交互设备的结构示意图;
图17是本申请一个示例性实施例提供的接近传感器的排布示意图;
图18是本申请一个示例性实施例提供的接近传感器的排布示意图。
具体实施方式
首先对本申请实施例涉及的若干个名词进行简介:
人工智能(Artificial Intelligence,AI)是利用数字计算机或者数字计算机控制的机器模拟、延伸和扩展人的智能,感知环境、获取知识并使用知识获得最佳结果的理论、方法、技术及应用系统。换句话说,人工智能是计算机科学的一个综合技术,它企图了解智能的实质,并生产出一种新的能以人类智能相似的方式做出反应的智能机器。人工智能也就是研究各种智能机器的设计原理与实现方法,使机器具有感知、推理与决策的功能。
人工智能技术是一门综合学科,涉及领域广泛,既有硬件层面的技术也有软件层面的技术。人工智能基础技术一般包括如传感器、专用人工智能芯片、云计算、分布式存储、大数据处理技术、操作/交互系统、机电一体化等技术。人工智能软件技术主要包括计算机视觉技术、语音处理技术、自然语言处理技术以及机器学习/深度学习等几大方向。
云技术(Cloud technology)是指在广域网或局域网内将硬件、软件、网络等系列资源统一起来,实现数据的计算、储存、处理和共享的一种托管技术。
云技术(Cloud technology)基于云计算商业模式应用的网络技术、信息技术、整合技术、管理平台技术、应用技术等的总称,可以组成资源池,按需所用,灵活便利。云计算技术将变成重要支撑。技术网络系统的后台服务需要大量的计算、存储资源,如视频网站、图片类网站和更多的门户网站。伴随着互联网行业的高度发展和应用,将来每个物品都有可能存在自己的识别标志,都需要传输到后台系统进行逻辑处理,不同程度级别的数据将会分开处理,各类行业数据皆需要强大的系统后盾支撑,只能通过云计算来实现。
云计算(Cloud computing)是一种计算模式,它将计算任务分布在大量计算机构成的资源池上,使各种应用系统能够根据需要获取计算力、存储空间和信息服务。提供资源的网络被称为“云”。“云”中的资源在使用者看来是可以无限扩展的,并且可以随时获取,按需使用,随时扩展,按使用付费。
作为云计算的基础能力提供商,会建立云计算资源池,简称云平台,一般称为IaaS(Infrastructure as a Service,基础设施即服务)平台,在资源池中部署多种类型的虚拟资源,供外部客户选择使用。云计算资源池中主要包括:计算设备(为虚拟化机器,包含操作系统)、存储设备、网络设备。
按照逻辑功能划分,在IaaS层上可以部署PaaS(Platform as a Service,平台即服务)层,PaaS层之上再部署SaaS(Software as a Service,软件即服务)层,也可以直接将SaaS部署在IaaS上。PaaS为软件运行的平台,如数据库、Web(World Wide Web,全球广域网)容器等。SaaS为各式各样的业务软件,如web门户网站、短信群发器等。一般来说,SaaS和PaaS相对于IaaS是上层。
计算机视觉技术(Computer Vision,CV)是一门研究如何使机器“看”的科学,更进一步的说,就是指用摄影机和电脑代替人眼对目标进行识别和测量等机器视觉,并进一步做图形处理,使电脑处理成为更适合人眼观察或传送给仪器检测的图像。作为一个科学学科,计算机视觉研究相关的理论和技术,试图建立能够从图像或者多维数据中获取信息的人工智能系统。计算机视觉技术通常包括图像处理、图像识别、图像语义理解、图像检索、视频处理、视频语义理解、视频内容/行为识别、三维物体重建、3D技术、虚拟现实、增强现实、同步定位与地图构建等技术,还包括常见的生物特征识别技术。
本申请实施例提供了一种基于掌部的人机交互方法的示意图,如图1所示,该方法应用于设置有接近传感器的掌部交互设备5,该方法可以由掌部交互设备5执行。
示例性地,如图1中的(a)图所示,掌部交互设备5包括接近传感器1、摄像头2、显示屏3和光圈4。掌部交互设备5获取阵列设置的至少两个接近传感器采集的掌部的传感器数据;通过至少两个接近传感器的掌部的传感器数据,识别掌部的悬空交互动作;掌部交互设备5执行悬空交互动作的响应操作。
接近传感器1包括四个,分别为左上接近传感器101、左下接近传感器102、右上接近传感器103、右下接近传感器104。
可选地,在有四个接近传感器的情况下,四个接近传感器以左上、左下、右上、右下位置矩形排布;或,四个接近传感器以上、下、左、右位置菱形排布,但不限于此,本申请实施例对此不作具体限定。
光圈4呈环状,光圈4包围接近传感器1、摄像头2和显示屏3,接近传感器1以左上、左下、右上、右下位置矩形排布,显示屏3呈矩形。
光圈4用于辅助实现基于掌部的人机交互方法。例如,通过光圈4的范围确定交互的范围,操作者以光圈4的圆心为中心,在光圈4范围内或光圈4以外的第一范围内进行悬空交互动作;或,在悬空交互时,通过光圈4的颜色或亮度变化,反馈出掌部与掌部交互设备5之间的交互结果。比如,在掌部与掌部交互设备5之间交互成功后,光圈4闪一下;在掌部与掌部交互设备5之间交互失败后,光圈4连续闪动。
示例性地,悬空交互动作包括偏移挥扫动作和/或远近拍击动作。
如图1中的(b)图所示,悬空交互动作是指在掌部交互设备5的上方悬空区域6内触发的交互动作。也就是说,上述悬空交互动作实现了在不触摸屏幕或按键的情况下对掌部交互设备5的控制操作。
可选地,偏移挥扫动作包括从左往右挥扫、从右往左挥扫、从上往下挥扫、从下往上挥扫、斜向挥扫中的至少一种,但不限于此,本申请实施例对此不作具体限定。比如,2个接近传感器以上下的位置分布,偏移挥扫动作包括从上往下挥扫和/或从下往上挥扫;2个接近传感器以左右的位置分布,偏移挥扫动作包括从左往右挥扫和/或从右往左挥扫;2个接近传感器以左上、右下的位置分布,偏移挥扫动作包括从左往右挥扫、从右往左挥扫、从上往下挥扫、从下往上挥扫、从左上往右下挥扫、从右下往左上挥扫中的至少一种;2个接近传感 器以左下、右上的位置分布,偏移挥扫动作包括从左往右挥扫、从右往左挥扫、从上往下挥扫、从下往上挥扫、从左下往右上挥扫、从右上往左下挥扫中的至少一种。
又如,3个接近传感器以品字形的位置分布,偏移挥扫动作包括从左往右挥扫、从右往左挥扫、从上往下挥扫、从下往上挥扫中的至少一种;3个接近传感器以左上、左下、右上的位置分布,或者以左下、右上、右下的位置分布,偏移挥扫动作包括从左往右挥扫、从右往左挥扫、从上往下挥扫、从下往上挥扫、从左下往右上挥扫、从右上往左下挥扫中的至少一种;3个接近传感器以左上、左下、右下的位置分布,或者以左上、右上、右下的位置分布,偏移挥扫动作包括从左往右挥扫、从右往左挥扫、从上往下挥扫、从下往上挥扫、从左上往右下挥扫、从右下往左上挥扫中的至少一种。
又如,4个接近传感器以左上、左下、右上、右下位置矩形排布,从左往右挥扫、从右往左挥扫、从上往下挥扫、从下往上挥扫、从左上往右下挥扫、从右下往左上挥扫、从左下往右上挥扫、从右上往左下挥扫中的至少一种;或,四个接近传感器以上、下、左、右位置菱形排布,偏移挥扫动作包括从上往下挥扫、从下往上挥扫、从左往右挥扫、从右往左挥扫中的至少一种。
可选地,远近拍击动作是指掌部朝向掌部交互设备5接近或远离的隔空动作。
示例性地,掌部交互设备5检测到偏移挥扫动作,执行偏移挥扫动作的第一响应操作。掌部交互设备5检测到远近拍击动作,执行远近拍击动作的第二响应操作。
可选地,掌部交互设备5通过传感器数据中偏移挥扫动作的动作参数值,确定出偏移挥扫动作的挥扫方向;基于挥扫方向,执行第一响应操作。示例性地,第一响应操作包括切换操作;掌部交互设备5基于偏移挥扫动作的动作参数值,确定出偏移挥扫动作的挥扫方向;掌部交互设备5基于挥扫方向执行切换操作。
示例性地,挥扫方向的确定步骤包括:响应于掌部进入第一位置区域,将掌部在第一位置区域的位置确定为起点;响应于掌部从第一位置区域移入第二位置区域,将掌部在第二位置区域的位置确定为终点;在掌部从起点移动至终点的时间小于第一时间阈值的情况下,将起点指向终点的方向确定为挥扫方向。
也即,在掌部进入第一位置区域时,将掌部在第一位置区域的位置确定为起点;在掌部进入第二位置区域时,将掌部在第二位置区域的位置确定为终点;在掌部从起点移动至终点的时间小于第一时间阈值的情况下,将起点指向终点的方向确定为偏移挥扫动作对应的挥扫方向。
在光圈之上的悬空范围内,确定与光圈所在平面之间距离大于第一距离且小于第二距离的范围为第一测量区域,第一测量区域又被划分为不同侧的接近传感器的第一测量区域,比如左上侧的接近传感器101的第一测量区域,左下侧的接近传感器102的第一测量区域,右上侧的接近传感器103的第一测量区域,右下侧的接近传感器104的第一测量区域。上述第一距离小于第二距离。
其中,第一位置区域包括第一侧的接近传感器1的第一测量区域,但不包括第二侧的接近传感器1的第一测量区域;第二位置区域包括第二侧的接近传感器1的第一测量区域,但不包括第一侧的接近传感器1的第一测量区域。这里的第一侧是第二侧的相反一侧。
例如,在掌部位于左侧的左上接近传感器101、左下接近传感器102的第一测量区域内,且不位于右侧的右上接近传感器103、右下接近传感器104的第一测量区域内时,将掌部在左侧的位置确定为起点。
在掌部位于右侧的右上接近传感器103、右下接近传感器104的第一测量区域内,且不位于左侧的左上接近传感器101、左下接近传感器102的第一测量区域内时,将掌部在右侧的位置确定为终点。
在掌部在5s内从起点移动至终点时,确定出偏移挥扫动作对应的切换方向为从左往右,即,切换操作为从左往右切换。
第一测量区域是指接近传感器1能够测量目标物品的有效测量区域。
第二测量区域是指接近传感器1能够测量目标物品的近距测量区域。
例如,设置近距测量区域为H1,比如H1为0-3cm;设置有效测量区域为H2,比如H2为3-15cm,也即第一距离为3cm,第二距离为15cm(厘米)。
可选地,为了保证测量过程中掌部与掌部交互设备5保持平齐,在掌部位于左侧的左上接近传感器101、左下接近传感器102的第一测量区域内时,左侧中的左上接近传感器101、左下接近传感器102测得的距离测量值之间的差值小于预设值;在掌部位于右侧的右上接近传感器103、右下接近传感器104的右测量区域内时,右侧中的右上接近传感器103、右下接近传感器104测得的距离测量值之间的差值小于预设值。
示例性地,掌部交互设备通过传感器数据中远近拍击动作的动作参数值,确定出远近拍击动作的操作类别;基于操作类别,执行第二响应操作。
可选地,上述第二响应操作包括按击操作,比如是单击操作、双击操作等按击操作,可以用作以下至少一项:选择操作、确认操作、返回操作、展开(比如页面展开、列表展开、详情展开等)操作。响应于掌部进入至少两个接近传感器的第一测量区域内,掌部交互设备将掌部进入第一测量区域的时间点确定为第一起始时间点;响应于至少两个第一接近传感器测得的距离测量值在第一时间段内同时减小、且在第二时间段内同时增大或保持不变,确定出远近拍击动作的操作类别为按击操作,其中,第一接近传感器包含于至少两个接近传感器中,第一时间段是以第一起始时间点为始的时间段,第二时间段是以第一时间段的结束时间点为始的时间段,第一时间段大于第二时间段,也即第一时间段的时长大于第二时间段的时长。
可选地,上述第二响应操作还可以包括回退操作,比如是回退至桌面、回退至上一页面等操作。响应于掌部进入至少两个接近传感器的第一测量区域内,将掌部进入第一测量区域的时间点确定为第二起始时间点;响应于至少两个第二接近传感器测得的距离测量值在第一时间段内同时增大,确定出远近拍击动作的操作类别为回退操作,其中,第二接近传感器包含于至少两个接近传感器中,第一时间段是以第二起始时间点为始的时间段。可选地,响应于至少两个第二接近传感器测得的距离测量值在第一时间段内同时增大、且在第二时间段内同时减小或保持不变,确定出远近拍击动作的操作类别为回退操作,其中,第二时间段是以第一时间段的结束时间点为始的时间段,第一时间段大于第二时间段,也即第一时间段的时长大于第二时间段的时长。
比如,第二响应操作包括选择操作;掌部交互设备5基于远近拍击动作的动作参数值,确定出远近拍击动作所属的选择操作的类别;掌部交互设备5基于选择操作的类别执行选择操作。
示例性地,选择操作可以包括选择的确定操作和退出操作两类。例如,在对掌部交互设备5进行系统设置的场景下,操作者可通过远近拍击动作实现对系统界面选择的确认操作或退出操作。在掌部同时进入至少两个接近传感器1的第一测量区域内时,将掌部进入第一测量区域的时间点确定为第一起始时间点T1;在第一起始时间点T1后的第一时间段T2内,接近传感器1测得的距离测量值同时减小,且在第一时间段T2后的第二时间段T3内,接近传感器1测得的距离测量值同时增大或保持不变的情况下,则确定出远近拍击动作对应的选择操作为确定操作,第一时间段大于第二时间段,T1、T2、T3为正数。
例如,在对掌部交互设备5进行系统设置的场景下,操作者可通过远近拍击动作实现对系统界面的选择操作或退出操作。在掌部同时进入至少两个接近传感器1的第一测量区域内时,确定第二起始时间点t1;在第二起始时间点t1后的第一时间段t2内,在至少两个接近传感器1测得的距离测量值同时增大的情况下,确定出远近拍击动作对应的选择操作为退出操作。可选地,在掌部进入接近传感器1的第一测量区域内时,确定第二起始时间点t1;在第二起始时间点t1后的第一时间段t2内,接近传感器1测得的距离测量值同时增大,且在第一 时间段后的第二时间段t3内,接近传感器1测得的距离测量值同时减小或保持不变的情况下,确定出远近拍击动作对应的选择操作为退出操作,t1、t2、t3为正数。
示例性地,掌部交互设备5通过摄像头2获取掌部的掌部图像;掌部交互设备5基于掌部图像确定掌部图像对应的对象标识;在确定出对象标识且掌部在接近传感器1的第二测量区域停留时间大于停留时间阈值的情况下,进入交互模式。
示例性地,掌部交互设备5响应于偏移挥扫动作,在显示屏3上显示偏移挥扫动作的第一响应操作;掌部交互设备5响应于远近拍击动作,在显示屏3上显示远近拍击动作的第二响应操作。
综上所述,本实施例提供的方法,掌部交互设备上阵列设置有至少两个接近传感器,通过上述至少两个接近传感器采集掌部的传感器数据,可以基于传感器数据识别出掌部的悬空交互动作,执行悬空交互动作的响应操作,无需触摸或任何物理性地接触便可实现用户对掌部交互设备的控制,提供了一种用户与设备之间的新交互方式,可以提高用户与设备之间的交互效率。
图2示出了本申请一个实施例提供的计算机系统的架构示意图。该计算机系统可以包括：掌部交互设备100和服务器200。
掌部交互设备100可以是诸如手机、平板电脑、车载终端（车机）、可穿戴设备、个人计算机（Personal Computer，PC）、掌部图像识别语音交互设备、掌部图像识别家电、飞行器、无人售货终端等电子设备。掌部交互设备100中可以安装运行目标应用程序的客户端，该目标应用程序可以是参考基于掌部交互的应用程序，也可以是提供有基于掌部交互功能的其他应用程序，本申请对此不作限定。另外，本申请对该目标应用程序的形式不作限定，包括但不限于安装在掌部交互设备100中的应用程序（Application，App）、小程序等，还可以是网页形式。
服务器200可以是独立的物理服务器,也可以是多个物理服务器构成的服务器集群或者分布式系统,还可以是提供云计算服务的云服务器、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、内容分发网络(Content Delivery Network,CDN)、以及大数据和人工掌部图像识别平台等基础云计算服务的云服务器。服务器200可以是上述目标应用程序的后台服务器,用于为目标应用程序的客户端提供后台服务。
在一些实施例中,上述服务器还可以实现为区块链系统中的节点。区块链(Blockchain)是分布式数据存储、点对点传输、共识机制、加密算法等计算机技术的新型应用模式。区块链,本质上是一个去中心化的数据库,是一串使用密码学方法相关联产生的数据块,每一个数据块中包含了一批次网络交易的信息,用于验证其信息的有效性(防伪)和生成下一个区块。区块链可以包括区块链底层平台、平台产品服务层以及应用服务层。
掌部交互设备100和服务器200之间可以通过网络进行通信,如有线或无线网络。
本申请实施例提供的基于掌部的人机交互方法,各步骤的执行主体可以是掌部交互设备,所述掌部交互设备是指具备数据计算、处理和存储能力的电子设备。以图2所示的方案实施环境为例,可以由掌部交互设备100执行基于掌部的人机交互方法(如掌部交互设备100中安装运行的目标应用程序的客户端执行基于掌部的人机交互方法),也可以由服务器200执行该基于掌部的人机交互方法,或者由掌部交互设备100和服务器200交互配合执行,本申请对此不作限定。
图3是本申请一个示例性实施例提供的基于掌部的人机交互方法的流程图。该方法用于具有接近传感器的掌部交互设备,也即该方法可以由上述掌部交互设备执行。该方法包括:
步骤302:获取至少两个接近传感器采集的掌部的传感器数据。
示例性地,至少两个接近传感器在掌部交互设备上阵列设置。
可选地,在有两个接近传感器的情况下,两个接近传感器以左上、右下的位置排布;或 者,两个接近传感器以左下、右上的位置排布;或者,两个接近传感器以上下的位置排布;或者,两个接近传感器以左右的位置排布。
可选地,在有三个接近传感器的情况下,三个接近传感器以品字形的位置排布;三个接近传感器以左上、左下、右上的位置排布;或者,三个接近传感器以左上、右上、右下的位置排布;或者,三个接近传感器以左上、左下、右下的位置排布;或者,三个接近传感器以左下、右上、右下的位置排布。
可选地,在有四个接近传感器的情况下,四个接近传感器以左上、左下、右上、右下位置矩形排布;或,四个接近传感器以上、下、左、右位置菱形排布,但不限于此,本申请实施例对此不作具体限定。
可选地,在有两个接近传感器的情况下,两个接近传感器对称设置。
接近传感器是一种非接触式传感器,能够感知物体是否接近,和/或,测量物体的距离。
步骤304:通过至少两个接近传感器的掌部的传感器数据,识别掌部的悬空交互动作。
悬空交互动作是指在掌部交互设备的上方悬空区域内触发的交互动作。比如,掌部交互设备检测到掌部所在的位置与至少两个接近传感器所在平面的距离小于交互距离阈值,且掌部在上述位置上停留的时长大于时长阈值,确定触发了悬空交互动作。也就是说,上述悬空交互动作实现了在不触摸屏幕或按键的情况下对掌部交互设备的控制操作。
示例性地,悬空交互动作包括偏移挥扫动作和远近拍击动作中的至少一种。
可选地,偏移挥扫动作是指在同一平面进行的挥扫性的动作。偏移挥扫动作包括从左往右挥扫、从右往左挥扫、从上往下挥扫、从下往上挥扫、斜向挥扫(比如从左上往右下的方向上挥扫、和/或从左下往右上的方向上挥扫、和/或从右上往左下的方向上挥扫、和/或从右下往左上的方向上挥扫)中的至少一种,但不限于此,本申请实施例对此不作具体限定。
可选地,远近拍击动作是指掌部朝向掌部交互设备接近或远离的隔空动作。
示例性的,掌部交互设备基于至少两个接近传感器的传感器数据,确定掌部向远离第一侧接近传感器的方向移动、向靠近第二侧接近传感器的方向移动,且在移动过程中掌部与至少两个接近传感器所在平面间距离的变化值小于变化阈值,则识别出掌部的悬空交互动作为偏移挥扫动作。其中,第二侧是第一侧相反的一侧;比如第一侧为左侧,第二侧为右侧,或者第一侧为右侧,第二侧为左侧;又如第一侧为上侧,第二侧为下侧,或者第一侧为下侧,第二侧为上侧;又如第一侧为左上侧,第二侧为右下侧,或者第一侧为右下侧,第二侧为左上侧;又如第一侧为左下侧,第二侧为右上侧,或者第一侧为右上侧,第二侧为左下侧。
比如,在有四个接近传感器、且四个接近传感器以左上、左下、右上、右下位置矩形排布的情况下,在左上和/或左下接近传感器的第一距离测量值逐渐增大的情况下,确定掌部向远离左侧的方向移动;在右上和/或右下接近传感器的第二距离测量值逐渐减小的情况下,确定掌部向靠近右侧的方向移动;计算动作的初始时刻上第一距离测量值与结束时刻上第二距离测量值之间的变化值;在掌部向远离左侧的方向移动、向靠近右侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由左向右的偏移挥扫动作。
在左上和/或左下接近传感器的第一距离测量值逐渐减小的情况下,确定掌部向靠近左侧的方向移动;在右上和/或右下接近传感器的第二距离测量值逐渐增大的情况下,确定掌部向远离右侧的方向移动;计算动作的初始时刻上第二距离测量值与结束时刻上第一距离测量值之间的变化值;在掌部向靠近左侧的方向移动、向远离右侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由右向左的偏移挥扫动作。
在左上和/或右上接近传感器的第三距离测量值逐渐增大的情况下,确定掌部向远离上侧的方向移动;在左下和/或右下接近传感器的第四距离测量值逐渐减小的情况下,确定掌部向靠近下侧的方向移动;计算动作的初始时刻上第三距离测量值与结束时刻上第四距离测量值之间的变化值;在掌部向远离上侧的方向移动、向靠近下侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由上向下的偏移挥扫动作。
在左上和/或右上接近传感器的第三距离测量值逐渐减小的情况下,确定掌部向靠近上侧的方向移动;在左下和/或右下接近传感器的第四距离测量值逐渐增大的情况下,确定掌部向远离下侧的方向移动;计算动作的初始时刻上第四距离测量值与结束时刻上第三距离测量值之间的变化值;在掌部向靠近上侧的方向移动、向远离下侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由下向上的偏移挥扫动作。
在左上接近传感器的第五距离测量值逐渐增大的情况下,确定掌部向远离左上侧的方向移动;在右下接近传感器的第六距离测量值逐渐减小的情况下,确定掌部向靠近右下侧的方向移动;计算动作的初始时刻上第五距离测量值与结束时刻上第六距离测量值之间的变化值;在掌部向远离左上侧的方向移动、向靠近右下侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由左上向右下的偏移挥扫动作。
在左上接近传感器的第五距离测量值逐渐减小的情况下,确定掌部向靠近左上侧的方向移动;在右下接近传感器的第六距离测量值逐渐增大的情况下,确定掌部向远离右下侧的方向移动;计算动作的初始时刻上第六距离测量值与结束时刻上第五距离测量值之间的变化值;在掌部向靠近左上侧的方向移动、向远离右下侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由右下向左上的偏移挥扫动作。
在右上接近传感器的第七距离测量值逐渐增大的情况下,确定掌部向远离右上侧的方向移动;在左下接近传感器的第八距离测量值逐渐减小的情况下,确定掌部向靠近左下侧的方向移动;计算动作的初始时刻上第七距离测量值与结束时刻上第八距离测量值之间的变化值;在掌部向远离右上侧的方向移动、向靠近左下侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由右上向左下的偏移挥扫动作。
在右上接近传感器的第七距离测量值逐渐减小的情况下,确定掌部向靠近右上侧的方向移动;在左下接近传感器的第八距离测量值逐渐增大的情况下,确定掌部向远离左下侧的方向移动;计算动作的初始时刻上第八距离测量值与结束时刻上第七距离测量值之间的变化值;在掌部向靠近右上侧的方向移动、向远离左下侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由左下向右上的偏移挥扫动作。
比如,在有三个接近传感器、且三个接近传感器以左上、左下、右上的位置排布的情况下,在左上和/或左下接近传感器的第一距离测量值逐渐增大的情况下,确定掌部向远离左侧的方向移动;在右上接近传感器的第二距离测量值逐渐减小的情况下,确定掌部向靠近右侧的方向移动;计算动作的初始时刻上第一距离测量值与结束时刻上第二距离测量值之间的变化值;在掌部向远离左侧的方向移动、向靠近右侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由左向右的偏移挥扫动作。
比如,在有两个接近传感器、且两个接近传感器以左上、右下的位置排布的情况下,在左上接近传感器的第一距离测量值逐渐增大的情况下,确定掌部向远离左侧的方向移动;在右下接近传感器的第二距离测量值逐渐减小的情况下,确定掌部向靠近右侧的方向移动;计算动作的初始时刻上第一距离测量值与结束时刻上第二距离测量值之间的变化值;在掌部向远离左侧的方向移动、向靠近右侧的方向移动、且变化值小于变化阈值的情况下,确定悬空交互动作为由左向右的偏移挥扫动作。
示例性地,掌部交互设备响应于至少两个接近传感器的距离测量值在第一时间段内逐渐增大或减小、且距离测量值均小于有效距离阈值,则确定悬空交互动作为远近拍击动作。比如,响应于至少两个第一接近传感器的距离测量值在第一时间段内同时减小、且在第二时间段内同时增大或保持不变,确定出远近拍击动作为按击操作,第一接近传感器包含于至少两个接近传感器中;响应于至少两个第二接近传感器测得的距离测量值在第一时间段内同时增大、且在第二时间段内同时减小或保持不变,确定出远近拍击动作为回退操作,第二接近传感器包含于至少两个接近传感器中;其中,第一时间段是以第一起始时间点为始的时间段,第二时间段是以第一时间段的结束时间点为始的时间段,第一时间段大于第二时间段。
步骤306:执行悬空交互动作的响应操作。
示例性地,在掌部交互设备中的接近传感器识别出掌部的悬空交互动作后,掌部交互设备执行该悬空交互动作的响应操作。其中,悬空交互动作的响应操作是指设备响应悬空交互动作的操作。在掌部交互设备中,不同的悬空交互动作对应有不同的响应操作,比如悬空交互操作是偏移挥扫动作,则响应操作可以是页面切换的操作,可以是页面滑动的操作,还可以是滑动删除/选中的操作等;又如悬空交互操作是远近拍击动作,则响应操作可以是确认操作,也可以是退回操作等。掌部交互设备与响应操作之间的对应关系可以是系统默认设置的或者用户自定义的。
例如,在对掌部交互设备进行系统设置的情况下,在掌部交互设备中的接近传感器识别出掌部是从左往右挥扫时,掌部交互设备根据该从左往右挥扫动作,执行翻页操作,即切换显示左侧的界面。
综上所述,本实施例提供的方法,通过掌部交互设备上阵列设置的至少两个接近传感器采集传感器数据,并基于传感器数据识别掌部的悬空交互动作,根据悬空交互动作,控制掌部交互设备执行悬空交互动作对应的响应操作。本申请提供了一种新的交互方式,通过接近传感器识别掌部的悬空交互操作,从而无需触摸或任何物理性地接触便可实现用户对掌部交互设备的控制,提高了交互效率。
图4是本申请一个示例性实施例提供的基于掌部的人机交互方法的流程图。该方法用于具有接近传感器的掌部交互设备,该方法可以由掌部交互设备执行。该方法包括:
步骤402:获取至少两个接近传感器采集的掌部的传感器数据。
示例性地,至少两个接近传感器在掌部交互设备上阵列设置。
步骤404:通过至少两个接近传感器采集的掌部的传感器数据,识别掌部的悬空交互动作。
悬空交互动作是指在掌部交互设备的上方悬空区域内触发的交互动作。也就是说,上述悬空交互动作实现了在不触摸屏幕或按键的情况下对掌部交互设备的控制操作。
示例性地,掌部交互设备在显示屏上显示偏移挥扫动作的第一响应操作;掌部交互设备在显示屏上显示远近拍击动作的第二响应操作。
如图5所示出的掌部交互设备的示意图。掌部交互设备包括四个接近传感器501、摄像头502、显示屏503和光圈504。显示屏503设置于掌部交互设备的左侧,但不限于此,本申请实施例对显示屏503的位置、大小不作具体限定。
掌部交互设备通过接近传感器501识别掌部的悬空交互动作,执行悬空交互动作对应的响应操作。
光圈504用于辅助实现基于掌部的人机交互方法。例如,通过光圈504的范围确定交互的范围,操作者以光圈504的圆心为中心,在光圈504范围内或光圈504以外的第一范围内进行悬空交互动作;或,在悬空交互时,通过光圈504的颜色或亮度变化,反馈出掌部与掌部交互设备之间的交互结果。比如,在掌部与掌部交互设备之间交互成功后,光圈504闪一下;在掌部与掌部交互设备之间交互失败后,光圈504连续闪动。
步骤406:执行偏移挥扫动作的第一响应操作。
偏移挥扫动作是指在同一平面进行的挥扫性的动作。
偏移挥扫动作包括从左往右挥扫、从右往左挥扫、从上往下挥扫、从下往上挥扫、斜向挥扫中的至少一种,但不限于此,本申请实施例对此不作具体限定。
示例性地,掌部交互设备执行偏移挥扫动作的第一响应操作。
示例性地,第一响应操作包括切换操作、翻页操作、平移操作、调节显示比例操作中的至少一种,但不限于此,本申请实施例对此不作具体限定。
可选地,第一响应操作包括切换操作;掌部交互设备基于偏移挥扫动作的动作参数值,确定出偏移挥扫动作对应的挥扫方向;掌部交互设备基于挥扫方向执行切换操作。
可选地,在掌部进入第一位置区域时,将掌部在第一位置区域的位置确定为起点;在掌部进入第二位置区域时,将掌部在第二位置区域的位置确定为终点;在掌部从起点移动至终点的时间小于第一时间阈值的情况下,将起点指向终点的方向确定为偏移挥扫动作的挥扫方向。
其中,第一位置区域包括第一侧的接近传感器的第一测量区域,但不包括第二侧的接近传感器的第一测量区域;第二位置区域包括第二侧的接近传感器的第一测量区域,但不包括第一侧的接近传感器的第一测量区域。
第一测量区域是指接近传感器能够测量目标物品的有效测量区域。
第二测量区域是指接近传感器能够测量目标物品的近距测量区域。
例如,设置近距测量区域为H1,比如,H1为远离接近传感器的0-3cm内的区域;设置有效测量区域为H2,比如,H2为远离接近传感器的3-15cm内的区域。
例如,在掌部位于左侧的接近传感器的第一测量区域内,且不位于右侧的接近传感器的第一测量区域内时,将掌部在左侧的位置确定为起点。在掌部位于右侧的接近传感器的第一测量区域内,且不位于左侧的接近传感器的第一测量区域内时,将掌部在右侧的位置确定为终点。在掌部在5s内从起点移动至终点时,确定出偏移挥扫动作的切换方向为从左往右,即,切换操作为从左往右切换。掌部交互设备基于偏移挥扫动作的切换方向,执行从左往右切换的切换操作。
可选地,为了保证测量过程中掌部与掌部交互设备保持平齐,在掌部位于第一侧的接近传感器的第一测量区域内时,第一侧中的至少两个接近传感器测得的距离测量值之间的差值小于预设值;在掌部位于第二侧的接近传感器的右测量区域内时,第二侧中的至少两个接近传感器测得的距离测量值之间的差值小于预设值。
示例性地,如图6所示出的偏移挥扫动作对应的切换操作的示意图。在掌部交互设备中设置有四个接近传感器601和显示屏602,四个接近传感器601以左上、左下、右上、右下位置矩形排布。在对掌部交互设备进行系统设置的场景下,在掌部同时位于右上、右下两个接近传感器601的第一测量区域内,且不位于左上、左下两个接近传感器601的第一测量区域内时,确定当前时刻的掌部所在的位置为起点。在掌部同时位于左上、左下两个接近传感器601的第一测量区域内,且不位于右上、右下两个接近传感器601的第一测量区域内时,确定当前时刻的掌部所在位置为终点。在掌部在5s内从起点移动至终点时,确定出偏移挥扫动作对应的切换方向为从右往左,即,切换操作为从右往左切换。掌部交互设备基于偏移挥扫动作的切换方向,在显示屏602上显示从右往左切换系统界面,即,从“你好!管理员”界面切换至“设置!应用设置”界面。
在掌部同时位于左上、左下两个接近传感器601的第一测量区域内,且位于右上、右下两个接近传感器601的第一测量区域内的情况下,此时,接近传感器601测得的数据为无效数据。
在掌部从第一位置移动至第二位置的时间超过第一时间阈值(5s)的情况下,此时,接近传感器601在掌部在第一位置、第二位置测得的数据为无效数据。
可选地,第一响应操作包括平移操作。示例性地,如图7所示出的偏移挥扫动作对应的平移操作的示意图。掌部交互设备包括四个接近传感器701、摄像头702、显示屏703和光圈704。显示屏703设置于掌部交互设备的左侧,但不限于此,本申请实施例对显示屏703的位置、大小不作具体限定。掌部交互设备在基于摄像头702获取掌部图像,并确定掌部图像对应的对象标识后,在显示屏703中显示掌部对应的掌部标识705,掌部标识705在显示屏703中的移动轨迹用以表示掌部在掌部交互设备外的悬空区域的偏移挥扫动作。
例如,掌部在掌部交互设备外的悬空区域从左往右挥扫时,掌部标识705在显示屏703中同样从左往右移动,使得用户能够实时知道掌部相对于掌部交互设备的位置。
可选地,第一响应操作包括调节显示比例操作。示例性地,如图8所示出的偏移挥扫动 作对应的调节显示比例操作示意图。显示屏上包括滑动控件,滑动控件包括滑轨802和滑标801。掌部交互设备在基于摄像头获取掌部图像,并确定掌部图像对应的对象标识后,在显示屏中显示掌部对应的掌部标识803,掌部相对于掌部交互设备的位置发生变化时,在显示屏中的掌部标识803相对发生移动。在掌部标识803移动至滑标801位置时,滑标801所在的局部区域的显示比例从第一比例放大至第二比例,第二比例大于第一比例。通过移动掌部标识803,掌部标识803控制滑标801在滑轨802上移动,从而实现环境参数的调节显示比例操作。
例如,在掌部标识803移动至滑标801位置时,滑标801所在的局部区域的显示比例放大1.5倍,通过移动掌部标识803,掌部标识803控制滑标801在滑轨802上移动,掌部在掌部交互设备外的悬空区域从左往右挥扫时,环境参数对应的滑轨802上的滑标801同样从左往右调节。
步骤408:执行远近拍击动作的第二响应操作。
示例性地,第二响应操作包括确定操作、退出操作、数量增加操作、数量减少操作中的至少一种,但不限于此,本申请实施例对此不作具体限定。
示例性地,第二响应操作包括选择操作。掌部交互设备基于远近拍击动作的动作参数值,确定出远近拍击动作所属的选择操作的类别;基于选择操作的类别执行选择操作。
可选地,选择操作包括确定操作、退出操作中的至少一种,但不限于此。
示例性地,在掌部同时进入至少两个接近传感器的第一测量区域内时,将掌部进入第一测量区域的时间点确定为第一起始时间点;在第一起始时间点后的第一时间段内,至少两个接近传感器测得的距离测量值同时减小,且在第一时间段后的第二时间段内,至少两个接近传感器测得的距离测量值同时增大或保持不变的情况下,确定出远近拍击动作对应的选择操作为确定操作,第一时间段大于第二时间段。
例如,在对掌部交互设备进行系统设置的场景下,操作者可通过远近拍击动作实现对系统界面的选择操作。在掌部同时进入至少两个接近传感器的第一测量区域内时,确定第一起始时间点T1;在第一起始时间点T1后的第一时间段T2内,至少两个接近传感器测得的距离测量值同时减小,且在第一时间段T2后的第二时间段T3内,至少两个接近传感器测得的距离测量值同时增大或保持不变的情况下,则,确定出远近拍击动作对应的选择操作为确定操作。
示例性地,如图9所示出的远近拍击动作对应的确定操作示意图。在掌部交互设备903中设置有四个接近传感器901和显示屏902,四个接近传感器901以左上、左下、右上、右下位置矩形排布。在对掌部交互设备903进行系统设置的场景下,在掌部同时进入至少三个接近传感器901的第一测量区域内时,确定第一起始时间点T1;在第一起始时间点T1后的第一时间段T2内,至少三个接近传感器901测得的距离测量值同时减小,且在第一时间段T2后的第二时间段T3内,至少三个接近传感器901测得的距离测量值同时增大或保持不变的情况下,即,在时间段T1+T2内,掌部向接近掌部交互设备903的方向移动,在时间段(T1+T2)至(T1+T2+T3)内,掌部向远离掌部交互设备903的方向移动或保持不动,则,掌部交互设备确定出远近拍击动作对应的选择操作为确定操作,也可称为隔空按击操作。
示例性地,在掌部同时进入至少两个接近传感器的第一测量区域内时,将掌部同时进入第一测量区域的时间确定为第二起始时间点;在第二起始时间点后的第一时间段内,至少两个接近传感器测得的距离测量值同时增大的情况下,确定出远近拍击动作对应的选择操作为退出操作。
可选地,在掌部同时进入至少两个接近传感器的第一测量区域内时,将掌部同时进入第一测量区域的时间确定为第二起始时间点;在第二起始时间点后的第一时间段内,至少两个接近传感器测得的距离测量值同时增大,且在第一时间段后的第二时间段内,至少两个接近传感器测得的距离测量值同时减小或保持不变的情况下,确定出远近拍击动作对应的选择操 作为退出操作。
例如,在对掌部交互设备进行系统设置的场景下,操作者可通过远近拍击动作实现对系统界面的退出操作。在掌部同时进入至少两个接近传感器的第一测量区域内时,确定第二起始时间点t1;在第二起始时间点t1后的第一时间段t2内,在至少两个接近传感器测得的距离测量值同时增大的情况下,确定出远近拍击动作对应的选择操作为退出操作。
可选地,在掌部同时进入至少两个接近传感器的第一测量区域内时,确定第二起始时间点t1;在第二起始时间点t1后的第一时间段t2内,至少两个接近传感器测得的距离测量值同时增大,且在第一时间段后的第二时间段t3内,至少两个接近传感器测得的距离测量值同时减小或保持不变的情况下,确定出远近拍击动作对应的选择操作为退出操作。
示例性地,如图10所示出的远近拍击动作对应的退出操作的示意图。在掌部交互设备中设置有四个接近传感器1001和显示屏1002,四个接近传感器1001以左上、左下、右上、右下位置矩形排布。在对掌部交互设备1003进行系统设置的场景下,在掌部同时位于至少三个接近传感器1001的第一测量区域内的前半部分(靠近掌部交互设备部分)时,确定第二起始时间点t1;在第二起始时间点t1后的第一时间段t2内,至少三个接近传感器1001测得的距离测量值同时增大的情况下,即,在时间段t1+t2内,掌部向远离掌部交互设备1003的方向移动,则,掌部交互设备确定出远近拍击动作对应的选择操作为退出操作,也可称为隔空回退操作。
进一步的,在掌部同时位于至少三个接近传感器1001的第一测量区域内的前半部分时,确定第二起始时间点t1;在第二起始时间点t1后的第一时间段t2内,至少三个接近传感器1001测得的距离测量值同时增大,且在第一时间段t2后的第二时间段t3内,至少三个接近传感器1001测得的距离测量值同时减小或保持不变的情况下,即,在时间段t1+t2内,掌部向远离掌部交互设备1003的方向移动,在时间段(t1+t2)至(t1+t2+t3)内,掌部向接近掌部交互设备的方向移动或保持不动,则,掌部交互设备确定出远近拍击动作对应的选择操作为退出操作,也可称为隔空回退操作。
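结合上文对按击（确定）操作与回退（退出）操作的判定条件，下面给出一个示意性的分类草图（Python伪代码，单调性判断方式与抖动容差均为假设的简化处理，函数名亦为假设，并非本申请限定的实现方式）：

```python
# 示意性草图：按上文的判定条件，对第一时间段、第二时间段内各接近传感器的
# 距离测量值序列进行“按击（确定）/回退（退出）”分类。

def trend(values, eps=0.2):
    """返回 'down'（同时减小）、'up'（同时增大）或 'flat'（保持不变），eps 为假设的抖动容差。"""
    delta = values[-1] - values[0]
    if delta < -eps:
        return 'down'
    if delta > eps:
        return 'up'
    return 'flat'

def classify_tap(per_sensor_t2, per_sensor_t3):
    """per_sensor_t2 / per_sensor_t3：每个接近传感器在第一时间段、第二时间段内的距离测量值序列。"""
    t2_trends = {trend(v) for v in per_sensor_t2}
    t3_trends = {trend(v) for v in per_sensor_t3}
    # 第一时间段内同时减小，第二时间段内同时增大或保持不变 -> 按击（确定）
    if t2_trends == {'down'} and t3_trends <= {'up', 'flat'}:
        return 'press'
    # 第一时间段内同时增大，第二时间段内同时减小或保持不变 -> 回退（退出）
    if t2_trends == {'up'} and t3_trends <= {'down', 'flat'}:
        return 'back'
    return None
```

在点菜等场景中，同样的判定逻辑也可以映射为份数的增加与减少等第二响应操作。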
可选地,第二响应操作包括数量增、减操作。示例性地,如图11所示出的远近拍击动作对应的数量增、减操作的示意图。在选择点菜场景中,在显示屏1103上显示有菜品标识1102。掌部交互设备1104在基于摄像头获取掌部图像,并确定掌部图像对应的对象标识后,在显示屏1103中显示掌部对应的掌部标识1101,掌部相对于掌部交互设备1104的位置发生变化时,在显示屏中的掌部标识1101相对发生移动。在掌部标识1101移动至菜品标识1102位置且在菜品标识1102位置停留3s时,选中该菜品标识1102,通过远近拍击动作实现对选中菜品标识1102的份数的增、减。
例如,在掌部同时进入至少两个接近传感器的第一测量区域内时,确定第一起始时间点T1;在第一起始时间点T1后的第一时间段T2内,至少两个接近传感器测得的距离测量值同时减小,且在第一时间段T2后的第二时间段T3内,至少两个接近传感器测得的距离测量值同时增大或保持不变的情况下,则,确定出远近拍击动作对应的选择操作为数量增加操作。
在掌部同时进入至少两个接近传感器的第一测量区域内时,确定第二起始时间点t1;在第二起始时间点t1后的第一时间段t2内,至少两个接近传感器测得的距离测量值同时增大,且在第一时间段后的第二时间段t3内,至少两个接近传感器测得的距离测量值同时减小或保持不变的情况下,确定出远近拍击动作对应的选择操作为数量减少操作。
在一种可能的实现方式中,在获取至少两个接近传感器采集的传感器数据之前,还包括:掌部交互设备通过摄像头获取掌部的掌部图像;基于掌部图像确定掌部的对象标识;在确定对象标识且掌部在接近传感器的第二测量区域停留时间大于停留时间阈值的情况下,进入交互模式。
示例性地，如图12所示出的掌部交互设备进入交互模式的示意图。在掌部交互设备中设置有四个接近传感器1201、显示屏1202和摄像头1203，四个接近传感器1201以左上、左下、右上、右下位置矩形排布。掌部交互设备通过摄像头1203获取掌部的掌部图像；掌部交互设备基于掌部图像确定掌部的对象标识；在确定对象标识且掌部在接近传感器1201的第二测量区域停留时间大于停留时间阈值的情况下，进入交互模式，在显示屏1202上显示设置界面。
例如,设置停留时间阈值为15s,在掌部交互设备通过摄像头1203确定对象标识,且掌部在距接近传感器1201的3cm的区域内的停留时间大于15s的情况下,掌部交互设备进入交互模式,在显示屏1202上显示设置界面。
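结合上述进入交互模式的条件，下面给出一个示意性的草图（Python伪代码，其中3cm与15s为上文示例中的取值，类名与函数名均为假设，并非本申请限定的实现方式）：

```python
# 示意性草图：对象标识已确定、且掌部在第二测量区域（近距测量区域）内的停留时间
# 超过停留时间阈值时进入交互模式。
H1_MAX = 3.0              # 假设的第二测量区域上界（cm）
DWELL_THRESHOLD = 15.0    # 假设的停留时间阈值（秒）

class InteractionGate:
    def __init__(self):
        self.dwell_start = None

    def update(self, object_id, distance, now):
        """object_id：由掌部图像识别出的对象标识（未识别时为 None）；
        distance：接近传感器的距离测量值；now：当前时间戳（秒）。
        返回 True 表示进入交互模式。"""
        if object_id is None or distance > H1_MAX:
            self.dwell_start = None
            return False
        if self.dwell_start is None:
            self.dwell_start = now
        return now - self.dwell_start > DWELL_THRESHOLD
```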
掌部图像为待确定目标对象标识的掌部图像,该掌部图像中包含手掌,该手掌为待验证身份的目标对象的手掌,该掌部图像还可以包含其他的信息,如目标对象的手指、拍摄目标对象手掌时所处的场景等。
示例性地,如图13所示出的手掌中手指缝点的示意图,手指缝点为食指与中指之间的第一手指缝点1301,或,手指缝点为中指与无名指之间的第二手指缝点1302,或,手指缝点为无名指与小指之间的第三手指缝点1303。
由于在掌部图像中的掌部区域可能存在于该掌部图像中的任一区域,为了能够确定掌部区域在该掌部图像中的位置,通过对该掌部图像进行手指缝点检测,从而得到掌部图像的至少一个手指缝点,以便后续能够根据该至少一个手指缝点,确定掌部区域。
示例性地,掌部交互设备对对象的手掌进行拍摄,得到掌部图像。其中,该掌部图像中包含该手掌,该手掌可以为对象的左手掌,也可以为对象的右手掌。例如,该掌部交互设备为物联网设备,该物联网设备通过摄像头拍摄对象的左手掌,得到掌部图像。
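作为一种示意性的草图（Python伪代码，掌部区域的取法仅为假设的几何示意，函数名与比例系数均为假设，并非本申请限定的确定方式），可以根据检测到的手指缝点粗略框定掌部区域：

```python
# 示意性草图：根据手指缝点粗略框定掌部图像中的掌部区域。
# 这里假设以各手指缝点的外接范围为基准，向手掌方向扩展一个与其宽度成比例的矩形。

def palm_region_from_gaps(gap_points, expand_ratio=1.5):
    """gap_points：[(x, y), ...] 形式的手指缝点坐标（图像坐标系，y 向下）。
    返回 (left, top, right, bottom) 形式的掌部区域。"""
    xs = [p[0] for p in gap_points]
    ys = [p[1] for p in gap_points]
    left, right = min(xs), max(xs)
    top = max(ys)                         # 手指缝点下方为手掌
    width = max(right - left, 1)
    bottom = top + width * expand_ratio   # 向手掌方向扩展，expand_ratio 为假设取值
    return (left, top, right, bottom)
```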
示例性地,摄像头包括彩色摄像头和红外摄像头。掌部交互设备通过彩色摄像头获取掌部的彩色图像;通过红外摄像头获取同一掌部的红外图像;掌部交互设备基于彩色图像和红外图像进行掌部的识别处理,确定掌部图像对应的对象标识,也即确定掌部的对象标识。
彩色图像是指彩色相机基于自然光对掌部成像所得到的图像。
红外图像是指红外相机基于红外光对掌部成像所得到的图像。
可选地,掌部交互设备还基于彩色图像和红外图像进行掌部对比识别处理,以确定掌部的对象标识。其中,掌部对比识别处理是指将掌部区域的特征与数据库中的预设掌部特征进行对比识别。
预设掌部特征为存储的对象标识所对应的掌部的掌部特征,每个预设掌部特征具有对应的对象标识,表示该预设掌部特征属于该对象标识,是该对象掌部的掌部特征。该对象标识可以为任意的对象标识,如,该对象标识为支付应用中注册的对象标识,或,该对象标识为企业中登记的对象标识。
在本申请实施例中,掌部交互设备中包括数据库,该数据库中包括多个预设掌部特征,及每个预设掌部特征对应的对象标识。在该数据库中,预设掌部特征与对象标识可以是一一对应,也可以是一个对象标识对应至少两个预设掌部特征。
例如,多个对象在掌部交互设备中进行注册,通过将每个对象的预设掌部特征与对应的对象标识进行绑定,将多个对象的掌部特征与对应的对象标识对应存储于数据库中,后续对象使用掌部交互设备时,通过对红外图像、彩色图像与数据库中的预设掌部特征进行掌部对比识别处理,来确定对象标识,实现对对象的身份验证。
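结合上述掌部对比识别处理的过程，下面给出一个示意性的匹配草图（Python伪代码，其中特征提取与相似度计算均为占位的假设实现，函数名与阈值亦为假设，并非本申请限定的实现方式）：

```python
# 示意性草图：基于彩色图像与红外图像提取掌部特征，并与数据库中的预设掌部特征
# 逐一比对以确定对象标识。

def extract_palm_feature(color_image, ir_image):
    """占位的特征提取：实际实现可基于掌纹、掌静脉等特征，此处仅作示意。"""
    return [float(sum(color_image)) % 97, float(sum(ir_image)) % 97]

def similarity(feat_a, feat_b):
    """占位的相似度：负的欧氏距离平方，值越大越相似。"""
    return -sum((a - b) ** 2 for a, b in zip(feat_a, feat_b))

def match_object_id(color_image, ir_image, preset_features, threshold=-1.0):
    """preset_features：{对象标识: 预设掌部特征} 形式的数据库；未命中时返回 None。"""
    query = extract_palm_feature(color_image, ir_image)
    best_id, best_score = None, float('-inf')
    for object_id, feature in preset_features.items():
        score = similarity(query, feature)
        if score > best_score:
            best_id, best_score = object_id, score
    return best_id if best_score >= threshold else None
```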
综上所述，本实施例提供的方法，通过掌部交互设备上阵列设置的至少两个接近传感器采集传感器数据，并基于传感器数据识别掌部的偏移挥扫动作和远近拍击动作，根据偏移挥扫动作和远近拍击动作，控制掌部交互设备执行切换操作、确定操作、退出操作。本申请提供了一种新的交互方式，通过接近传感器识别掌部的悬空交互操作，从而无需触摸或任何物理性的接触便可实现用户对掌部交互设备的控制，提高了交互效率。
需要说明的是，掌部作为生物特征的一种，具有生物唯一性与区分性。相对于目前被广泛应用于核身、支付、门禁、乘车等领域的面部识别，掌部不会受化妆、口罩、墨镜等影响，可以提高对象身份验证的准确率。在某些场景下，如疫情防控场景下，需要佩戴口罩遮住口鼻，这种情况下使用掌部图像进行身份验证可以作为一种更好的选择。
图14是本申请一个示例性实施例提供的基于掌部的人机交互方法的跨设备悬空交互的示意图。该方法涉及第一掌部交互设备1401、第二掌部交互设备1403及应用服务器1402。
其中，第一掌部交互设备1401安装有应用程序，例如，支付应用，第一掌部交互设备1401基于对象标识登录支付应用，与应用服务器1402建立通信连接，通过该通信连接，第一掌部交互设备1401与应用服务器1402可以进行交互，对象可通过悬空交互动作对第一掌部交互设备1401进行控制，比如，进行支付应用设置；第二掌部交互设备1403同样安装有支付应用，第二掌部交互设备1403基于商户标识登录支付应用，与应用服务器1402建立通信连接，通过该通信连接，第二掌部交互设备1403与应用服务器1402可以进行交互，对象可通过悬空交互动作对第二掌部交互设备1403进行控制，比如，进行支付应用设置。
该跨设备悬空交互流程包括：
1、对象在家中手持第一掌部交互设备1401,通过该第一掌部交互设备1401拍摄对象自己的手掌,得到该对象的掌部图像,基于对象标识登录支付应用,向应用服务器1402发送掌部图像注册请求,该掌部图像注册请求携带该对象标识及掌部图像。
2、应用服务器1402接收到第一掌部交互设备1401发送的掌部图像注册请求,对掌部图像进行处理,得到该掌部图像的掌部特征,将该掌部特征与该对象标识进行对应存储,向第一掌部交互设备1401发送掌部图像绑定成功通知。
其中,应用服务器1402将掌部特征与对象标识进行对应存储后,将该掌部特征作为预设掌部特征,后续可以通过存储的预设掌部特征,来确定对应的对象标识。
3、第一掌部交互设备1401接收到掌部图像绑定成功通知，显示该掌部图像绑定成功通知，提示对象掌部图像与对象标识绑定成功。
其中,对象通过自己的第一掌部交互设备1401与应用服务器1402之间的交互,完成掌部图像注册,后续可以通过掌部图像快速实现识别,从而控制第一掌部交互设备1401进入人机交互模式,通过悬空交互动作对第一掌部交互设备1401进行控制。
4、对象在其他地方与第二掌部交互设备1403进行人机交互时,第二掌部交互设备1403拍摄该对象的手掌,得到掌部图像,基于第二掌部交互设备1403中登录的支付应用,向应用服务器1402发送支付应用设置请求,该支付应用设置请求携带该第二掌部交互设备1403的标识、悬空交互动作及掌部图像。
5、应用服务器1402接收到支付应用设置请求后,对掌部图像进行掌部对比识别处理,确定该掌部图像的对象标识,基于悬空交互动作完成支付应用设置,在支付应用设置完成后,向第二掌部交互设备1403发送支付应用设置完成通知。
其中,对象在利用第一掌部交互设备1401进行掌部图像注册后,可以直接对第二掌部交互设备1403进行支付应用设置,无需对象在第二掌部交互设备1403上进行掌部图像注册,从而实现了跨设备基于掌部的人机交互的效果,提高了便捷性。
6、第二掌部交互设备1403接收到支付应用设置完成通知后,显示该支付应用设置完成通知,提示对象支付应用设置完成,以使对象知悉第二掌部交互设备1403已设置完成。
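作为一种示意性的草图（Python伪代码，其中的字段名、函数名均为假设，仅用于说明注册请求与设置请求各自携带的信息以及服务器侧的处理顺序，并非本申请限定的实现方式）：

```python
# 示意性草图：跨设备悬空交互中，第一/第二掌部交互设备与应用服务器之间的请求内容。

def build_register_request(object_id, palm_image):
    """第一掌部交互设备发起的掌部图像注册请求，携带对象标识及掌部图像。"""
    return {"type": "palm_register", "object_id": object_id, "palm_image": palm_image}

def build_setting_request(device_id, air_action, palm_image):
    """第二掌部交互设备发起的支付应用设置请求，携带设备标识、悬空交互动作及掌部图像。"""
    return {"type": "app_setting", "device_id": device_id,
            "air_action": air_action, "palm_image": palm_image}

def handle_setting_request(request, preset_features, match_fn):
    """应用服务器侧：先通过掌部对比识别确定对象标识，再依据悬空交互动作完成设置。"""
    object_id = match_fn(request["palm_image"], preset_features)
    if object_id is None:
        return {"ok": False, "reason": "palm_not_recognized"}
    return {"ok": True, "object_id": object_id, "applied_action": request["air_action"]}
```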
示意性的,本申请实施例提供的基于掌部的人机交互功能仅为掌部交互设备中的一种功能,掌部交互设备的应用场景包括但不限于以下场景:
例如,掌部图像识别支付场景下:
商户的掌部交互设备通过拍摄对象的手掌,获取到该对象的掌部图像,采用本申请实施例提供的基于掌部的人机交互方法,确定该掌部图像的目标对象标识,将该目标对象标识对应的资源账户中的部分资源,转入到商户资源账户中,实现通过掌部自动支付。
又如,跨设备支付场景下:
对象可以在家或其他私密空间使用个人手机完成身份注册，将该对象的账号与该对象的掌部图像进行绑定，之后可以到店内设备上对该对象的掌部图像进行识别，确定该对象的账号，通过该账号直接支付。
再例如,上班打卡场景下:
掌部交互设备通过拍摄对象的手掌，获取到该对象的掌部图像，采用本申请实施例提供的基于掌部的人机交互方法，确定该掌部图像的目标对象标识，为该目标对象标识建立打卡标记，确定该目标对象标识在当前时间已完成上班打卡。
当然，除了应用于上述场景外，本申请实施例提供的方法还可以应用于其他需要基于掌部的人机交互的场景，本申请实施例并不对具体的应用场景进行限定。
图15示出了本申请一个示例性实施例提供的掌部交互装置的结构示意图。该装置可以通过软件、硬件或者两者的结合实现成为掌部交互设备的全部或一部分,该装置包括:
获取模块1501,用于获取至少两个接近传感器采集的掌部的传感器数据,至少两个接近传感器在掌部交互设备上阵列设置;
识别模块1502,用于通过至少两个接近传感器采集的掌部的传感器数据,识别掌部的悬空交互动作;
控制模块1503,用于执行悬空交互动作的响应操作。
在一种可能的实现方式中,悬空交互动作包括偏移挥扫动作和远近拍击动作中的至少一种;
控制模块1503,还用于执行偏移挥扫动作的第一响应操作;
控制模块1503,还用于执行远近拍击动作的第二响应操作。
在一种可能的实现方式中,控制模块1503,还用于通过传感器数据中偏移挥扫动作的动作参数值,确定出偏移挥扫动作的挥扫方向;基于挥扫方向,控制掌部交互设备执行第一响应操作。
在一种可能的实现方式中,控制模块1503,还用于响应于掌部进入第一位置区域,将掌部在第一位置区域的位置确定为起点;响应于掌部从第一位置区域移入第二位置区域,将掌部在第二位置区域的位置确定为终点;在掌部从起点移动至终点的时间小于第一时间阈值的情况下,将起点指向终点的方向确定为挥扫方向。
在一种可能的实现方式中,控制模块1503,还用于通过传感器数据中远近拍击动作的动作参数值,确定出远近拍击动作的操作类别;基于操作类别控制掌部交互设备执行第二响应操作。
在一种可能的实现方式中,控制模块1503,还用于响应于掌部进入至少两个接近传感器的第一测量区域内,将掌部进入第一测量区域的时间点确定为第一起始时间点;响应于至少两个第一接近传感器测得的距离测量值在第一时间段内同时减小、且在第二时间段内同时增大或保持不变,确定出远近拍击动作的操作类别为按击操作,其中,第一接近传感器包含于至少两个接近传感器中,第一时间段是以第一起始时间点为始的时间段,第二时间段是以第一时间段的结束时间点为始的时间段,第一时间段大于第二时间段。
在一种可能的实现方式中,控制模块1503,还用于响应于掌部进入至少两个接近传感器的第一测量区域内,将掌部进入第一测量区域的时间点确定为第二起始时间点;响应于至少两个第二接近传感器测得的距离测量值在第一时间段内同时增大,确定出远近拍击动作的操作类别为回退操作,其中,第二接近传感器包含于至少两个接近传感器中,第一时间段是以第二起始时间点为始的时间段。
在一种可能的实现方式中,控制模块1503,还用于响应于至少两个第二接近传感器测得的距离测量值在第一时间段内同时增大、且在第二时间段内同时减小或保持不变,确定出远近拍击动作的操作类别为回退操作,其中,第二时间段是以第一时间段的结束时间点为始的时间段,第一时间段大于第二时间段。
在一种可能的实现方式中,获取模块1501,还用于通过摄像头获取掌部的掌部图像;
识别模块1502,还用于基于掌部图像确定掌部的对象标识;
控制模块1503,还用于在确定对象标识且掌部在接近传感器的第二测量区域停留时间大于停留时间阈值的情况下,进入交互模式。
在一种可能的实现方式中,获取模块1501,还用于通过彩色摄像头获取掌部的彩色图像,彩色图像是指彩色相机基于自然光对掌部成像所得到的图像;
获取模块1501，还用于通过红外摄像头获取同一掌部的红外图像，红外图像是指红外相机基于红外光对掌部成像所得到的图像；
识别模块1502,还用于基于彩色图像和红外图像进行掌部识别处理,确定掌部的对象标识。
在一种可能的实现方式中,显示模块1504,用于在显示屏上显示偏移挥扫动作的第一响应操作;
显示模块1504,用于在显示屏上显示远近拍击动作的第二响应操作。
图16示出了本申请一示例性实施例示出的掌部交互设备1600的结构框图。该掌部交互设备1600可以是智能手机、平板电脑、台式电脑、智能手表、机器人、MP3(Moving Picture Experts Group Audio Layer III,动态影像专家压缩标准音频层面3)播放器、MP4(Moving Picture Experts Group Audio Layer IV,动态影像专家压缩标准音频层面4)播放器等。
该掌部交互设备1600包括有:处理器1601和存储器1602。
处理器1601可以包括一个或多个处理核心，比如4核心处理器、8核心处理器等。处理器1601可以采用DSP（Digital Signal Processing，数字信号处理）、FPGA（Field-Programmable Gate Array，现场可编程门阵列）、PLA（Programmable Logic Array，可编程逻辑阵列）中的至少一种硬件形式来实现。处理器1601也可以包括主处理器和协处理器，主处理器是用于对在唤醒状态下的数据进行处理的处理器，也称CPU（Central Processing Unit，中央处理器）；协处理器是用于对在待机状态下的数据进行处理的低功耗处理器。在一些实施例中，处理器1601中可以集成有GPU（Graphics Processing Unit，图像处理器），GPU用于负责显示屏所需要显示的内容的渲染和绘制。在一些实施例中，处理器1601还可以包括AI（Artificial Intelligence，人工智能）处理器，该AI处理器用于处理有关机器学习的计算操作。
存储器1602可以包括一个或多个计算机可读存储介质,该计算机可读存储介质可以是有形的和非暂态的。存储器1602还可包括高速随机存取存储器,以及非易失性存储器,比如一个或多个磁盘存储设备、闪存存储设备。在一些实施例中,存储器1602中的非暂态的计算机可读存储介质用于存储至少一个指令,该至少一个指令用于被处理器1601所执行以实现本申请中提供的基于掌部的人机交互方法。
在一些实施例中,掌部交互设备1600还可选包括有:外围设备接口1603和至少一个外围设备。具体地,外围设备包括:显示屏1604、摄像头组件1605中的至少一种。
外围设备接口1603可被用于将I/O(Input/Output,输入/输出)相关的至少一个外围设备连接到处理器1601和存储器1602。在一些实施例中,处理器1601、存储器1602和外围设备接口1603被集成在同一芯片或电路板上;在一些其他实施例中,处理器1601、存储器1602和外围设备接口1603中的任意一个或两个可以在单独的芯片或电路板上实现,本实施例对此不加以限定。
显示屏1604用于显示UI（User Interface，用户界面）。该UI可以包括图形、文本、图标、视频及其它们的任意组合。可选地，显示屏1604可以是触摸显示屏，触摸显示屏还具有采集在触摸显示屏的表面或表面上方的触摸信号的能力。该触摸信号可以作为控制信号输入至处理器1601进行处理。触摸显示屏用于提供虚拟按钮和/或虚拟键盘，也称软按钮和/或软键盘。在一些实施例中，显示屏1604可以为一个，设置在掌部交互设备1600的前面板上；在另一些实施例中，显示屏1604可以为至少两个，分别设置在掌部交互设备1600的不同表面或呈折叠设计；在另一些实施例中，显示屏1604可以是柔性显示屏，设置在掌部交互设备1600的弯曲表面上或折叠面上。甚至，显示屏1604还可以设置成非矩形的不规则图形，也即异形屏。显示屏1604可以采用LCD（Liquid Crystal Display，液晶显示器）、OLED（Organic Light-Emitting Diode，有机发光二极管）等材质制备。
摄像头组件1605用于采集图像或视频。可选地,摄像头组件1605包括前置摄像头和后置摄像头。通常,前置摄像头用于实现视频通话或自拍,后置摄像头用于实现照片或视频的拍摄。在一些实施例中,后置摄像头为至少两个,分别为主摄像头、景深摄像头、广角摄像头中的任意一种,以实现主摄像头和景深摄像头融合实现背景虚化功能,主摄像头和广角摄像头融合实现全景拍摄以及VR(Virtual Reality,虚拟现实)拍摄功能。在一些实施例中,摄像头组件1605还可以包括闪光灯。闪光灯可以是单色温闪光灯,也可以是双色温闪光灯。双色温闪光灯是指暖光闪光灯和冷光闪光灯的组合,可以用于不同色温下的光线补偿。在另一些实施例中,摄像头可以包括彩色摄像头和红外摄像头。
在一些实施例中,掌部交互设备1600还包括有一个或多个传感器1606。该一个或多个传感器1606包括但不限于:接近传感器1607。
接近传感器1607,也称距离传感器,通常设置在掌部交互设备1600的正面。接近传感器1607用于采集用户与掌部交互设备1600的正面之间的距离。在一个实施例中,掌部交互设备1600上阵列设置有至少两个接近传感器1607。比如,在有两个接近传感器的情况下,两个接近传感器以左上、右下的位置排布;或者,两个接近传感器以左下、右上的位置排布;或者,两个接近传感器以上下的位置排布;或者,两个接近传感器以左右的位置排布。示例性地,如图17所示,2个接近传感器分别为左接近传感器1701和右接近传感器1702。
又如，在有三个接近传感器的情况下，三个接近传感器以品字形的位置排布；或者，三个接近传感器以左上、左下、右上的位置排布；或者，三个接近传感器以左上、右上、右下的位置排布；或者，三个接近传感器以左上、左下、右下的位置排布；或者，三个接近传感器以左下、右上、右下的位置排布。示例性地，如图18所示，3个接近传感器分别为上传感器1801、左下传感器1802和右下传感器1803。比如，以品字形排布3个接近传感器的情形，左右方向上的偏移挥扫动作由左下接近传感器和右下接近传感器来识别，上下方向上的偏移挥扫动作由上接近传感器、左下接近传感器和右下接近传感器来识别。
又如,在有四个接近传感器的情况下,四个接近传感器以左上、左下、右上、右下位置矩形排布;或,四个接近传感器以上、下、左、右位置菱形排布。示例性地,如图1所示,4个接近传感器分别为左上接近传感器101、左下接近传感器102、右上接近传感器103、右下接近传感器104。
需要说明的是，接近传感器的排布包括但不限于上述举例，本申请实施例对此不作具体限定。
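结合上述两个、三个、四个接近传感器的排布示例，下面给出一个示意性的配置草图（Python伪代码，排布名称与分组关系均为按上文示例假设的取法，并非本申请限定的实现方式），用以说明识别不同方向的偏移挥扫动作时各自参照的传感器分组：

```python
# 示意性草图：不同接近传感器排布下，识别左右/上下偏移挥扫动作时参与的传感器分组。
SENSOR_LAYOUTS = {
    # 两个接近传感器：假设以左右排布，仅识别左右方向的偏移挥扫动作
    "two_left_right": {"horizontal": (["left"], ["right"]), "vertical": None},
    # 三个接近传感器：品字形排布
    "three_triangle": {
        "horizontal": (["bottom_left"], ["bottom_right"]),
        "vertical": (["top"], ["bottom_left", "bottom_right"]),
    },
    # 四个接近传感器：左上、左下、右上、右下矩形排布
    "four_rect": {
        "horizontal": (["top_left", "bottom_left"], ["top_right", "bottom_right"]),
        "vertical": (["top_left", "top_right"], ["bottom_left", "bottom_right"]),
    },
}

def sensors_for_swipe(layout, axis):
    """返回识别某一方向偏移挥扫动作时，作为起点侧与终点侧参照的传感器分组。"""
    groups = SENSOR_LAYOUTS[layout][axis]
    if groups is None:
        raise ValueError(f"layout {layout} 不支持 {axis} 方向的偏移挥扫识别")
    return groups
```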
本领域技术人员可以理解，图16中示出的结构并不构成对掌部交互设备1600的限定，可以包括比图示更多或更少的组件，或者组合某些组件，或者采用不同的组件布置。
本申请实施例还提供一种掌部交互设备,该掌部交互设备包括处理器、存储器和至少两个接近传感器;至少两个接近传感器采集掌部的传感器数据,且将上述掌部的传感器数据存储于存储器中;该存储器中存储有至少一条程序,该至少一条程序由处理器加载并执行以实现上述各方法实施例提供的基于掌部的人机交互方法。
本申请实施例还提供一种计算机可读存储介质,该存储介质中存储有至少一条计算机程序,该至少一条计算机程序由处理器加载并执行以实现上述各方法实施例提供的基于掌部的人机交互方法。
本申请实施例还提供一种计算机程序产品,所述计算机程序产品包括计算机程序,所述计算机程序存储在计算机可读存储介质中;所述计算机程序由掌部交互设备的处理器从所述计算机可读存储介质读取并执行,使得所述掌部交互设备执行以实现上述各方法实施例提供的基于掌部的人机交互方法。

Claims (20)

  1. 一种基于掌部的人机交互方法,所述方法应用于掌部交互设备,所述方法包括:
    获取至少两个接近传感器采集的所述掌部的传感器数据,所述至少两个接近传感器在所述掌部交互设备上阵列设置;
    通过所述至少两个接近传感器采集的所述掌部的传感器数据,识别所述掌部的悬空交互动作;
    执行所述悬空交互动作的响应操作。
  2. 根据权利要求1所述的方法,所述悬空交互动作包括偏移挥扫动作和远近拍击动作中的至少一种;
    所述执行所述悬空交互动作的响应操作,包括:
    执行所述偏移挥扫动作的第一响应操作;
    执行所述远近拍击动作的第二响应操作。
  3. 根据权利要求2所述的方法,所述执行所述偏移挥扫动作的第一响应操作,包括:
    通过所述传感器数据中所述偏移挥扫动作的动作参数值,确定出所述偏移挥扫动作的挥扫方向;
    基于所述挥扫方向,执行所述第一响应操作。
  4. 根据权利要求3所述的方法,所述通过所述传感器数据中所述偏移挥扫动作的动作参数值,确定出所述偏移挥扫动作的挥扫方向,包括:
    响应于所述掌部进入第一位置区域,将所述掌部在所述第一位置区域的位置确定为起点;
    响应于所述掌部从所述第一位置区域移入第二位置区域,将所述掌部在所述第二位置区域的位置确定为终点;
    在所述掌部从所述起点移动至所述终点的时间小于第一时间阈值的情况下,将所述起点指向所述终点的方向确定为所述挥扫方向。
  5. 根据权利要求2至4任一所述的方法,所述执行所述远近拍击动作的第二响应操作,包括:
    通过所述传感器数据中所述远近拍击动作的动作参数值,确定出所述远近拍击动作的操作类别;
    基于所述操作类别,执行所述第二响应操作。
  6. 根据权利要求5所述的方法,所述通过所述传感器数据中所述远近拍击动作的动作参数值,确定出所述远近拍击动作的操作类别,包括:
    响应于所述掌部进入所述至少两个接近传感器的第一测量区域内,将所述掌部进入所述第一测量区域的时间点确定为第一起始时间点;
    响应于至少两个第一接近传感器测得的距离测量值在第一时间段内同时减小、且在第二时间段内同时增大或保持不变,确定出所述远近拍击动作的操作类别为按击操作,其中,所述第一接近传感器包含于所述至少两个接近传感器中,所述第一时间段是以所述第一起始时间点为始的时间段,所述第二时间段是以所述第一时间段的结束时间点为始的时间段,所述第一时间段大于所述第二时间段。
  7. 根据权利要求5所述的方法，所述通过所述传感器数据中所述远近拍击动作的动作参数值，确定出所述远近拍击动作的操作类别，包括：
    响应于所述掌部进入所述至少两个接近传感器的第一测量区域内,将所述掌部进入所述第一测量区域的时间点确定为第二起始时间点;
    响应于至少两个第二接近传感器测得的距离测量值在第一时间段内同时增大,确定出所述远近拍击动作的操作类别为回退操作,其中,所述第二接近传感器包含于所述至少两个接近传感器中,所述第一时间段是以所述第二起始时间点为始的时间段。
  8. 根据权利要求7所述的方法,所述响应于至少两个第二接近传感器测得的距离测量值在第一时间段内同时增大,确定出所述远近拍击动作的操作类别为回退操作,包括:
    响应于所述至少两个第二接近传感器测得的距离测量值在所述第一时间段内同时增大、且在第二时间段内同时减小或保持不变,确定出所述远近拍击动作的操作类别为所述回退操作,其中,所述第二时间段是以所述第一时间段的结束时间点为始的时间段,所述第一时间段大于所述第二时间段。
  9. 根据权利要求1至8任一所述的方法,所述掌部交互设备还包括摄像头;所述方法还包括:
    通过所述摄像头获取所述掌部的掌部图像;
    基于所述掌部图像确定所述掌部的对象标识;
    在确定所述对象标识且所述掌部在所述接近传感器的第二测量区域停留时间大于停留时间阈值的情况下,进入交互模式。
  10. 根据权利要求9所述的方法,所述摄像头包括彩色摄像头和红外摄像头;
    所述通过所述摄像头获取所述掌部的掌部图像,包括:
    通过所述彩色摄像头获取所述掌部的彩色图像,所述彩色图像是指彩色相机基于自然光对所述掌部成像所得到的图像;
    通过所述红外摄像头获取同一所述掌部的红外图像,所述红外图像是指红外相机基于红外光对所述掌部成像所得到的图像;
    所述基于所述掌部图像确定所述掌部的对象标识,包括:
    基于所述彩色图像和所述红外图像进行掌部的识别处理,确定所述掌部的所述对象标识。
  11. 根据权利要求2至8任一所述的方法,所述掌部交互设备还包括显示屏;所述方法还包括:
    在所述显示屏上显示所述偏移挥扫动作的所述第一响应操作;
    在所述显示屏上显示所述远近拍击动作的所述第二响应操作。
  12. 一种掌部交互装置,所述装置包括:
    获取模块,用于获取至少两个接近传感器采集的掌部的传感器数据,所述至少两个接近传感器在掌部交互设备上阵列设置;
    识别模块,用于通过所述至少两个接近传感器采集的所述掌部的传感器数据,识别掌部的悬空交互动作;
    控制模块,用于执行所述悬空交互动作的响应操作。
  13. 根据权利要求12所述的装置,所述悬空交互动作包括偏移挥扫动作和远近拍击动作中的至少一种;
    所述控制模块,用于执行所述偏移挥扫动作的第一响应操作;
    所述控制模块,用于执行所述远近拍击动作的第二响应操作。
  14. 根据权利要求13所述的装置,所述控制模块,用于:
    通过所述传感器数据中所述偏移挥扫动作的动作参数值,确定出所述偏移挥扫动作的挥扫方向;
    基于所述挥扫方向,执行所述第一响应操作。
  15. 根据权利要求13或14所述的装置,所述控制模块,用于:
    通过所述传感器数据中所述远近拍击动作的动作参数值,确定出所述远近拍击动作的操作类别;
    基于所述操作类别,执行所述第二响应操作。
  16. 根据权利要求12至15任一所述的装置,所述掌部交互设备还包括摄像头;
    所述获取模块,还用于通过所述摄像头获取所述掌部的掌部图像;
    所述识别模块,还用于基于所述掌部图像确定所述掌部的对象标识;
    所述控制模块,还用于在确定所述对象标识且所述掌部在所述接近传感器的第二测量区域停留时间大于停留时间阈值的情况下,进入交互模式。
  17. 根据权利要求13至15任一所述的装置,所述掌部交互设备还包括显示屏;所述装置还包括:
    显示模块,用于在所述显示屏上显示所述偏移挥扫动作的所述第一响应操作;
    所述显示模块,用于在所述显示屏上显示所述远近拍击动作的所述第二响应操作。
  18. 一种掌部交互设备,所述掌部交互设备包括:处理器、存储器和至少两个接近传感器;所述至少两个接近传感器采集掌部的传感器数据,且将所述掌部的传感器数据存储于所述存储器中;所述存储器中还存储有至少一条计算机程序,所述至少一条计算机程序由所述处理器加载并执行以实现如权利要求1至11中任一项所述的基于掌部的人机交互方法。
  19. 一种计算机可读存储介质，所述计算机可读存储介质中存储有至少一条计算机程序，所述至少一条计算机程序由处理器加载并执行以实现如权利要求1至11中任一项所述的基于掌部的人机交互方法。
  20. 一种计算机程序产品,所述计算机程序产品包括计算机程序,所述计算机程序存储在计算机可读存储介质中;所述计算机程序由掌部交互设备的处理器从所述计算机可读存储介质读取并执行,使得所述掌部交互设备执行如权利要求1至11中任一项所述的基于掌部的人机交互方法。
PCT/CN2023/117199 2022-09-29 2023-09-06 基于掌部的人机交互方法、装置、设备、介质及程序产品 WO2024066977A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211196351.5 2022-09-29
CN202211196351.5A CN117826978A (zh) 2022-09-29 2022-09-29 基于掌部的人机交互方法、装置、设备、介质及程序产品

Publications (1)

Publication Number Publication Date
WO2024066977A1 true WO2024066977A1 (zh) 2024-04-04


Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/117199 WO2024066977A1 (zh) 2022-09-29 2023-09-06 基于掌部的人机交互方法、装置、设备、介质及程序产品

Country Status (2)

Country Link
CN (1) CN117826978A (zh)
WO (1) WO2024066977A1 (zh)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080168403A1 (en) * 2007-01-06 2008-07-10 Appl Inc. Detecting and interpreting real-world and security gestures on touch and hover sensitive devices
US20090139778A1 (en) * 2007-11-30 2009-06-04 Microsoft Corporation User Input Using Proximity Sensing
WO2014085277A1 (en) * 2012-11-27 2014-06-05 Neonöde Inc. Light-based touch controls on a steering wheel and dashboard
US9602806B1 (en) * 2013-06-10 2017-03-21 Amazon Technologies, Inc. Stereo camera calibration using proximity data
US20180059784A1 (en) * 2016-08-23 2018-03-01 International Business Machines Corporation Remote Control Via Proximity Data
CN113515987A (zh) * 2020-07-09 2021-10-19 腾讯科技(深圳)有限公司 掌纹识别方法、装置、计算机设备及存储介质


Also Published As

Publication number Publication date
CN117826978A (zh) 2024-04-05
