CN108874141B - Somatosensory browsing method and device - Google Patents


Info

Publication number
CN108874141B
Authority
CN
China
Prior art keywords
human body
display screen
hand
image
motion
Prior art date
Legal status
Active
Application number
CN201810661716.4A
Other languages
Chinese (zh)
Other versions
CN108874141A (en)
Inventor
赵涛涛
Current Assignee
JD Digital Technology Holdings Co Ltd
Jingdong Technology Holding Co Ltd
Original Assignee
JD Digital Technology Holdings Co Ltd
Priority date
Filing date
Publication date
Application filed by JD Digital Technology Holdings Co Ltd
Priority to CN201810661716.4A
Publication of CN108874141A
Application granted
Publication of CN108874141B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0485 Scrolling or panning

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a somatosensory browsing method and apparatus, relating to the field of computer technology. In one embodiment, the method comprises: if a human body is detected within a preset recognition range, recognizing a hand feature of the human body; acquiring the motion trajectory of the hand feature and mapping it onto a display screen; and determining, from the motion trajectory, the direction in which the currently displayed image should move on the display screen, so as to move the currently displayed image on the display screen. This embodiment solves the problems of unsmooth interaction and a poor sense of immersion.

Description

Somatosensory browsing method and device
Technical Field
The invention relates to the field of computer technology, and in particular to a somatosensory browsing method and apparatus.
Background
Currently, there are three main panorama-browsing approaches:
1) panoramic browsing on a mobile terminal, where scenes in all directions within a spherical range are viewed by touching and dragging the screen with a finger;
2) panoramic browsing on a large touch screen, likewise driven by touching and dragging the screen with a finger;
3) panoramic browsing on a non-touch web client, where scenes in all directions within a spherical range are viewed by dragging the mouse or operating the keyboard.
In the course of implementing the invention, the inventor found at least the following problems in the prior art:
1) mobile-terminal panoramic browsing suffers from a small screen, lacks a sense of presence, cannot deliver a good experience, and offers nothing novel in its interaction; 2) large-touch-screen panoramic browsing carries a high display cost and an unremarkable interaction mode; 3) non-touch web panoramic browsing is inconvenient, not smooth, and lacks a sense of presence.
Disclosure of Invention
In view of this, embodiments of the present invention provide a somatosensory browsing method and apparatus, which can solve the problems of unsmooth interaction and a poor sense of immersion.
In order to achieve the above object, according to an aspect of an embodiment of the present invention, there is provided a somatosensory browsing method including:
if a human body is detected within a preset recognition range, recognizing a hand feature of the human body;
acquiring a motion trajectory of the hand feature of the human body, and mapping the motion trajectory onto a display screen;
and determining, according to the motion trajectory of the hand feature, the direction in which the currently displayed image on the display screen should move, so as to move the currently displayed image on the display screen.
Optionally, determining the moving direction of the currently displayed image on the display screen according to the motion trajectory of the hand feature of the human body comprises:
determining a motion distance and a motion direction of the hand feature according to its motion trajectory;
and determining the moving distance and moving direction of the currently displayed image on the display screen according to the motion distance and motion direction of the hand feature.
Optionally, the method further comprises:
if it is detected that the hand feature of the human body has been mapped onto a preset area of the display screen for longer than a time threshold, triggering a jump instruction corresponding to the preset area; and/or,
if it is detected that the hand feature of the human body mapped onto the preset area of the display screen has changed, triggering a jump instruction corresponding to the preset area.
Optionally, the method further comprises:
if no human body is detected within the preset recognition range, displaying the image on the display screen and moving it at a preset speed in a preset direction.
Optionally, the hand feature comprises a fist feature, and the image comprises a panoramic image.
In addition, according to another aspect of the embodiments of the present invention, there is provided a somatosensory browsing apparatus including:
a recognition module, configured to recognize a hand feature of a human body if the human body is detected within a preset recognition range;
a mapping module, configured to acquire the motion trajectory of the hand feature of the human body and map it onto a display screen;
and a moving module, configured to determine, from the motion trajectory of the hand feature, the moving direction of the currently displayed image on the display screen, so as to move the currently displayed image on the display screen.
Optionally, determining the moving direction of the currently displayed image on the display screen according to the motion trajectory of the hand feature of the human body comprises:
determining a motion distance and a motion direction of the hand feature according to its motion trajectory;
and determining the moving distance and moving direction of the currently displayed image on the display screen according to the motion distance and motion direction of the hand feature.
Optionally, the apparatus further comprises a triggering module configured to:
if it is detected that the hand feature of the human body has been mapped onto a preset area of the display screen for longer than a time threshold, triggering a jump instruction; and/or,
if it is detected that the hand feature of the human body mapped onto the preset area of the display screen has changed, triggering a jump instruction.
Optionally, the recognition module is further configured to:
display the image on the display screen and move it at a preset speed in a preset direction if no human body is detected within the preset recognition range.
Optionally, the hand feature comprises a fist feature, and the image comprises a panoramic image.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including:
one or more processors;
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of any of the embodiments described above.
According to another aspect of the embodiments of the present invention, there is also provided a computer readable medium, on which a computer program is stored, which when executed by a processor implements the method of any of the above embodiments.
One embodiment of the above invention has the following advantages: because the moving direction of the currently displayed image is determined from the motion trajectory of the user's hand feature, and the image is moved on the display screen accordingly, the technical problems of unsmooth interaction and a poor sense of immersion are solved. The user interacts through a motion sensing device: the motion trajectory of the hand feature is acquired, the moving direction of the currently displayed image is determined from that trajectory, and the image is moved on the screen. The interaction is therefore novel, smooth, and easy to learn; the user gets a better and more immersive experience; the panoramic image can be shown on a large screen; and cost is kept as low as possible.
Further effects of the above-mentioned non-conventional alternatives will be described below in connection with the embodiments.
Drawings
The drawings are included to provide a better understanding of the invention and are not to be construed as unduly limiting the invention. Wherein:
fig. 1 is a schematic diagram of a main flow of a somatosensory browsing method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a display module, a motion sensing module and a data processing module constructed according to an embodiment of the invention;
fig. 3 is a schematic diagram of a main flow of a somatosensory browsing method according to a reference embodiment of the invention;
fig. 4 is a schematic diagram of main blocks of a somatosensory browsing device according to an embodiment of the present invention;
FIG. 5 is an exemplary system architecture diagram in which embodiments of the present invention may be employed;
fig. 6 is a schematic block diagram of a computer system suitable for use in implementing a terminal device or server of an embodiment of the invention.
Detailed Description
Exemplary embodiments of the present invention are described below with reference to the accompanying drawings, in which various details of embodiments of the invention are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the invention. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a schematic diagram of a main flow of a somatosensory browsing method according to an embodiment of the present invention. As shown in fig. 1, as an embodiment of the present invention, the somatosensory browsing method may include:
step 101, if a human body is detected within a preset recognition range, recognizing a hand feature of the human body.
In this step, somatosensory data can be acquired through the motion sensing device, and whether a human body is present within the preset recognition range is judged from that data; if so, a hand feature of the human body is recognized. The hand feature may be a fist feature, a palm feature, a finger feature, and the like.
Optionally, the motion sensing device may be a Kinect, the name Microsoft formally announced on June 14, 2010 for its Xbox 360 motion sensing peripheral. The Kinect is a 3D somatosensory camera with functions including real-time motion capture, image recognition, microphone input, voice recognition, and community interaction.
Implementing the invention requires a display module, a motion sensing module, and a data processing module. Specifically, the display module may be a display screen, the motion sensing module may be a motion sensing device such as a Kinect, and the data processing module may be a processor; the setup is shown in fig. 2. During setup, the recognition range of the Kinect is confirmed and, through repeated testing and adjustment of position and angle, the device is placed where most people can operate it comfortably.
The Kinect supports two modes: a near mode (half-body) and a default mode (full-body). Near mode depth range: physical limit 0.4 m to 3 m; comfort zone 0.8 m to 2.5 m. Default mode depth range: physical limit 0.8 m to 4 m (depths beyond 4 m can still be detected, but skeleton and player tracking grow noisier with distance and may become unreliable); comfort zone 1.2 m to 3.5 m. Field of view (depth and RGB): 57.5 degrees horizontal, 43.5 degrees vertical, with the sensor tiltable up or down within plus or minus 27 degrees.
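The mode parameters above can be captured in a small helper that checks whether a detected user stands inside the usable range. This is an illustrative sketch using the figures quoted above, not a call into the actual Kinect SDK:

```python
# Comfort-zone and physical-limit checks for the two Kinect tracking
# modes described above. Thresholds mirror the figures in the text;
# the function names are illustrative, not part of any SDK.

COMFORT_ZONE = {
    "near": (0.8, 2.5),     # half-body mode, metres
    "default": (1.2, 3.5),  # full-body mode, metres
}

PHYSICAL_LIMIT = {
    "near": (0.4, 3.0),
    "default": (0.8, 4.0),
}

def in_comfort_zone(depth_m: float, mode: str = "default") -> bool:
    """Return True if a user at depth_m metres is in the comfort zone."""
    lo, hi = COMFORT_ZONE[mode]
    return lo <= depth_m <= hi

def trackable(depth_m: float, mode: str = "default") -> bool:
    """Return True if the user is at least within the physical limit."""
    lo, hi = PHYSICAL_LIMIT[mode]
    return lo <= depth_m <= hi
```

A setup step could warn the user to step closer or back until `in_comfort_zone` holds, which matches the repeated position testing described above.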
Optionally, the method further comprises: if no human body is detected within the preset recognition range, displaying the image on the display screen and moving it at a preset speed in a preset direction. In some embodiments of the present invention, if a human body is detected within the preset recognition range in front of the display screen, the hand feature of the human body is recognized from the somatosensory data acquired by the motion sensing device; if not, the panoramic image is displayed on the screen and moved at a preset speed in a preset direction. Note that by presetting the moving speed and direction, the panoramic image keeps moving even when no human body is detected, which improves the experience of users watching the panorama.
Before implementing the embodiment of the invention, a panoramic image needs to be produced: images are captured with a single-lens reflex camera or a professional panoramic camera and stitched into a panorama.
And 102, acquiring a motion track of the hand characteristics of the human body, and mapping the motion track to a display screen.
After a human body is determined to be within the preset range, the motion trajectory of its hand feature is recognized and acquired, and the trajectory is mapped onto the display screen by computing display-screen coordinates, so that the user can see the trajectory intuitively and adjust the hand's motion in time.
Optionally, the hand feature of the human body can also be rendered, with the motion trajectory mapped onto the display screen in the form of that rendered feature. For example, when the hand feature is a fist feature and the hand moves while clenched, the captured trajectory of the clenched hand drives a rendered fist on the screen. The on-screen hand feature stays consistent with the real hand feature, and their motion trajectories stay consistent as well, which enhances the user's sense of immersion.
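The mapping from hand position to display-screen coordinates can be sketched as a linear rescaling. The normalized [-1, 1] input range here is an assumed convention for illustration; the actual range depends on the sensor API:

```python
def hand_to_screen(hand_x: float, hand_y: float,
                   screen_w: int, screen_h: int) -> tuple[int, int]:
    """Map a hand position in normalized sensor space ([-1, 1] on each
    axis, an assumed convention) to pixel coordinates on the display.
    The sensor's y axis points up while screen y grows downward, so the
    vertical axis is flipped."""
    px = int((hand_x + 1.0) / 2.0 * (screen_w - 1))
    py = int((1.0 - hand_y) / 2.0 * (screen_h - 1))
    # Clamp so jitter at the edge of the recognition range stays on screen.
    px = max(0, min(screen_w - 1, px))
    py = max(0, min(screen_h - 1, py))
    return px, py
```

The rendered fist can then be drawn at the returned pixel position each frame, so the on-screen feature tracks the real hand.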
Optionally, before step 102, the method may further comprise: displaying instructions for use on the display screen. In general, the instructions may explain how the user triggers image movement, how to control the moving direction, how to trigger the next scene, how to exit, and so on; the embodiments of the present invention do not limit this.
And 103, determining the moving direction of the current display image on the display screen according to the motion track of the hand features of the human body, so as to move the current display image on the display screen.
Optionally, in this step, the moving direction of the panoramic image currently displayed on the display screen is calculated in real time from the motion trajectory of the hand feature of the human body.
In yet another embodiment of the present invention, determining the moving direction of the currently displayed image on the display screen from the motion trajectory of the hand feature comprises: determining the motion distance and motion direction of the hand feature from its motion trajectory; and then determining the moving distance and moving direction of the currently displayed image accordingly. In this embodiment, the motion distance and direction of the hand feature are calculated in real time from the trajectory acquired in real time, and the moving distance and direction of the currently displayed image are determined from them, so that the image is moved on the display screen.
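The per-frame computation described here, deriving a motion distance and direction from two consecutive trajectory points and turning them into a pan of the panorama, could look roughly like this (the gain factor, panorama dimensions, and horizontal wrap-around are illustrative assumptions):

```python
import math

def pan_from_trajectory(prev, curr, view_x, view_y,
                        gain=1.5, pano_w=8192, pano_h=4096):
    """Compute the image pan from two consecutive mapped hand positions.

    prev, curr: (x, y) screen positions of the hand feature.
    view_x, view_y: current top-left offset of the viewport in the panorama.
    Returns the new viewport offset plus the hand's motion distance.
    """
    dx = curr[0] - prev[0]
    dy = curr[1] - prev[1]
    # Horizontal axis wraps, since a spherical panorama is continuous
    # left-to-right; the vertical offset is clamped to the image height.
    new_x = (view_x - gain * dx) % pano_w
    new_y = max(0.0, min(pano_h, view_y - gain * dy))
    distance = math.hypot(dx, dy)   # motion distance of the hand feature
    return new_x, new_y, distance
```

Subtracting the delta makes the scene follow the drag direction, so pulling the fist to the right reveals content to the left, like dragging a physical surface.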
In yet another embodiment of the present invention, the rendered fist feature mapped onto the display screen moves together with the currently displayed panoramic image, so that the fist appears to drag the panorama and the user can view the scene in every direction within the sphere.
Optionally, the method may further comprise: if it is detected that the hand feature of the human body has been mapped onto a preset area of the display screen for longer than a time threshold, triggering a jump instruction corresponding to the preset area; and/or, if a change of the hand feature mapped onto the preset area is detected, triggering a jump instruction corresponding to the preset area.
Specifically, in one embodiment, if it is detected that the fist feature has been mapped onto a preset area of the display screen (such as a specific icon, e.g. "next scene") for longer than a time threshold, a jump instruction is triggered and the next scene is entered. Alternatively, if a change of the fist feature mapped onto the preset area is detected, such as the sequence open, fist, open, fist, a jump instruction is triggered and the next scene is entered.
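The dwell-time and gesture-change triggers described here can be sketched as a small per-hotspot state machine. The threshold value, region representation, and hand-state labels are illustrative assumptions:

```python
import time

class JumpTrigger:
    """Fire a scene-jump when the mapped hand feature dwells on a preset
    region longer than a threshold, or when the hand state changes
    (e.g. open -> fist) inside the region, mimicking a mouse click."""

    def __init__(self, region, dwell_threshold=2.0, clock=time.monotonic):
        self.region = region              # (x, y, w, h) of the hotspot
        self.dwell_threshold = dwell_threshold
        self.clock = clock                # injectable clock, eases testing
        self._entered_at = None
        self._last_state = None

    def _inside(self, px, py):
        x, y, w, h = self.region
        return x <= px < x + w and y <= py < y + h

    def update(self, px, py, hand_state):
        """Feed one frame; return True when a jump should fire."""
        fire = False
        if self._inside(px, py):
            now = self.clock()
            if self._entered_at is None:
                self._entered_at = now          # just entered the region
            elif now - self._entered_at >= self.dwell_threshold:
                fire = True                     # dwell threshold exceeded
                self._entered_at = None         # re-arm after firing
            # A state change inside the region also fires.
            if self._last_state is not None and hand_state != self._last_state:
                fire = True
                self._entered_at = None
        else:
            self._entered_at = None             # left the region: reset
        self._last_state = hand_state
        return fire
```

One `JumpTrigger` per on-screen icon ("next scene", "exit", and so on) keeps the dwell timers independent.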
In effect, in the embodiment of the invention the hand feature of the human body emulates a mouse click event, which improves the realism of the human-computer interaction and markedly improves the user experience.
According to the various embodiments above, determining the moving direction of the currently displayed image from the motion trajectory of the user's hand feature, and moving the image on the display screen accordingly, solves the problems of unsmooth interaction and a poor sense of immersion. That is, the prior art lacked a sense of presence, delivered a mediocre experience, and offered interaction that was neither smooth nor novel. Here, the motion trajectory of the hand feature is acquired through the motion sensing device, the moving direction of the currently displayed image is determined from it, and the image is moved on the screen. The interaction is therefore novel, smooth, and easy to learn; the user gets a better and more immersive experience; the panoramic image can be shown on a large screen; and cost is kept as low as possible.
Fig. 3 is a schematic diagram of the main flow of a somatosensory browsing method according to a reference embodiment of the present invention. The method may include:
step 301, judging whether a human body is detected within a preset recognition range; if yes, proceeding to step 302; if not, proceeding to step 306;
step 302, obtaining a motion track of the hand characteristics of the human body, and mapping the motion track to a display screen;
step 303, determining a movement distance and a movement direction of the hand features of the human body according to the movement track of the hand features of the human body;
step 304, determining the moving distance and the moving direction of the panoramic image currently displayed on the display screen according to the moving distance and the moving direction of the hand features of the human body;
step 305, moving the currently displayed panoramic image on a display screen;
step 306, displaying the panoramic image on the display screen, and moving the image at a preset speed and in a preset direction.
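Steps 301 to 306 can be sketched as one iteration of a frame loop. `sensor` and `screen` are hypothetical placeholder interfaces for illustration, not a real Kinect or rendering API:

```python
def browse_frame(sensor, screen, state):
    """One iteration of the flow in steps 301-306. `state` carries the
    viewport offset and the previous mapped hand position between frames."""
    body = sensor.detect_body()                      # step 301
    if body is None:
        # Step 306: no user, so auto-pan at a preset speed and direction.
        state["view_x"] = (state["view_x"] + state["auto_speed"]) % state["pano_w"]
        state["prev_hand"] = None
        screen.draw(state["view_x"], state["view_y"])
        return
    hand = sensor.hand_position(body)                # step 302: map the hand
    if state["prev_hand"] is not None:               # steps 303-304
        dx = hand[0] - state["prev_hand"][0]
        dy = hand[1] - state["prev_hand"][1]
        state["view_x"] = (state["view_x"] - dx) % state["pano_w"]
        state["view_y"] = max(0, min(state["pano_h"], state["view_y"] - dy))
    state["prev_hand"] = hand
    screen.draw(state["view_x"], state["view_y"])    # step 305
```

The data processing module would call this once per sensor frame; on the first frame with a tracked hand there is no previous position, so no pan is applied.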
In addition, the specific implementation of the somatosensory browsing method in this reference embodiment has already been described in detail above, so the repeated content is not described again.
Fig. 4 is a schematic diagram of the main modules of a somatosensory browsing apparatus according to an embodiment of the present invention. As shown in fig. 4, the somatosensory browsing apparatus 400 comprises a recognition module 401, a mapping module 402, and a moving module 403. The recognition module 401 is configured to recognize a hand feature of a human body if the human body is detected within a preset recognition range; the mapping module 402 is configured to acquire the motion trajectory of the hand feature and map it onto a display screen; the moving module 403 is configured to determine, from the motion trajectory, the moving direction of the currently displayed image on the display screen, so as to move the currently displayed image.
Optionally, determining the moving direction of the currently displayed image on the display screen according to the motion trajectory of the hand feature of the human body comprises:
determining a motion distance and a motion direction of the hand feature according to its motion trajectory;
and determining the moving distance and moving direction of the currently displayed image on the display screen according to the motion distance and motion direction of the hand feature.
Optionally, the apparatus further comprises a triggering module configured to:
if it is detected that the hand feature of the human body has been mapped onto a preset area of the display screen for longer than a time threshold, triggering a jump instruction; and/or,
if it is detected that the hand feature of the human body mapped onto the preset area of the display screen has changed, triggering a jump instruction.
Optionally, the recognition module is further configured to:
display the image on the display screen and move it at a preset speed in a preset direction if no human body is detected within the preset recognition range.
Optionally, the hand feature comprises a fist feature, and the image comprises a panoramic image.
According to the various embodiments above, determining the moving direction of the currently displayed image from the motion trajectory of the user's hand feature, and moving the image on the display screen accordingly, solves the problems of unsmooth interaction and a poor sense of immersion. That is, the prior art lacked a sense of presence, delivered a mediocre experience, and offered interaction that was neither smooth nor novel. Here, the motion trajectory of the hand feature is acquired through the motion sensing device, the moving direction of the currently displayed image is determined from it, and the image is moved on the screen. The interaction is therefore novel, smooth, and easy to learn; the user gets a better and more immersive experience; the panoramic image can be shown on a large screen; and cost is kept as low as possible.
The embodiments of the motion-sensing browsing device according to the present invention have been described in detail in the above-described motion-sensing browsing method, and therefore, the description thereof will not be repeated here.
Fig. 5 illustrates an exemplary system architecture 500 to which the somatosensory browsing method or the somatosensory browsing apparatus of an embodiment of the invention may be applied.
As shown in fig. 5, the system architecture 500 may include terminal devices 501, 502, 503, a network 504, and a server 505. The network 504 serves to provide a medium for communication links between the terminal devices 501, 502, 503 and the server 505. Network 504 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 501, 502, 503 to interact with a server 505 over a network 504 to receive or send messages or the like. The terminal devices 501, 502, 503 may have installed thereon various communication client applications, such as shopping-like applications, web browser applications, search-like applications, instant messaging tools, mailbox clients, social platform software, etc. (by way of example only).
The terminal devices 501, 502, 503 may be various electronic devices having a display screen and supporting web browsing, including but not limited to smart phones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 505 may be a server providing various services, such as a background management server (for example only) providing support for shopping websites browsed by users using the terminal devices 501, 502, 503. The background management server may analyze and process the received data such as the product information query request, and feed back a processing result (for example, target push information and product information — only an example) to the terminal device.
It should be noted that the somatosensory browsing method provided by the embodiment of the present invention is generally executed in the terminal devices 501, 502, and 503, and accordingly, the somatosensory browsing apparatus is generally disposed in the terminal devices 501, 502, and 503.
It should be understood that the number of terminal devices, networks, and servers in fig. 5 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 6, a block diagram of a computer system 600 suitable for use with a terminal device implementing an embodiment of the invention is shown. The terminal device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU)601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read from it can be installed into the storage section 608 as needed.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the system of the present invention when executed by the Central Processing Unit (CPU) 601.
It should be noted that the computer readable medium shown in the present invention can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present invention, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present invention, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present invention may be implemented by software or hardware. The described modules may also be provided in a processor, which may, for example, be described as: a processor comprising an identification module, a mapping module, and a movement module. The names of these modules do not, in some cases, limit the modules themselves.
As another aspect, the present invention also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being incorporated into the apparatus. The computer-readable medium carries one or more programs which, when executed by a device, cause the device to: identify hand features of a human body if the human body is detected within a preset identification range; acquire a motion track of the hand features of the human body and map the motion track to a display screen; and determine the moving direction of the currently displayed image on the display screen according to the motion track of the hand features, so as to move the currently displayed image on the display screen.
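The steps carried by these programs can be sketched as follows. This is a minimal illustration only, not part of the patent disclosure: all function names (`map_to_screen`, `browse_step`), the coordinate conventions, and the sensor/screen sizes are assumptions.

```python
# Illustrative sketch of the somatosensory browsing steps: detect a body,
# map the hand's motion track onto the display screen, and derive the
# direction in which the currently displayed image should move.
# All names and coordinate conventions here are hypothetical.

def map_to_screen(hand_xy, sensor_size, screen_size):
    """Map a hand position from sensor coordinates to screen coordinates."""
    sx = hand_xy[0] / sensor_size[0] * screen_size[0]
    sy = hand_xy[1] / sensor_size[1] * screen_size[1]
    return (sx, sy)

def browse_step(body_detected, hand_track, sensor_size, screen_size):
    """One iteration of the method: returns the screen-space track and the
    image's moving direction, or None when no body is in the range."""
    if not body_detected:
        return None  # no human body within the preset identification range
    screen_track = [map_to_screen(p, sensor_size, screen_size) for p in hand_track]
    # direction is taken from the net displacement of the mapped track
    dx = screen_track[-1][0] - screen_track[0][0]
    dy = screen_track[-1][1] - screen_track[0][1]
    direction = ("right" if dx > 0 else "left", "down" if dy > 0 else "up")
    return {"track": screen_track, "move": (dx, dy), "direction": direction}
```

For example, a hand sweep from (100, 240) to (400, 240) on a 640×480 sensor maps to a 900-pixel rightward movement on a 1920×1080 screen.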
According to the technical solution of the embodiment of the present invention, the moving direction of the currently displayed image on the display screen is determined from the motion track of the hand features of the human body, and the currently displayed image is moved on the display screen accordingly, which solves the technical problems of unsmooth interaction and a poor sense of reality in the experience. The user interacts through a motion-sensing device: the motion track of the hand features of the human body is acquired, the moving direction of the currently displayed image is determined from that track, and the image is moved on the display screen. The interaction mode is therefore novel, smooth, and easy to learn; the user obtains a better and more realistic experience; the panoramic image can be displayed on a large screen; and cost is saved as much as possible.
The above-described embodiments should not be construed as limiting the scope of the invention. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (6)

1. A somatosensory browsing method is characterized by comprising the following steps:
if a human body is detected within a preset identification range, identifying hand features of the human body, wherein the hand features comprise a fist-making feature; if no human body is detected within the preset identification range, displaying an image on a display screen and moving the image at a preset speed in a preset direction;
acquiring a motion track of the hand characteristics of the human body, and mapping the motion track to a display screen;
determining the moving direction of a currently displayed image on a display screen according to the motion trail of the hand features of the human body, so as to move the currently displayed image on the display screen, wherein the image comprises a panoramic image;
if it is detected that the time for which the hand features of the human body are mapped onto a preset area of the display screen exceeds a time threshold, triggering a jump instruction corresponding to the preset area; and/or, if a change of the hand features of the human body mapped onto the preset area of the display screen is detected, triggering a jump instruction corresponding to the preset area.
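The dwell-time trigger in claim 1 can be illustrated with a minimal sketch; the sample format, the region representation, and the names (`in_region`, `dwell_trigger`) are hypothetical, not taken from the disclosure.

```python
# Hypothetical sketch of the jump trigger: once the mapped hand position has
# stayed inside a preset screen region longer than a time threshold, the jump
# instruction associated with that region fires. Names are illustrative only.

def in_region(point, region):
    """region is (x0, y0, x1, y1) in screen coordinates."""
    x, y = point
    x0, y0, x1, y1 = region
    return x0 <= x <= x1 and y0 <= y <= y1

def dwell_trigger(samples, region, time_threshold):
    """samples: time-ordered list of (timestamp, (x, y)) mapped hand positions.
    Returns True as soon as the dwell time in the region exceeds the threshold."""
    entered = None
    for t, p in samples:
        if in_region(p, region):
            if entered is None:
                entered = t            # hand just entered the region
            elif t - entered > time_threshold:
                return True            # dwell exceeded the threshold: trigger
        else:
            entered = None             # hand left the region: reset the timer
    return False
```

Note that leaving the region resets the timer, so only an uninterrupted dwell triggers the jump; the "and/or" branch of the claim (a change of the hand feature, e.g. making a fist) would be a separate trigger condition.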
2. The method of claim 1, wherein determining a moving direction of a currently displayed image on a display screen according to a motion trajectory of a hand feature of the human body comprises:
determining the motion distance and the motion direction of the hand features of the human body according to the motion trail of the hand features of the human body;
and determining the moving distance and the moving direction of the current display image on the display screen according to the moving distance and the moving direction of the hand features of the human body.
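The distance-and-direction mapping of claim 2 can be sketched as follows; the proportional `gain` factor is an assumption for illustration, since the claim does not specify how the hand's motion distance scales to the image's moving distance.

```python
# Hedged sketch of claim 2: the hand's motion track yields a motion distance
# and direction; the image's moving distance is taken proportional to the
# hand's (the gain factor is an assumed tuning parameter, not from the patent).
import math

def motion_vector(track):
    """Motion distance and unit direction vector of a hand motion track."""
    dx = track[-1][0] - track[0][0]
    dy = track[-1][1] - track[0][1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0, (0.0, 0.0)  # no motion: no direction
    return dist, (dx / dist, dy / dist)

def image_movement(track, gain=1.5):
    """Moving distance and direction of the currently displayed image."""
    dist, direction = motion_vector(track)
    return dist * gain, direction
```

For example, a hand track from (0, 0) to (3, 4) has distance 5 and direction (0.6, 0.8); with a gain of 2, the image would move 10 units in that direction.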
3. A somatosensory browsing device, comprising:
the identification module is used for identifying hand features of a human body if the human body is detected within a preset identification range, wherein the hand features comprise a fist-making feature, and for displaying an image on a display screen and moving the image at a preset speed in a preset direction if no human body is detected within the preset identification range;
the mapping module is used for acquiring a motion track of the hand characteristics of the human body and mapping the motion track to a display screen, wherein the image comprises a panoramic image;
the moving module is used for determining the moving direction of the current display image on the display screen according to the motion track of the hand characteristics of the human body, so that the current display image is moved on the display screen;
the triggering module is used for triggering a jump instruction corresponding to a preset area if it is detected that the time for which the hand features of the human body are mapped onto the preset area of the display screen exceeds a time threshold, and/or for triggering a jump instruction corresponding to the preset area if a change of the hand features of the human body mapped onto the preset area of the display screen is detected.
4. The apparatus of claim 3, wherein determining a moving direction of a currently displayed image on a display screen according to the motion trajectory of the hand feature of the human body comprises:
determining the motion distance and the motion direction of the hand features of the human body according to the motion trail of the hand features of the human body;
and determining the moving distance and the moving direction of the current display image on the display screen according to the moving distance and the moving direction of the hand features of the human body.
5. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of claim 1 or 2.
6. A computer-readable medium, on which a computer program is stored, which program, when being executed by a processor, carries out the method of claim 1 or 2.
CN201810661716.4A 2018-06-25 2018-06-25 Somatosensory browsing method and device Active CN108874141B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810661716.4A CN108874141B (en) 2018-06-25 2018-06-25 Somatosensory browsing method and device


Publications (2)

Publication Number Publication Date
CN108874141A CN108874141A (en) 2018-11-23
CN108874141B true CN108874141B (en) 2021-03-30

Family

ID=64295699

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810661716.4A Active CN108874141B (en) 2018-06-25 2018-06-25 Somatosensory browsing method and device

Country Status (1)

Country Link
CN (1) CN108874141B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109828741A (en) * 2019-01-29 2019-05-31 北京字节跳动网络技术有限公司 Method and apparatus for playing audio
CN112578908A (en) * 2020-12-09 2021-03-30 京东数字科技控股股份有限公司 Somatosensory interaction method and device based on advertising player

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102307309A (en) * 2011-07-29 2012-01-04 杭州电子科技大学 Somatosensory interactive broadcasting guide system and method based on free viewpoints
CN103309444A (en) * 2013-03-14 2013-09-18 江南大学 Kinect-based intelligent panoramic display method
CN103777748A (en) * 2012-10-26 2014-05-07 华为技术有限公司 Motion sensing input method and device
CN104636063A (en) * 2015-01-22 2015-05-20 杭州电魂网络科技股份有限公司 Construction method for electronic screen virtual rocker
CN105678693A (en) * 2016-01-25 2016-06-15 成都易瞳科技有限公司 Panorama video browsing-playing method
CN107544673A (en) * 2017-08-25 2018-01-05 上海视智电子科技有限公司 Body feeling interaction method and body feeling interaction system based on depth map information
CN107872561A (en) * 2016-09-27 2018-04-03 中兴通讯股份有限公司 A kind of terminal screen management method and device


Also Published As

Publication number Publication date
CN108874141A (en) 2018-11-23

Similar Documents

Publication Publication Date Title
US10652462B2 (en) Method and system for 360 degree video coverage visualization
CN109905754B (en) Virtual gift receiving method and device and storage equipment
JP6013583B2 (en) Method for emphasizing effective interface elements
JP6072237B2 (en) Fingertip location for gesture input
US9268410B2 (en) Image processing device, image processing method, and program
CN112051961A (en) Virtual interaction method and device, electronic equipment and computer readable storage medium
US9400575B1 (en) Finger detection for element selection
RU2667720C1 (en) Method of imitation modeling and controlling virtual sphere in mobile device
CN114449162B (en) Method, device, computer equipment and storage medium for playing panoramic video
CN108874141B (en) Somatosensory browsing method and device
KR20220110493A (en) Method and apparatus for displaying objects in video, electronic devices and computer readable storage media
KR20220093091A (en) Labeling method and apparatus, electronic device and storage medium
CN108829329B (en) Operation object display method and device and readable medium
KR102462054B1 (en) Method and device for implementing user interface of live auction
JP2022551671A (en) OBJECT DISPLAY METHOD, APPARATUS, ELECTRONIC DEVICE, AND COMPUTER-READABLE STORAGE MEDIUM
CN110263743B (en) Method and device for recognizing images
CN109472873B (en) Three-dimensional model generation method, device and hardware device
CN103547982A (en) Identifying contacts and contact attributes in touch sensor data using spatial and temporal features
CN110765326A (en) Recommendation method, device, equipment and computer readable storage medium
CN116880726B (en) Icon interaction method and device for 3D space, electronic equipment and medium
CN113112613B (en) Model display method and device, electronic equipment and storage medium
CN113722644B (en) Method and device for selecting browsing point positions in virtual space based on external equipment
CN110955787B (en) User head portrait setting method, computer equipment and computer readable storage medium
CN112578908A (en) Somatosensory interaction method and device based on advertising player
CN110070600B (en) Three-dimensional model generation method, device and hardware device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 101111 Room 221, 2nd Floor, Block C, 18 Kechuang 11th Street, Beijing Economic and Technological Development Zone

Applicant after: JINGDONG DIGITAL TECHNOLOGY HOLDINGS Co.,Ltd.

Address before: 101111 Room 221, 2nd Floor, Block C, 18 Kechuang 11th Street, Beijing Economic and Technological Development Zone

Applicant before: BEIJING JINGDONG FINANCIAL TECHNOLOGY HOLDING Co.,Ltd.

GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 101111 Room 221, 2nd Floor, Block C, 18 Kechuang 11th Street, Beijing Economic and Technological Development Zone

Patentee after: Jingdong Technology Holding Co.,Ltd.

Address before: 101111 Room 221, 2nd Floor, Block C, 18 Kechuang 11th Street, Beijing Economic and Technological Development Zone

Patentee before: Jingdong Digital Technology Holding Co.,Ltd.

Address after: 101111 Room 221, 2nd Floor, Block C, 18 Kechuang 11th Street, Beijing Economic and Technological Development Zone

Patentee after: Jingdong Digital Technology Holding Co.,Ltd.

Address before: 101111 Room 221, 2nd Floor, Block C, 18 Kechuang 11th Street, Beijing Economic and Technological Development Zone

Patentee before: JINGDONG DIGITAL TECHNOLOGY HOLDINGS Co.,Ltd.
