CN112346630A - State determination method, device, equipment and computer readable medium - Google Patents

State determination method, device, equipment and computer readable medium Download PDF

Info

Publication number
CN112346630A
Authority
CN
China
Prior art keywords
target
image
displayed
turning
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011161541.4A
Other languages
Chinese (zh)
Other versions
CN112346630B (en)
Inventor
区锦雄
郭冠军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Youzhuju Network Technology Co Ltd
Original Assignee
Beijing Youzhuju Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Youzhuju Network Technology Co Ltd filed Critical Beijing Youzhuju Network Technology Co Ltd
Priority to CN202011161541.4A priority Critical patent/CN112346630B/en
Publication of CN112346630A publication Critical patent/CN112346630A/en
Application granted granted Critical
Publication of CN112346630B publication Critical patent/CN112346630B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present disclosure disclose a state determination method, apparatus, electronic device, and computer-readable medium. One embodiment of the method comprises: acquiring a target image on which a first number of target devices are displayed, where each of the first number of target devices includes a turning device on which a turning mark representing the turning state of the turning device is displayed, the turning state being one of: flipped, not flipped, and intermediate; and identifying the first number of turning marks displayed in the target image to obtain an identification result that characterizes the turning state of each target device. This embodiment determines the turning state of the turning device in each target device by identifying the turning mark. The identified device state is more accurate, so the subsequent interactive command determined from the identified state is also more accurate, and the interactive effect is better.

Description

State determination method, device, equipment and computer readable medium
Technical Field
Embodiments of the present disclosure relate to the field of computer technologies, and in particular, to a state determination method, apparatus, device, and computer-readable medium.
Background
Computer vision recognition technology has now matured and is beginning to be applied in the fields of education and gaming. Such applications use the camera of a terminal device, such as a mobile phone or tablet computer, together with recognition algorithms, so that a physical device to be recognized can interact with an on-screen interface. However, in the related art the recognition accuracy for the device to be recognized is not high, so the interaction effect is poor.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a state determination method, apparatus, device and computer readable medium to solve the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a state determination method, the method comprising: acquiring a target image on which a first number of target devices are displayed, wherein each target device in the first number of target devices comprises a turning device, a turning mark representing a turning state of the turning device is displayed on the turning device, and the turning state comprises: flipped, not flipped, and intermediate states; and identifying the first number of turning marks displayed in the target image to obtain an identification result, wherein the identification result is used for representing the turning state of each target device.
In a second aspect, some embodiments of the present disclosure provide a state determination device, the device comprising: an obtaining unit configured to obtain a target image on which a first number of target devices are displayed, each of the first number of target devices including a flipping unit on which a flipping flag representing a flipping state of the flipping unit is displayed, the flipping state including: flipped, not flipped, and intermediate states; and the identification unit is configured to identify the first number of turning marks displayed in the target image to obtain an identification result, and the identification result is used for representing the turning state of each target device.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method as described in any of the implementations of the first aspect.
In a fourth aspect, some embodiments of the disclosure provide a computer readable medium having a computer program stored thereon, where the program when executed by a processor implements a method as described in any of the implementations of the first aspect.
One of the above-described embodiments of the present disclosure has the following beneficial effect: the identified state of the device is more accurate, so the subsequent interactive command determined according to the identified state is more accurate, and the interactive effect is better. Specifically, the inventors found that the recognition accuracy in the related art is low because the visible attributes of the device to be recognized do not change noticeably during interaction. Based on this, the present scheme provides a state determination method that recognizes a device equipped with a turning device. The turning device displays a turning mark representing its turning state, so the turning state of the turning device in the target device can be determined by recognizing that mark. Because the turning mark changes markedly before and after the turning device is turned over, the device state identified by this scheme is more accurate.
Drawings
The above and other features, advantages, and aspects of various embodiments of the present disclosure will become more apparent from the following detailed description taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements are not necessarily drawn to scale.
FIG. 1 is a schematic diagram of one application scenario of a state determination method of some embodiments of the present disclosure;
FIG. 2 is a flow diagram of some embodiments of a state determination method according to the present disclosure;
FIG. 3 is a flow diagram of further embodiments of a state determination method according to the present disclosure;
FIG. 4 is a schematic block diagram of some embodiments of a state determination device according to the present disclosure;
FIG. 5 is a schematic structural diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure;
FIG. 6 is a schematic block diagram of a target device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" in this disclosure are intended to be illustrative rather than limiting; those skilled in the art will appreciate that they should be read as "one or more" unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 shows a schematic diagram of one application scenario in which the state determination method of some embodiments of the present disclosure may be applied.
In the application scenario shown in fig. 1, the computing device 101 may first acquire a target image 102 on which a first number of target devices are displayed. Each target device of the first number of target devices includes a turning device, on which a turning mark representing the turning state of the turning device is displayed; the turning state is one of: flipped, not flipped, and intermediate. In the present application scenario, three target devices are displayed in the target image 102: device A, device B, and device C. Thereafter, the computing device 101 identifies the first number of turning marks displayed in the target image and obtains an identification result 103. In this application scenario, the turning mark is a color mark on the turning device; device A is in the not-flipped state, device B is flipped, and device C is in an intermediate state.
The computing device 101 may be hardware or software. When the computing device is hardware, it may be implemented as a distributed cluster composed of a plurality of servers or electronic devices, or may be implemented as a single server or a single electronic device. When the computing device is embodied as software, it may be implemented as multiple pieces of software or software modules, for example, to provide distributed services, or as a single piece of software or software module. And is not particularly limited herein.
It should be understood that the number of computing devices 101 in FIG. 1 is merely illustrative. There may be any number of computing devices 101, as desired for implementation.
With continued reference to fig. 2, a flow 200 of some embodiments of a state determination method according to the present disclosure is shown. The state determination method comprises the following steps:
step 201, acquiring a target image on which a first number of target devices are displayed, where each target device in the first number of target devices includes a turning device, where a turning mark representing a turning state of the turning device is displayed on the turning device, and the turning state includes: flipped, not flipped, and intermediate states.
In some embodiments, the turning device may be a spherical device supported on a groove by a support device. The support device may give the spherical device rotational freedom in any direction; that is, the spherical device can be rotated about any axis.
In some embodiments, the subject performing the state determination method (e.g., the computing device shown in fig. 1) may obtain the target image through a wired or wireless connection. It should be noted that the wireless connection means may include, but is not limited to, a 3G/4G connection, a WiFi connection, a Bluetooth connection, a WiMAX connection, a ZigBee connection, a UWB (Ultra Wideband) connection, and other wireless connection means now known or developed in the future.
In some embodiments, the first number may be any positive integer according to practical requirements.
In some embodiments, the turning mark may be any mark that can indicate the turning state of the turning device. For example, it may be a color mark or a text mark. Specifically, the two surfaces displayed before and after the turning device is turned 180 degrees may be marked with different colors.
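As a concrete illustration of the color-mark idea, a hedged sketch (not the patent's implementation): if the face shown before flipping is marked red and the face shown after flipping is marked blue, the turning state of one device can be estimated from which mark color dominates the cropped mark region. The colors, the 60% dominance threshold, and the function name below are all assumptions.

```python
def classify_by_color(pixels, front=(255, 0, 0), back=(0, 0, 255),
                      dominance=0.6):
    """Estimate a turning state from a list of (r, g, b) pixels.

    pixels: pixels cropped from the turning-mark region of one device.
    A face counts as dominant when its mark color wins more than
    `dominance` of the pixels; otherwise the device is treated as mid-flip.
    The colors and threshold are illustrative assumptions.
    """
    def dist2(a, b):
        # Squared Euclidean distance in RGB space.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    pixels = list(pixels)
    # Each pixel votes for whichever mark color it is closer to.
    front_votes = sum(1 for p in pixels if dist2(p, front) < dist2(p, back))
    ratio = front_votes / len(pixels)
    if ratio >= dominance:
        return "not_flipped"
    if ratio <= 1.0 - dominance:
        return "flipped"
    return "intermediate"
```

A real system would run this on pixels cropped by the positioning device described below, after color balancing.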
In some embodiments, the not-flipped state may be the original state of the turning device. The flipped state may be the state after the turning device is turned 180 degrees from the original state. The intermediate state may be any state other than the flipped state and the not-flipped state.
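The three-state definition above can also be formalized in terms of a flip angle, where 0 degrees is the original orientation and 180 degrees is fully flipped. The sketch below is hypothetical; the 15-degree tolerance band is an assumption, not something the disclosure specifies.

```python
# Hypothetical mapping from a flip angle (degrees) to the three turning
# states described above. The tolerance is an assumed design parameter.
NOT_FLIPPED = "not_flipped"
FLIPPED = "flipped"
INTERMEDIATE = "intermediate"

def flip_state(angle_deg, tolerance=15.0):
    """Classify a rotation angle into a turning state.

    0 degrees is the original orientation, 180 degrees is fully flipped;
    anything outside the tolerance band around either pole is intermediate.
    """
    angle = angle_deg % 360.0
    # Fold angles above 180 back, so 350 degrees is 10 degrees from upright.
    if angle > 180.0:
        angle = 360.0 - angle
    if angle <= tolerance:
        return NOT_FLIPPED
    if angle >= 180.0 - tolerance:
        return FLIPPED
    return INTERMEDIATE
```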
Step 202, identifying the first number of turning marks displayed in the target image to obtain an identification result, where the identification result is used to represent a turning state of each target device.
In some embodiments, the execution subject may recognize the first number of turning marks displayed in the target image using an image recognition network. By way of example, the image recognition network described above may include, but is not limited to, VGGNet (Visual Geometry Group Network), ResNet (Residual Neural Network), and DenseNet (Densely Connected Convolutional Network).
In some optional implementations of some embodiments, the target device further includes a positioning device configured to determine an image area in the target image where the flip mark is displayed. As an example, the positioning means may be a ring-shaped structure surrounding the flipping means on the corresponding target device.
With further reference to fig. 6, fig. 6 shows a schematic diagram 600 of a target device in the present scenario. In the figure, three target devices 601, 602, 603 are shown. And, the content indicated by reference numeral 604 is a flipping means in the target apparatus 601. Reference numeral 605 denotes a positioning means in the target apparatus 601.
On this basis, the execution body may first determine the first number of image areas based on the first number of positioning devices displayed in the target image. And then, extracting the first number of image areas from the target image to obtain a first number of images. And finally, inputting the first number of images into a pre-trained image recognition network to obtain the recognition result.
As an example, the execution subject may determine the first number of positioning devices displayed in the target image by an object detection technique, obtaining a first number of positioning frames and, from them, the first number of image areas.
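The detect-then-crop step can be sketched as follows, modeling the image as a row-major list of pixel rows and each positioning frame as an (x, y, width, height) box. This is a minimal sketch under those assumptions; a real pipeline would crop an array from an object detector's output and feed each crop to the recognition network.

```python
def crop_regions(image, boxes):
    """Extract one sub-image per positioning frame.

    image: row-major list of pixel rows (image[y][x]).
    boxes: iterable of (x, y, width, height) positioning frames.
    Returns the list of cropped sub-images, in the same order as `boxes`.
    """
    crops = []
    for x, y, w, h in boxes:
        # Slice out the h rows covered by the box, then w pixels per row.
        crops.append([row[x:x + w] for row in image[y:y + h]])
    return crops
```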
In some optional implementations of some embodiments, the target device displays a distinguishing mark for distinguishing different target devices, and the target device further includes a positioning device for determining an image area in the target image where the flipping mark and the distinguishing mark are displayed. On this basis, the execution body may first determine the first number of image areas based on the first number of positioning devices displayed in the target image. And then, the distinguishing mark and the turning mark displayed in each image area in the first number of image areas are distinguished to obtain the identification result.
Some embodiments of the present disclosure provide methods for determining a flipping state of a flipping unit in a target device by identifying a flipping mark. The state of the identified equipment is more accurate, and the subsequent interactive command determined according to the identified state of the equipment is further more accurate, so that the interactive effect is better.
With further reference to fig. 3, a flow 300 of further embodiments of a state determination method is illustrated. The process 300 of the state determination method includes the following steps:
step 301, acquiring a target image on which a first number of target devices are displayed, where each target device in the first number of target devices includes a flipping unit, where a flipping mark representing a flipping state of the flipping unit is displayed on the flipping unit, and the flipping state includes: flipped, not flipped, and intermediate states.
In some embodiments, the specific implementation of step 301 and the technical effect thereof may refer to step 201 in the embodiment corresponding to fig. 2, and are not described herein again.
Step 302, the target device displays a distinguishing mark for distinguishing different target devices, the target device further includes a positioning device for determining an image area in the target image, where the flipping mark and the distinguishing mark are displayed, and the first number of image areas are determined based on the first number of positioning devices displayed in the target image.
In some embodiments, the executing entity may determine the first number of positioning devices displayed in the target image by using an object detection technique, obtain a first number of positioning frames, and further obtain the first number of image areas.
In some embodiments, the executing body may also determine the first number of positioning devices displayed in the target image by an object detection technique to obtain a first number of positioning frames, then enlarge the first number of positioning frames according to a preset ratio to obtain enlarged positioning frames, and thereby obtain the first number of image areas.
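The frame-enlarging step can be sketched as scaling each box about its center by the preset ratio and clamping the result to the image bounds. The clamping behavior is an assumption, since the disclosure does not say how frames near the image border are handled.

```python
def enlarge_box(box, ratio, img_w, img_h):
    """Scale an (x, y, w, h) box about its center by `ratio`.

    The enlarged box is clamped so it stays inside an img_w x img_h image
    (an assumed policy for frames near the border).
    """
    x, y, w, h = box
    cx, cy = x + w / 2.0, y + h / 2.0
    nw, nh = w * ratio, h * ratio
    # Re-center, then clamp the origin and extent to the image bounds.
    nx = max(0.0, cx - nw / 2.0)
    ny = max(0.0, cy - nh / 2.0)
    nw = min(nw, img_w - nx)
    nh = min(nh, img_h - ny)
    return (nx, ny, nw, nh)
```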
Step 303, identifying the distinguishing mark and the turning mark displayed in each image area of the first number of image areas to obtain the identification result.
In some embodiments, the executing body may recognize the distinguishing mark and the turning mark displayed in each of the first number of image areas using image recognition software or an online image processing tool to obtain the recognition result.
In some embodiments, the execution subject may also identify the distinguishing mark and the turning mark displayed in each of the first number of image regions using an image recognition network to obtain the identification result. By way of example, the image recognition network described above may include, but is not limited to: FCN (Fully Convolutional Network), SegNet (a semantic segmentation network), DeepLab (a semantic segmentation network), PSPNet (Pyramid Scene Parsing Network), and Mask R-CNN (Mask Region-based CNN, an instance segmentation network).
And step 304, determining an interactive command corresponding to the identification result.
In some embodiments, the interactive commands described above may be any number and any type of commands. For example, the function of the interactive command may be to play an audio, show an animation, jump to a web page, etc.
In some embodiments, the correspondence between the recognition result and the interactive command may be predetermined. On this basis, the execution subject may determine the interactive command corresponding to the recognition result through a preset correspondence relationship.
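A minimal sketch of this preset correspondence, assuming the recognition result maps each device to its turning state and the table keys (device, state) pairs to commands; the table contents, command names, and default are illustrative only.

```python
# Hypothetical preset correspondence table for step 304: each
# (device id, turning state) pair maps to an interactive command.
COMMANDS = {
    ("A", "flipped"): "play_audio",
    ("B", "flipped"): "show_animation",
    ("C", "intermediate"): "ignore",
}

def commands_for(recognition_result, table=COMMANDS, default="no_op"):
    """Look up the interactive command for each recognized device.

    recognition_result: dict mapping device id -> turning state.
    Pairs absent from the table fall back to an assumed default command.
    """
    return {device: table.get((device, state), default)
            for device, state in recognition_result.items()}
```

In practice the returned commands would then be dispatched by the execution step (step 305), e.g. playing audio or jumping to a web page.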
Step 305, executing the above-mentioned interactive command.
As can be seen from fig. 3, compared with the embodiments corresponding to fig. 2, the scheme described in the process 300 of the state determination method determines the first number of image areas displaying the turning mark and the distinguishing mark based on the positioning frames, which makes the localization of the turning mark, and therefore the identification result, more accurate. By introducing the distinguishing mark, the correspondence between image areas and target devices can be determined accurately even when at least two target devices are present, ensuring the reliability of the identification result. In addition, determining the interactive command from the identification result and then executing it improves the efficiency and quality of the interactive task.
With further reference to fig. 4, as an implementation of the methods illustrated in the above figures, the present disclosure provides some embodiments of a state determination apparatus, which correspond to those method embodiments illustrated in fig. 2, and which may be particularly applicable in various electronic devices.
As shown in fig. 4, the state determination device 400 of some embodiments includes: an acquisition unit 401 and a recognition unit 402. The acquiring unit 401 is configured to acquire a target image on which a first number of target devices are displayed, where each target device in the first number of target devices includes a flipping unit, and a flipping mark representing a flipping state of the flipping unit is displayed on the flipping unit, where the flipping state includes: flipped, not flipped, and intermediate states; an identifying unit 402, configured to identify the first number of turning marks displayed in the target image, and obtain an identification result, where the identification result is used to characterize a turning state of each target device.
In an optional implementation of some embodiments, the apparatus 400 further comprises: a determining unit configured to determine an interactive command corresponding to the recognition result; and the execution unit is configured to execute the interactive command.
In an optional implementation manner of some embodiments, the target device further includes a positioning device for determining an image area in the target image, where the flip mark is displayed; and the identifying unit 402 is further configured to: determining the first number of image regions based on the first number of positioning devices displayed in the target image; and carrying out image recognition on the first number of image areas to obtain the recognition result.
In an optional implementation of some embodiments, the identifying unit 402 is further configured to: determining the first number of image regions based on the first number of positioning devices displayed in the target image; extracting the first number of image areas from the target image to obtain a first number of images; and inputting the first number of images into a pre-trained image recognition network to obtain the recognition result.
In an optional implementation manner of some embodiments, the target device displays a distinguishing mark for distinguishing different target devices, and the target device further includes a positioning device for determining an image area in the target image, where the flipping mark and the distinguishing mark are displayed; and the identifying unit 402 is further configured to: determining the first number of image regions based on the first number of positioning devices displayed in the target image; and identifying the distinguishing mark and the turning mark displayed in each image area in the first number of image areas to obtain the identification result.
It will be understood that the elements described in the apparatus 400 correspond to various steps in the method described with reference to fig. 2. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 400 and the units included therein, and will not be described herein again.
Referring now to FIG. 5, a block diagram of an electronic device (e.g., the computing device of FIG. 1) 500 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device in some embodiments of the present disclosure may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), a vehicle-mounted terminal (e.g., a car navigation terminal), and the like, and a stationary terminal such as a digital TV, a desktop computer, and the like. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, electronic device 500 may include a processing means (e.g., central processing unit, graphics processor, etc.) 501 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data necessary for the operation of the electronic apparatus 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage devices 508 including, for example, magnetic tape, hard disk, etc.; and a communication device 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device 500 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 5 may represent one device or may represent multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program, when executed by the processing device 501, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. 
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by any form or medium of digital data communication (e.g., a communications network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring a target image on which a first number of target devices are displayed, wherein each target device in the first number of target devices comprises a turning device, a turning mark representing a turning state of the turning device is displayed on the turning device, and the turning state comprises: flipped, not flipped, and intermediate states; and identifying the first number of turning marks displayed in the target image to obtain an identification result, wherein the identification result is used for representing the turning state of each target device.
Computer program code for carrying out operations of embodiments of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the remote-computer case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented in software or in hardware. The described units may also be provided in a processor, which may be described as: a processor comprising an acquisition unit and an identification unit. The names of these units do not, in some cases, limit the units themselves; for example, the acquisition unit may also be described as a "unit that acquires a target image".
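As an illustration only, the processor-with-units structure described above might be sketched in Python as follows; the class names, attribute names, and callables are hypothetical and not part of the disclosure:

```python
class AcquisitionUnit:
    """Unit that acquires a target image from some image source."""

    def __init__(self, source):
        self.source = source  # e.g. a camera-read callable

    def acquire(self):
        return self.source()


class IdentificationUnit:
    """Unit that identifies the flip marks displayed in a target image."""

    def __init__(self, recognizer):
        self.recognizer = recognizer  # callable: image -> recognition result

    def identify(self, image):
        return self.recognizer(image)


class StateDeterminationProcessor:
    """A processor comprising an acquisition unit and an identification unit."""

    def __init__(self, acquisition_unit, identification_unit):
        self.acquisition_unit = acquisition_unit
        self.identification_unit = identification_unit

    def determine_state(self):
        # Acquire the target image, then identify the flip marks in it.
        image = self.acquisition_unit.acquire()
        return self.identification_unit.identify(image)
```

Renaming `AcquisitionUnit` to, say, "unit that acquires a target image" would change nothing functionally, which is the point the paragraph above makes.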
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
According to one or more embodiments of the present disclosure, there is provided a state determination method including: acquiring a target image on which a first number of target devices are displayed, wherein each target device in the first number of target devices comprises a turning device, a turning mark representing a turning state of the turning device is displayed on the turning device, and the turning state comprises: flipped, not flipped, and intermediate states; and identifying the first number of turning marks displayed in the target image to obtain an identification result, wherein the identification result is used for representing the turning state of each target device.
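For concreteness, the two steps of the method (acquire the target image, then recognize the flip marks) could be sketched as below; `recognize_mark` is a hypothetical per-device recognizer standing in for whatever detector an implementation uses, not an API defined by the disclosure:

```python
from enum import Enum


class FlipState(Enum):
    """The three flip states named in the disclosure."""
    FLIPPED = "flipped"
    NOT_FLIPPED = "not_flipped"
    INTERMEDIATE = "intermediate"


def determine_states(target_image, first_number, recognize_mark):
    """Identify the flip mark of each of the first number of target devices.

    `recognize_mark(image, index)` is an assumed callable returning one
    FlipState; the returned list is the recognition result, one state
    per target device displayed in the image.
    """
    return [recognize_mark(target_image, i) for i in range(first_number)]
```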
In accordance with one or more embodiments of the present disclosure, a method further comprises: determining an interactive command corresponding to the recognition result; and executing the interactive command.
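One way to realize the "determine an interactive command, then execute it" step is a simple lookup table keyed by the recognition result; the table contents and the `dispatch` callable below are purely illustrative assumptions:

```python
# Hypothetical mapping from a recognition result (here, the flip states
# of two devices) to an interactive command name.
COMMAND_TABLE = {
    ("flipped", "flipped"): "confirm",
    ("flipped", "not_flipped"): "page_down",
    ("not_flipped", "not_flipped"): "cancel",
}


def execute_interactive_command(recognition_result, dispatch,
                                command_table=COMMAND_TABLE):
    """Determine the interactive command for a recognition result and run it.

    `dispatch` is whatever executes a command in the host application;
    results with no table entry map to no command at all.
    """
    command = command_table.get(tuple(recognition_result))
    if command is not None:
        dispatch(command)
    return command
```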
According to one or more embodiments of the present disclosure, the target device further includes a positioning device for determining an image area in the target image where the flip mark is displayed; and the identifying of the first number of flip marks displayed in the target image to obtain a recognition result includes: determining the first number of image regions based on the first number of positioning devices displayed in the target image; and performing image recognition on the first number of image regions to obtain the recognition result.
According to one or more embodiments of the present disclosure, performing image recognition on the first number of image regions to obtain the recognition result includes: extracting the first number of image areas from the target image to obtain a first number of images; and inputting the first number of images into a pre-trained image recognition network to obtain the recognition result.
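The "extract the image areas, then feed them to a pre-trained recognition network" embodiment can be sketched with plain list-of-rows images; the (top, left, height, width) region format and the `network` callable are assumptions standing in for a real positioning device and a real model:

```python
def extract_regions(target_image, regions):
    """Crop each located image area out of the target image.

    `target_image` is a row-major list of pixel rows; each region is a
    (top, left, height, width) tuple such as a positioning device
    might yield after localization.
    """
    crops = []
    for top, left, height, width in regions:
        crops.append([row[left:left + width]
                      for row in target_image[top:top + height]])
    return crops


def recognize_states(crops, network):
    """Run a pre-trained recognition network over each cropped area.

    `network` is any callable mapping one crop to a flip-state label;
    in practice this would be a trained classifier's inference call.
    """
    return [network(crop) for crop in crops]
```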
According to one or more embodiments of the present disclosure, the target device displays a distinguishing mark for distinguishing different target devices, and further includes a positioning device for determining an image area in the target image where the flip mark and the distinguishing mark are displayed; and the identifying of the first number of flip marks displayed in the target image to obtain a recognition result includes: determining the first number of image regions based on the first number of positioning devices displayed in the target image; and identifying the distinguishing mark and the flip mark displayed in each of the first number of image areas to obtain the recognition result.
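When each device also carries a distinguishing mark, per-region recognition naturally yields a mapping from device identity to flip state; `decode_marks` below is a hypothetical decoder for the two marks, introduced only for illustration:

```python
def recognize_marked_regions(regions, decode_marks):
    """Recognize the distinguishing mark and the flip mark in each image area.

    `decode_marks(region)` is assumed to return a (device_id, flip_state)
    pair; the recognition result then reports the flip state of every
    target device, keyed by its distinguishing mark.
    """
    recognition_result = {}
    for region in regions:
        device_id, flip_state = decode_marks(region)
        recognition_result[device_id] = flip_state
    return recognition_result
```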
According to one or more embodiments of the present disclosure, there is provided a state determination device including: an obtaining unit configured to obtain a target image on which a first number of target devices are displayed, each of the first number of target devices including a flipping unit on which a flipping mark representing a flipping state of the flipping unit is displayed, the flipping state including: flipped, not flipped, and intermediate states; and an identification unit configured to identify the first number of flipping marks displayed in the target image to obtain a recognition result, the recognition result being used to represent the flipping state of each target device.
According to one or more embodiments of the present disclosure, an apparatus further comprises: a determining unit configured to determine an interactive command corresponding to the recognition result; and the execution unit is configured to execute the interactive command.
According to one or more embodiments of the present disclosure, the target device further includes a positioning device for determining an image area in the target image where the flip mark is displayed; and the identification unit is further configured to: determine the first number of image regions based on the first number of positioning devices displayed in the target image; and perform image recognition on the first number of image regions to obtain the recognition result.
According to one or more embodiments of the present disclosure, the above-mentioned identification unit is further configured to: determining the first number of image regions based on the first number of positioning devices displayed in the target image; extracting the first number of image areas from the target image to obtain a first number of images; and inputting the first number of images into a pre-trained image recognition network to obtain the recognition result.
According to one or more embodiments of the present disclosure, the target device displays a distinguishing mark for distinguishing different target devices, and further includes a positioning device for determining an image area in the target image where the flip mark and the distinguishing mark are displayed; and the identification unit is further configured to: determine the first number of image regions based on the first number of positioning devices displayed in the target image; and identify the distinguishing mark and the flip mark displayed in each of the first number of image areas to obtain the recognition result.
According to one or more embodiments of the present disclosure, there is provided an electronic device including: one or more processors; and a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method as described in any of the above embodiments.
According to one or more embodiments of the present disclosure, there is provided a computer-readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method as described in any of the above embodiments.
The foregoing description presents only preferred embodiments of the present disclosure and illustrates the principles of the technology employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to technical solutions formed by the specific combinations of features described above, but also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the inventive concept, for example, a technical solution formed by substituting the features disclosed above with (but not limited to) technical features having similar functions.

Claims (10)

1. A method of state determination, comprising:
acquiring a target image on which a first number of target devices are displayed, wherein each target device in the first number of target devices comprises a turning device, a turning mark representing a turning state of the turning device is displayed on the turning device, and the turning state comprises: flipped, not flipped, and intermediate states;
and identifying the first number of turning marks displayed in the target image to obtain an identification result, wherein the identification result is used for representing the turning state of each target device.
2. The method of claim 1, wherein the method further comprises:
determining an interactive command corresponding to the recognition result;
and executing the interactive command.
3. The method of claim 1, wherein the target device further comprises a positioning means for determining an image area in the target image where the flip mark is displayed; and
the identifying the first number of flip marks displayed in the target image to obtain an identification result includes:
determining the first number of image regions based on the first number of positioning devices displayed in the target image;
and carrying out image recognition on the first number of image areas to obtain the recognition result.
4. The method of claim 3, wherein the performing image recognition on the first number of image regions to obtain the recognition result comprises:
extracting the first number of image areas from the target image to obtain a first number of images;
and inputting the first number of images into a pre-trained image recognition network to obtain the recognition result.
5. The method according to claim 1, wherein the target device displays a distinguishing mark for distinguishing different target devices, the target device further comprising a positioning means for determining an image area in the target image where the flip mark and the distinguishing mark are displayed; and
the identifying the first number of flip marks displayed in the target image to obtain an identification result includes:
determining the first number of image regions based on the first number of positioning devices displayed in the target image;
and identifying the distinguishing mark and the turning mark displayed in each image area in the first number of image areas to obtain the identification result.
6. A state determination device, comprising:
an obtaining unit configured to obtain a target image on which a first number of target devices are displayed, each of the first number of target devices including a flipping unit on which a flipping mark representing a flipping state of the flipping unit is displayed, the flipping state including: flipped, not flipped, and intermediate states;
the identification unit is configured to identify the first number of turning marks displayed in the target image to obtain an identification result, and the identification result is used for representing the turning state of each target device.
7. The apparatus of claim 6, wherein the apparatus further comprises:
a determination unit configured to determine an interactive command corresponding to the recognition result;
an execution unit configured to execute the interactive command.
8. The apparatus of claim 7, wherein the target device further comprises a positioning device for determining an image area in the target image where the flip mark is displayed; and
the identification unit is further configured to:
determining the first number of image regions based on the first number of positioning devices displayed in the target image;
and carrying out image recognition on the first number of image areas to obtain the recognition result.
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon which,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-5.
10. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-5.
CN202011161541.4A 2020-10-27 2020-10-27 State determination method, device, equipment and computer readable medium Active CN112346630B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011161541.4A CN112346630B (en) 2020-10-27 2020-10-27 State determination method, device, equipment and computer readable medium

Publications (2)

Publication Number Publication Date
CN112346630A true CN112346630A (en) 2021-02-09
CN112346630B CN112346630B (en) 2022-09-27

Family

ID=74359071

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011161541.4A Active CN112346630B (en) 2020-10-27 2020-10-27 State determination method, device, equipment and computer readable medium

Country Status (1)

Country Link
CN (1) CN112346630B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103569641A (en) * 2012-07-23 2014-02-12 南车青岛四方机车车辆股份有限公司 Multifunctional overturning device and method
CN105760106A (en) * 2016-03-08 2016-07-13 网易(杭州)网络有限公司 Interaction method and interaction device of intelligent household equipment
CN206229705U (en) * 2016-10-26 2017-06-09 福州外语外贸学院 Automatic distribution of goods device and automatic distribution of goods system
CN107710135A (en) * 2015-03-08 2018-02-16 苹果公司 Use the user interface of rotatable input mechanism
TW201837789A (en) * 2017-04-10 2018-10-16 創實雲端科技有限公司 Leaving or stopping recognition system being applied to various application environments to determine the flow rate, the situation of stopping and the like of the object
CN109697478A (en) * 2017-10-20 2019-04-30 山东新北洋信息技术股份有限公司 A kind of cargo identity information acquisition device and its control method
EP3387410A4 (en) * 2015-12-08 2019-12-04 The American University in Cairo Shear enhanced rolling (SER): a method to improve grain size uniformity in rolled alloy billets

Also Published As

Publication number Publication date
CN112346630B (en) 2022-09-27

Similar Documents

Publication Publication Date Title
CN110809189B (en) Video playing method and device, electronic equipment and computer readable medium
CN111399956A (en) Content display method and device applied to display equipment and electronic equipment
CN111784712B (en) Image processing method, device, equipment and computer readable medium
CN111399729A (en) Image drawing method and device, readable medium and electronic equipment
CN111190520A (en) Menu item selection method and device, readable medium and electronic equipment
CN111459364B (en) Icon updating method and device and electronic equipment
CN111596991A (en) Interactive operation execution method and device and electronic equipment
CN112257582A (en) Foot posture determination method, device, equipment and computer readable medium
CN112418249A (en) Mask image generation method and device, electronic equipment and computer readable medium
CN111461968A (en) Picture processing method and device, electronic equipment and computer readable medium
CN110851032A (en) Display style adjustment method and device for target device
CN111461965B (en) Picture processing method and device, electronic equipment and computer readable medium
CN113191257A (en) Order of strokes detection method and device and electronic equipment
CN112101258A (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN112258622A (en) Image processing method, image processing device, readable medium and electronic equipment
CN113628097A (en) Image special effect configuration method, image recognition method, image special effect configuration device and electronic equipment
CN112000251A (en) Method, apparatus, electronic device and computer readable medium for playing video
CN111756953A (en) Video processing method, device, equipment and computer readable medium
US20230281983A1 (en) Image recognition method and apparatus, electronic device, and computer-readable medium
CN112346630B (en) State determination method, device, equipment and computer readable medium
CN112764629B (en) Augmented reality interface display method, device, equipment and computer readable medium
CN111586295B (en) Image generation method and device and electronic equipment
CN114004229A (en) Text recognition method and device, readable medium and electronic equipment
CN112488947A (en) Model training and image processing method, device, equipment and computer readable medium
CN112418233A (en) Image processing method, image processing device, readable medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant