CN114489315B - Image processing system and image processing apparatus

Image processing system and image processing apparatus

Info

Publication number: CN114489315B
Application number: CN202011261381.0A
Authority: CN (China)
Prior art keywords: image information; image processing; dimensional; project; electronic device
Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN114489315A
Inventors: 杨钧翔; 丁志宏; 张凯杰; 林大凯; 李昱贤
Current and original assignee: Mirage Start Co ltd (the listed assignees may be inaccurate)
Application filed by Mirage Start Co ltd
Priority and filing date: 2020-11-12 (priority to CN202011261381.0A)
Publication of CN114489315A: 2022-05-13
Publication of CN114489315B (grant): 2023-12-19


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on GUIs, based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F3/16: Sound input; sound output
    • G06F3/162: Interface to dedicated audio devices, e.g. audio drivers, interface to CODECs
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Abstract

The invention discloses an image processing system and an image processing apparatus. The image processing system comprises an electronic device and an image processing device connected to the electronic device. The image processing device displays floating three-dimensional input device image information, interacts with an object through that three-dimensional input device image information to generate a plurality of control signals, and transmits the control signals to the electronic device.

Description

Image processing system and image processing apparatus
Technical Field
The present invention relates to an image processing system and an image processing apparatus, and more particularly, to an image processing system and an image processing apparatus that display floating three-dimensional image information for direct interaction with an object.
Background
Current image processing systems allow a design result to be judged only on a flat display device. If the processed image information is a product model or even an interior design, the model can be rotated to various viewing angles on the screen, but this is still less intuitive than displaying the image information directly before the user's eyes.
Accordingly, providing an image processing system capable of intuitively processing and displaying image information has become an important issue for the industry to solve.
Disclosure of Invention
The technical problem to be solved by the present invention is to provide an image processing system, which is characterized by comprising: an electronic device; and an image processing device connected with the electronic device; the image processing device displays floating three-dimensional input device image information, is used for generating a plurality of control signals by interacting with an object through the three-dimensional input device image information, and transmits the control signals to the electronic device.
Preferably, the electronic device sets a first program, and the electronic device regulates an item in the first program according to the plurality of control signals transmitted by the image processing device.
Preferably, the electronic device displays an item image information of the item of the first program through the image processing device.
Preferably, when the object interacts with the project image information, the image processing device generates a plurality of interaction control signals and provides the interaction control signals to the electronic device, and the electronic device regulates and controls the project of the first program according to the interaction control signals.
Preferably, the image processing apparatus further displays at least one switch key image information, and the image processing apparatus generates a switch signal to display another three-dimensional input device image information through interaction of the at least one switch key image information with the object.
Preferably, the three-dimensional input device image information includes a mouse device image information, a digital keyboard image information, a touch panel device image information, or a handwriting panel device image information.
Preferably, the item image information includes a plurality of item sub-image information, and the plurality of three-dimensional coordinate values of the object and the plurality of first three-dimensional coordinate values of one of the plurality of item sub-image information determine a plurality of second three-dimensional coordinate values of one of the plurality of item sub-image information.
Preferably, the plurality of second three-dimensional coordinate values of one of the plurality of item sub-image information are different from the plurality of first three-dimensional coordinate values of one of the plurality of item sub-image information.
Preferably, the image processing device displays a three-dimensional item image information of the item image information on one side of the image processing device, the three-dimensional item image information rotates according to a predetermined angular velocity, and the object interacts with the rotating three-dimensional item image information to enable the three-dimensional item image information to be deformed, and the deformed three-dimensional item image information is generated.
The invention also discloses an image processing device, which is characterized by comprising: a controller; a display module; a sensor; and a communication module electrically connected with the display module, the sensor, and the controller; the sensor detects a plurality of coordinate values of at least one predetermined detection point of an object and records them as a track; the sensor detects and records a plurality of tracks of the object, and the controller displays the tracks of the object in a space on one side of the image processing device so as to form three-dimensional project image information; the sensor detects an interaction between the object and the three-dimensional project image information so as to adjust the three-dimensional project image information.
Preferably, the image processing device further comprises a storage module electrically connected with the controller, a program is arranged in the storage module, the image processing device displays a plurality of function buttons corresponding to the plurality of function programs of the program in a space at one side of the image processing device, and the object interacts with the plurality of function buttons to adjust the three-dimensional project image information.
The image processing system and the image processing device provided by the invention can perform image processing that is generally completed on a computer, and can directly display two-dimensional or three-dimensional image information in the space above the image processing device. The user can directly interact with the displayed three-dimensional image information, and can control and adjust it by means of gestures and audio information.
For a further understanding of the nature and the technical aspects of the present invention, reference should be made to the following detailed description of the invention and the accompanying drawings, which are provided for purposes of reference only and are not intended to limit the invention.
Drawings
Fig. 1 is a schematic diagram of an image processing system according to a first embodiment of the present invention.
Fig. 2 is a functional block diagram of an image processing system according to a first embodiment of the present invention.
Fig. 3 is a schematic structural view of an image processing apparatus according to a first embodiment of the present invention.
Fig. 4 is a schematic diagram of an image processing system for providing image information of another three-dimensional input device according to a first embodiment of the present invention.
Fig. 5 is a schematic diagram of an image processing system displaying project image information according to a first embodiment of the present invention.
FIG. 6 is a diagram illustrating an image processing system displaying project image information and interacting with an object according to a first embodiment of the present invention.
Fig. 7 is a schematic diagram showing the image processing system of the first embodiment of the present invention with the project image information displayed as an exploded view.
FIG. 8 is a schematic diagram of the interaction of the project image information and the object according to the present invention.
FIG. 9 is a diagram illustrating real-time interaction between an item of image information and an object according to a second embodiment of the present invention.
FIG. 10 is another diagram illustrating real-time interaction between an item of image information and an object according to a second embodiment of the present invention.
FIG. 11 is a diagram illustrating real-time interaction between an item of image information and an object according to a second embodiment of the present invention.
Fig. 12 to 14 are respectively another schematic diagrams of an image processing system according to a second embodiment of the present invention.
Fig. 15 is a schematic view of an image processing apparatus according to a third embodiment of the present invention.
Fig. 16 is a functional block diagram of an image processing apparatus according to a third embodiment of the present invention.
Fig. 17 is another schematic view of an image processing apparatus according to a third embodiment of the present invention.
Fig. 18 is a schematic diagram of an image processing apparatus according to a fourth embodiment of the present invention.
Fig. 19 is another schematic view of an image processing apparatus according to a fourth embodiment of the present invention.
Fig. 20 is another schematic view of an image processing apparatus according to a fourth embodiment of the present invention.
Fig. 21 is another schematic view of an image processing apparatus according to a fourth embodiment of the present invention.
Fig. 22 is a functional block diagram of an image processing apparatus according to a fourth embodiment of the present invention.
Detailed Description
The following description sets forth specific embodiments of the "image processing system" of the present invention, and those skilled in the art will appreciate the advantages and effects of the present invention from the disclosure herein. The invention is capable of other and different embodiments, and its several details are capable of modification and variation in various respects, all without departing from the spirit of the present invention. The drawings of the present invention are merely schematic illustrations and are not drawn to actual dimensions. The following embodiments further illustrate the related technical content of the present invention in detail, but the disclosure is not intended to limit the scope of the present invention. In addition, the term "or" as used herein shall, as the case may be, include any one or any combination of more of the associated listed items.
First embodiment
Referring to fig. 1 and 2, fig. 1 is a schematic diagram of an image processing system according to a first embodiment of the present invention. Fig. 2 is a functional block diagram of an image processing system according to a first embodiment of the present invention.
The image processing system SYS1 includes an electronic device 1 and an image processing device 2. The image processing apparatus 2 is connected to the electronic apparatus 1. The electronic device 1 is a desktop computer, a notebook computer, a tablet computer, a smart phone, a wearable electronic device or a server. In the present embodiment, the electronic device 1 is a notebook computer.
In this embodiment, the electronic device 1 and the image processing device 2 may respectively include a wired communication module or a wireless communication module for performing communication connection.
The image processing device 2 may be used to display two-dimensional image information or three-dimensional image information. When the image processing apparatus 2 displays two-dimensional image information, the user views it on the upper plane of the image processing apparatus 2, or the two-dimensional image information is displayed above the image processing apparatus 2 as shown in fig. 1. When the image processing apparatus 2 displays three-dimensional image information, the three-dimensional image information is presented in the space above the image processing apparatus 2 by the light field principle. That is, in the present embodiment, the three-dimensional image information displayed by the image processing apparatus 2 is suspended in the space above the image processing apparatus 2. The user can interact with this suspended three-dimensional image information by hand or with other props and tools; the user interacts with the three-dimensional image information directly, without needing a wearable display device to interact with virtual images in a display.
In this embodiment, the image processing apparatus 2 displays floating three-dimensional input device image information ING1. The image processing apparatus 2 can interact with an object OB through the three-dimensional input device image information ING1 to generate a plurality of control signals, and transmit the plurality of control signals to the electronic apparatus 1.
The electronic device 1 includes a first controller 11, a first storage module 12, a first display module 13, a first communication module 14, and a first connection module 15.
The first controller 11 is connected to a first storage module 12, a first display module 13, a first communication module 14, and a first connection module 15. The image processing device 2 includes a second controller 21, a second storage module 22, a second display module 23, a second communication module 24, and a second connection module 25. The second controller 21 is electrically connected to the second storage module 22, the second display module 23, and the second communication module 24.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an image processing apparatus according to the first embodiment of the present invention. The second display module 23 includes a flat display module 231, a lens array layer 232, and a microstructure layer 233. The flat display module 231 has a display surface 2311 and is used to provide two-dimensional image information, which is displayed as three-dimensional image information through the lens array layer 232 and the microstructure layer 233. The lens array layer 232 is disposed on the display surface 2311 of the flat display module 231 and receives and adjusts the light of the two-dimensional image information; it includes a plurality of lenses 2321 and is used to manipulate the light field.
The microstructure layer 233 is disposed on the lens array layer 232. The microstructure layer 233 includes a plurality of microstructures and is used to modulate the angular direction of light. The microstructure layer 233 receives the light transmitted through the lens array layer 232 and adjusts it so that the second display module 23 displays three-dimensional image information floating in the space on one side of the second display module 23.
The lenses 2321 of the lens array layer 232 conform to the lensmaker's equation: 1/f = (n - 1)(1/R1 + 1/R2), where R1 and R2 are the radii of curvature of the two sides of the lens 2321, f is the lens focal length, and n is the refractive index of the lens.
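As a quick numerical check, the relation above can be evaluated directly. The following is a minimal Python sketch; the lens values are illustrative assumptions, not taken from the patent, and the sign convention follows the formula exactly as stated.

```python
def lens_focal_length(n: float, r1: float, r2: float) -> float:
    """Lensmaker's equation as stated above: 1/f = (n - 1) * (1/R1 + 1/R2)."""
    return 1.0 / ((n - 1.0) * (1.0 / r1 + 1.0 / r2))

# Illustrative example: a symmetric lens element with n = 1.5 and
# R1 = R2 = 20 mm gives 1/f = 0.5 * (0.05 + 0.05) = 0.05, so f = 20 mm.
print(lens_focal_length(1.5, 20.0, 20.0))  # 20.0
```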
In addition, the microstructure layer 233 is a dynamic optical element layer. That is, the microstructure layer 233 has a microstructure functional mode and a non-microstructure functional mode, and can be switched to either mode according to a control signal. When the microstructure layer 233 is switched to the microstructure functional mode, it modulates the angular direction of light. When the microstructure layer 233 is switched to the non-microstructure functional mode, the second display module 23 displays two-dimensional image information and does not display three-dimensional image information floating in the space on one side of the second display module 23. The microstructure layer 233 is a liquid crystal lens array layer (Liquid Lens Layer) comprising a plurality of microstructure liquid crystal lenses, and the lens array layer 232 is likewise a liquid crystal lens array layer comprising a plurality of light-adjusting liquid crystal lenses. That is, the second display module 23 is a display device including at least two liquid crystal lens array layers; in other embodiments, the second display module 23 may include more liquid crystal lens array layers for light adjustment, and may set different numbers of such layers in different areas. In addition, the lens array layer 232 includes a first structure lens array layer (not shown) and a second structure lens array layer (not shown); the first structure lens array layer has a lens structure that can be used to manipulate the light field, and the second structure lens array layer does not.
Referring to fig. 4, fig. 4 is a schematic diagram of an image processing system for providing image information of another three-dimensional input device according to a first embodiment of the present invention.
In this embodiment, when the image processing apparatus 2 displays the three-dimensional input device image information ING1, the image processing apparatus 2 may also display switch key image information SWG. The image processing apparatus 2 may generate a switching signal through the interaction of the user or the object OB with the switch key image information SWG. The image processing apparatus 2 transmits the switching signal to the electronic apparatus 1, and another three-dimensional input device image information, for example three-dimensional keyboard device image information, is displayed according to the switching signal.
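A minimal sketch of this switching behavior is given below; the device list, class, and method names are illustrative assumptions rather than terms from the patent.

```python
# Hypothetical sketch: cycle the floating input-device image whenever the
# switch-key image information SWG is touched by the object OB.
INPUT_DEVICE_IMAGES = ["mouse", "keyboard", "touch_panel", "handwriting_panel"]

class InputDeviceSwitcher:
    def __init__(self) -> None:
        self._index = 0  # start with the mouse device image (ING1)

    def on_switch_key_touched(self) -> str:
        """Acts on the 'switching signal': advance to the next device image."""
        self._index = (self._index + 1) % len(INPUT_DEVICE_IMAGES)
        return INPUT_DEVICE_IMAGES[self._index]
```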
Referring to fig. 5, 6, 7 and 8, fig. 5 is a schematic diagram of an image processing system for displaying project image information according to a first embodiment of the present invention. FIG. 6 is a diagram illustrating an image processing system displaying project image information and interacting with an object according to a first embodiment of the present invention. Fig. 7 is a schematic diagram showing the image processing system of the first embodiment of the present invention with the project image information displayed as an exploded view. FIG. 8 is a schematic diagram of the interaction of the project image information and the object according to the present invention.
In this embodiment, the image processing system SYS1 further includes a sensor 3 disposed on the electronic device 1 or the image processing device 2; the sensor 3 may also be provided independently. The sensor 3 is a visible light image acquisition unit, a thermal imaging image acquisition unit, an infrared image acquisition unit, or an ultrasonic detection unit. The sensor 3 is configured to acquire a plurality of three-dimensional coordinate values CV2 of the object OB and a plurality of three-dimensional coordinate values CV1 of the item image information PG1.
The object OB is a user's hand, a prop, or a tool. That is, the user can directly contact the three-dimensional image information using limbs, a pen-shaped prop, or the like.
In addition, the image processing system SYS1 further includes a microphone MP and a speaker SP, which are connected to the electronic device 1 and the image processing device 2. Through the microphone MP and the speaker SP, the user can control and interact with the three-dimensional input device image information ING1 using audio control signals.
The three-dimensional input device image information ING1 is a mouse device image information, a keyboard image information, a touch panel device image information, or a handwriting panel device image information.
The image processing apparatus 2 may generate a plurality of control signals through interaction between the three-dimensional input device image information ING1 and the object OB, and transmit the plurality of control signals to the electronic apparatus 1.
The user can press the three-dimensional input device image information ING1 by hand, and the sensor 3 detects a plurality of three-dimensional coordinate values CV2 of the user's hand, such as the coordinate values of the fingertips and finger joints, to determine the interaction between the user's hand (object OB) and the three-dimensional input device image information ING1. When the user clicks the left or right key of the three-dimensional input device image information ING1, the sensor 3 detects that the user's index finger performs a clicking action; the sensor 3 provides the rate of change of the coordinate values of the user's index finger to the image processing apparatus 2, which makes the determination and correspondingly provides a click control signal to the electronic apparatus 1.
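The determination described above can be sketched as a simple rate-of-change test on the fingertip coordinates. The threshold and interface below are assumptions for illustration, not values from the patent.

```python
import math

CLICK_SPEED_THRESHOLD = 0.15  # m/s; illustrative value

def is_click(prev_tip, curr_tip, dt):
    """prev_tip, curr_tip: (x, y, z) index-fingertip coordinates reported by
    sensor 3 at consecutive samples; dt: sampling interval in seconds."""
    rate = math.dist(prev_tip, curr_tip) / dt
    return rate > CLICK_SPEED_THRESHOLD

# When is_click(...) is True over the left/right key region of the floating
# mouse image ING1, the image processing apparatus 2 would forward a click
# control signal to the electronic apparatus 1.
```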
The electronic device 1 is provided with a first program. The first program may be a 3D drawing program or a game program. The electronic device 1 regulates and controls the first program according to the plurality of control signals transmitted by the image processing device 2, for example, adjusts or controls an item in the first program.
Assume that the first program is a 3D drawing program and the three-dimensional input device image information ING1 is mouse device image information. The user can then adjust or control the items in the 3D drawing program by clicking or moving the three-dimensional input device image information ING1.
When the items in the first program are constructed or drawn as two-dimensional or three-dimensional image information, the item image information PG1-2D of the item in the first program can be displayed by the image processing apparatus 2. That is, the image processing apparatus 2 may serve as a display device for the item image information PG1-2D of the item in the first program. Since the image processing apparatus 2 can display both two-dimensional and three-dimensional image information, the item image information PG1-2D in the first program (the image information displayed on the electronic apparatus 1) can be displayed directly, in three-dimensional or two-dimensional form, in the space above the image processing apparatus 2.
The user may interact with the item image information PG1 of the image processing apparatus 2. First, the user can enlarge or shrink the item image information PG1 with different gestures: for example, increasing the distance between thumb and index finger enlarges the item image information PG1, while decreasing that distance shrinks it. In addition, the user can use both hands to pull two corners of the three-dimensional item image information PG1 to perform larger zoom-in or zoom-out actions.
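A minimal sketch of this pinch-to-zoom mapping follows, assuming the sensor supplies thumb and index fingertip coordinates; the class and method names are illustrative.

```python
import math

class PinchZoom:
    """Map the thumb-to-index distance to a zoom factor for PG1."""

    def __init__(self) -> None:
        self._start_dist = None

    def begin(self, thumb, index) -> None:
        """Call when the pinch starts; thumb and index are (x, y, z) points."""
        self._start_dist = math.dist(thumb, index)

    def scale(self, thumb, index) -> float:
        """Zoom factor: > 1 when the fingers spread apart, < 1 when they close."""
        if not self._start_dist:
            return 1.0
        return math.dist(thumb, index) / self._start_dist
```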
Further, the user may rotate the item image information PG1 to various angles using gestures, such as a 180-degree rotation to the left, a 180-degree rotation upward, or a 45-degree rotation to the right, as shown in fig. 5 and 6.
In addition, when the project image information PG1 comprises a plurality of project sub-image information PG1-1-PG1-2, the plurality of three-dimensional coordinate values CV2 of the object OB can interact with the plurality of first three-dimensional coordinate values of one of the project sub-image information PG1-1-PG1-2 to determine a plurality of second three-dimensional coordinate values of one of the project sub-image information.
As shown in fig. 7, when the item image information PG1 includes a plurality of item sub-image information PG1-1-PG1-2, the item sub-image information PG1-1-PG1-2 can be separated from each other. The user can use a hand or a prop to display the item sub-image information PG1-1-PG1-2 in a separated manner, such as a three-dimensional exploded view. By disassembling the item image information PG1 in this way, the user may also process the sub-image information PG1-1 or the sub-image information PG1-2 separately.
The user can select and move one of the plurality of item sub-image information PG1-1-PG1-2 in the same way as moving a real object, so that the three-dimensional coordinate values of that item sub-image information change, moving it from a plurality of first three-dimensional coordinate values to a plurality of second three-dimensional coordinate values. The plurality of first three-dimensional coordinate values of one of the item sub-image information PG1-1-PG1-2 are accordingly different from the corresponding plurality of second three-dimensional coordinate values.
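Such a move can be sketched as a rigid translation of the selected sub-image by the displacement of the grabbing hand; the data layout below (a list of vertex coordinates) is an assumption for illustration.

```python
def move_sub_image(first_coords, grab_start, grab_end):
    """Translate one item sub-image (e.g. PG1-1) by the hand's displacement.

    first_coords: list of (x, y, z) first three-dimensional coordinate values.
    grab_start, grab_end: object OB positions reported by sensor 3.
    Returns the second three-dimensional coordinate values.
    """
    dx = grab_end[0] - grab_start[0]
    dy = grab_end[1] - grab_start[1]
    dz = grab_end[2] - grab_start[2]
    return [(x + dx, y + dy, z + dz) for (x, y, z) in first_coords]
```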
In addition, referring to fig. 8, the user may also enlarge, shrink, or further adjust the shape of the item image information PG1, the item sub-image information PG1-1, or the item sub-image information PG1-2. The result of the user's interaction with the three-dimensional item image information PG1 may be stored in the item of the first program of the electronic device 1.
In this embodiment, the first controller 11 and the second controller 21 may each be a central processing unit (CPU), an application specific integrated circuit (ASIC), a graphics processing unit (GPU), or a microcontroller (MCU). The first storage module 12 and the second storage module 22 are each a flash memory, a read-only memory (ROM), a programmable ROM, an erasable programmable ROM, or an electrically erasable programmable ROM.
The first communication module 14 and the second communication module 24 may each include a wired communication unit (not shown) and a wireless communication unit (not shown). The wired communication units of the first communication module 14 and the second communication module 24 may be independently configured to be communicatively connected to the server 9 and to receive a control signal of the server 9 or data in a database (not shown) of the server 9. The wireless communication units of the first communication module 14 and the second communication module 24 may be a Wi-Fi communication unit, a Bluetooth communication unit, a Zigbee communication unit, a LoRa communication unit, a Sigfox communication unit, or an NB-IoT communication unit. In this embodiment, the sensor 3 is an image acquisition unit (CCD). The first display module 13 and the second display module 23 may be a liquid crystal display, a micro light emitting diode display (micro LED display), or an organic light emitting diode display (OLED display). The first connection module 15 may be a USB type-A connector, a Lightning connector, a USB type-C connector, or a micro-USB connector.
In addition, in the present embodiment, the two-dimensional or three-dimensional image information may be calculated in the electronic device 1, the image processing device 2, or the server 9; the present invention is not limited in this respect. When the calculation is performed on the server 9, the remote server 9 computes the two-dimensional or three-dimensional image information and then transmits it to the electronic device 1 or the image processing device 2.
Second embodiment
Referring to fig. 9, 10 and 11, fig. 9 is a schematic diagram illustrating real-time interaction between an item of image information and an object according to a second embodiment of the present invention. FIG. 10 is another diagram illustrating real-time interaction between an item of image information and an object according to a second embodiment of the present invention. FIG. 11 is a diagram illustrating real-time interaction between an item of image information and an object according to a second embodiment of the present invention.
In the present embodiment, the electronic apparatus 1 and the image processing apparatus 2 can display clay image information TG on one side of the image processing apparatus 2. The electronic device 1 displays two-dimensional clay image information TG-2D, and the image processing apparatus 2 displays the three-dimensional clay image information TG. The electronic device 1 or the image processing device 2 can rotate the clay image information TG in the manner of clay or porcelain throwing; that is, the clay image information TG (three-dimensional project image information) rotates at a predetermined angular velocity, which may be adjusted during the modeling process according to the user's needs.
As shown in fig. 9, the user may interact with the clay image information TG with both hands to deform it and generate deformed clay image information TG, that is, deformed three-dimensional item image information. In other words, using the image processing system of the present embodiment, the user can interact with the clay image information TG by hand to complete a hand-made model.
In the present embodiment, the deformation state of the clay image information TG is determined according to the characteristics of rigidity, viscosity, and the like of the clay itself, and the posture of the user's hand.
Further, in the present embodiment, the electronic device 1 and the image processing device 2 calculate the corresponding change of the clay image information TG according to the coordinate values of the surfaces of the user's fingers or hand, the degree of change of those coordinate values (the rate of coordinate change per second), and the angular velocity of the clay image information TG as it rotates.
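A rough sketch of such a deformation update is shown below, under simplified assumed physics: a single stiffness constant stands in for the rigidity and viscosity mentioned above, and the names and values are illustrative only.

```python
import math

STIFFNESS = 0.5  # 0..1; stand-in for the clay's rigidity/viscosity

def deform_point(point, fingertip, contact_radius=0.01):
    """Push one surface point of the rotating clay image TG away from a
    penetrating fingertip. point and fingertip are (x, y, z) in meters."""
    d = math.dist(point, fingertip)
    if d >= contact_radius:
        return point  # the fingertip is not touching this point
    push = (1.0 - STIFFNESS) * (contact_radius - d)  # deeper contact pushes more
    # unit vector from the fingertip toward the surface point (contact normal)
    n = [(p - f) / (d + 1e-9) for p, f in zip(point, fingertip)]
    return (point[0] + n[0] * push,
            point[1] + n[1] * push,
            point[2] + n[2] * push)
```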
In this embodiment, the electronic device 1 or the image processing device 2 can use a sensor 3 to detect the coordinate values of a predetermined detection point of the object OB (the user's hand), such as the user's index finger, at least one predetermined point of the electronic interactive glove ECL, or the pen point of the pen-type interactive component PT. The sensor 3 continuously records a moving track of the predetermined detection point of the object OB within a predetermined time, and transmits the plurality of coordinate values of that moving track to a controller (not shown) of the electronic device 1 or of the image processing device 2. The controller then displays the movement track together with the clay image information TG through the second display module. That is, the user can interact directly with the clay image information TG using the image processing apparatus 2, and even draw lines, planar patterns, or stereoscopic patterns on the surface of the clay image information TG.
The user may also draw a plurality of tracks in the space on one side of the image processing apparatus 2 using a pen-type interactive element or an electronic interactive glove. The tracks can then be moved, adjusted, or connected, and set onto the surface of the clay image information TG.
Further, the second program in the image processing apparatus 2 includes a plurality of function processing programs corresponding to a plurality of three-dimensional function buttons FBN. The three-dimensional function buttons FBN are displayed in the space on the side of the image processing apparatus 2 by the second display module 23 in the same manner as the clay image information.
The user can select one of the plurality of three-dimensional function buttons FBN, for example, color, zoom, adjust the texture, etc., and process a partial area or a whole area of the clay image information TG.
That is, the user can selectively process the planar area, the stereoscopic area, or the entire area formed by the plurality of tracks.
Further, the image processing apparatus 2 may be communicatively connected to a server (not shown) through a communication module (not shown), and the server may take over the calculation of the trajectories of the three-dimensional project image information. That is, in the present embodiment, the two-dimensional and three-dimensional image information may be processed by the image processing apparatus 2, or may be cloud-computed by a remote server (not shown), with the calculated two-dimensional and three-dimensional image information then transmitted to the image processing apparatus 2.
Referring to fig. 12 to 14, in the image processing system SYS1 shown there, the electronic device 1 is a smart phone MD connected to the image processing device 2. The electronic device 1 may instead be a tablet computer, a desktop computer, or a smart watch. Furthermore, as shown in fig. 12 to 14, the user can interact with and control the three-dimensional image information of the image processing device 2 using a finger OB, the pen-type interaction component PT, the electronic interaction glove ECL, and the like.
Third embodiment
Referring to fig. 15, 16 and 17, fig. 15 is a schematic diagram of an image processing apparatus according to a third embodiment of the present invention. Fig. 16 is a functional block diagram of an image processing apparatus according to a third embodiment of the present invention. Fig. 17 is another schematic view of an image processing apparatus according to a third embodiment of the present invention.
Referring to fig. 15 and 16, the image processing apparatus 2 'includes a second controller 21', a second storage module 22', a second display module 23', a second communication module 24', and a second connection module 25'. The second controller 21' is electrically connected to the second storage module 22', the second display module 23', the second communication module 24', and the second connection module 25'.
The second display module 23' can display two-dimensional image information and three-dimensional image information.
In this embodiment, the second controller 21' may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an image processor (GPU), or a Microprocessor (MCU).
The second storage module 22' is a flash memory, a read-only memory (ROM), a programmable ROM, an erasable programmable ROM, or an electrically erasable programmable ROM.
The second communication module 24' may include a wired communication unit (not shown) and a wireless communication unit (not shown); the wired communication unit may also be provided independently. The wireless communication unit of the second communication module 24' may be a Wi-Fi communication unit, a Bluetooth communication unit, a Zigbee communication unit, a LoRa communication unit, a Sigfox communication unit, or an NB-IoT communication unit. In this embodiment, the sensor 3 is an image acquisition unit (CCD). The second display module 23' may be a liquid crystal display, a micro light emitting diode display (micro LED display), or an organic light emitting diode display (OLED display).
The second connection module 25' may be a type-A USB connector, a Lightning connector, a type-C USB connector or a micro-USB connector.
In addition, the second display module 23' further includes a touch processing unit (not shown) that can receive a plurality of touch signals.
Furthermore, a second program is set in the image processing apparatus 2'. The second program is an image processing program for processing two-dimensional image information or three-dimensional image information.
First, the user can control the second program of the image processing apparatus 2 'to process the image information by using a touch mode or an external input device (a mouse device, a keyboard device, a stylus pen, a drawing board, etc.) connected to the second connection module 25'.
In addition, the image processing apparatus 2' may display two-dimensional or three-dimensional image information corresponding to the plurality of processing keys used for processing image information in the second program, and may display the three-dimensional image information of the item image information PG' processed by the second program.
As shown in fig. 17, the item image information PG' is three-dimensional image information. In this embodiment, when the item image information PG' is displayed as three-dimensional image information by the image processing apparatus 2', the user can adjust the content of the item image information PG' with hands or props. The adjusted content of the item image information PG' is stored in the second storage module 22' of the image processing apparatus 2'. The image processing apparatus 2' can also transmit the data of the item image information PG' to a server 9 through the second communication module 24'.
In the present embodiment, the image processing apparatus 2' may further include a microphone MP and a speaker SP. The microphone MP and the speaker SP are connected to the image processing apparatus 2'.
The user can control and interact with the two-dimensional image information or the three-dimensional image information of the image processing device 2' by using the audio control signal through the microphone MP and the speaker SP.
In addition, the image processing apparatus 2' may be communicatively connected to a server (not shown) through a communication module (not shown), and the server may take over the computation of the three-dimensional project image information. That is, in the present embodiment, the two-dimensional and three-dimensional image information may be processed by the image processing device 2', or may be cloud-computed by a remote server (not shown), with the calculated two-dimensional and three-dimensional image information then transmitted to the image processing device 2'.
Fourth embodiment
Referring to fig. 18, 19, 20, 21 and 22, fig. 18 is a schematic diagram of an image processing apparatus according to a fourth embodiment of the present invention. Fig. 19 is another schematic view of an image processing apparatus according to a fourth embodiment of the present invention. Fig. 20 is another schematic view of an image processing apparatus according to a fourth embodiment of the present invention. Fig. 21 is another schematic view of an image processing apparatus according to a fourth embodiment of the present invention. Fig. 22 is a functional block diagram of an image processing apparatus according to a fourth embodiment of the present invention.
In the present embodiment, the image processing apparatus 2 "includes a second controller 21", a second storage module 22", a second display module 23", a second communication module 24", a second connection module 25" and a sensor 26". The second controller 21 "is electrically connected to the second storage module 22", the second display module 23", the second communication module 24" and the second connection module 25 "and the sensor 26".
The second display module 23″ can display a two-dimensional image information and a three-dimensional image information.
In this embodiment, the second controller 21″ may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), an image processor (GPU), or a Microprocessor (MCU).
The second storage module 22″ is a flash memory, a read-only memory (ROM), a programmable ROM, an erasable programmable ROM, or an electrically erasable programmable ROM.
The second communication module 24″ may include a wired communication unit (not shown) and a wireless communication unit (not shown); the wired communication unit may also be provided independently. The wireless communication unit of the second communication module 24″ may be a Wi-Fi communication unit, a Bluetooth communication unit, a Zigbee communication unit, a LoRa communication unit, a Sigfox communication unit, or an NB-IoT communication unit. In this embodiment, the sensor 26″ is an image acquisition unit (CCD). The second display module 23″ may be a liquid crystal display, a micro light emitting diode display (micro LED display), or an organic light emitting diode display (OLED display).
The second connection module 25″ may be a type-A USB connector, a Lightning connector, a type-C USB connector, or a micro-USB connector.
In the present embodiment, the image processing apparatus 2″ may further include a microphone MP and a speaker SP. The microphone MP and the speaker SP are connected to the image processing apparatus 2". The user can control and interact with the two-dimensional image information or the three-dimensional image information of the image processing device 2″ through the microphone MP and the speaker SP by using the audio control signal.
Furthermore, a second program is set in the image processing apparatus 2". The second program is an image processing program for processing two-dimensional image information or three-dimensional image information.
In the present embodiment, the sensor 26″ is configured to detect the coordinate values of a predetermined detection point of the object OB (the user's hand), such as the user's index finger, at least one predetermined point of the electronic interactive glove ECL, or the pen point of the pen-type interactive assembly PT. The sensor 26″ continuously records a moving track of the predetermined detection point of the object OB within a predetermined time and transmits the plurality of coordinate values of that moving track to the second controller 21″. The second controller 21″ displays the movement path PH1 in the space on the side of the image processing apparatus 2″ through the second display module 23″. That is, the user can directly draw lines, planar patterns, and stereoscopic patterns in the space using the image processing device 2″.
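A minimal sketch of this record-then-display loop follows, assuming a callable that polls the sensor for the detection point; the interface and timing values are illustrative.

```python
import time

def record_track(read_point, duration=2.0, interval=0.02):
    """Sample the predetermined detection point (fingertip, glove marker, or
    pen tip) at a fixed interval and return the recorded track.

    read_point: callable returning the current (x, y, z) from sensor 26''.
    """
    track = []
    deadline = time.monotonic() + duration
    while time.monotonic() < deadline:
        track.append(read_point())
        time.sleep(interval)
    return track

# The second controller 21'' would then pass the returned track (e.g. PH1)
# to the second display module 23'' to render it in the space beside the
# image processing device 2''.
```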
The user may draw a plurality of trajectories PH1-PHN in the space on one side of the image processing device 2 "using the hand OB, the pen-type interaction component PT, and the electronic interaction glove ECL. The tracks are then moved, adjusted or linked to create a three-dimensional project image information 3DG.
Further, the second program in the image processing apparatus 2″ includes a plurality of function processing programs corresponding to the plurality of three-dimensional function buttons FBN. The plurality of three-dimensional function buttons FBN are displayed in the space on the side of the image processing apparatus 2″ by the second display module 23″ in the same manner as the three-dimensional item video information 3DG.
The user can select one of the plurality of three-dimensional function buttons FBN, for example, color, zoom, adjust the texture, etc., and process a partial area or an entire area of the three-dimensional item image information 3DG.
That is, the user can selectively process the planar area, the stereoscopic area, or the entire area formed by the plurality of tracks. Similar to the previous embodiment, the three-dimensional project image information 3DG may be displayed in the space on one side of the image processing apparatus 2″ in the form of an exploded view.
Further, the image processing apparatus 2″ may be communicatively connected to a server (not shown) through the second communication module 24″. A server (not shown) may take over the calculation of a number of trajectories of the three-dimensional project image information 3DG. In this embodiment, the two-dimensional image information and the three-dimensional image information may be processed by the image processing apparatus 2″, or may be cloud-calculated by a remote server (not shown), and the calculated two-dimensional image information and three-dimensional image information may be transmitted to the image processing apparatus 2″.
Advantageous effects of the embodiment
The image processing system and the image processing device provided by the invention can perform image processing that is generally completed on a computer, and can directly display two-dimensional or three-dimensional image information in the space above the image processing device. The user can directly interact with the displayed three-dimensional image information, and can control and adjust it by means of gestures and audio information.
The above disclosure is only a preferred embodiment of the present invention and is not intended to limit the claims of the present invention, so that all equivalent technical changes made by the application of the specification and the drawings of the present invention are included in the claims of the present invention.

Claims (9)

1. An image processing system, comprising:
an electronic device; and
an image processing device connected with the electronic device;
the image processing device is used for displaying floating three-dimensional input device image information, generating a plurality of control signals by interacting with an object through the three-dimensional input device image information, and transmitting the control signals to the electronic device;
the electronic device is provided with a first program, and the electronic device regulates and controls an item in the first program according to the plurality of control signals transmitted by the image processing device;
the electronic device displays project image information of the project of the first program through the image processing device, wherein the project image information is floating three-dimensional image information, and the object interacts with the project image information to adjust a plurality of actions of the project image information;
when the object interacts with the project image information, the image processing device detects a plurality of coordinate values of at least one preset detection point of the object, records the coordinate values as a track, generates a plurality of interaction control signals according to interaction states of the tracks of the object and the project image information and provides the interaction control signals for the electronic device, and the electronic device adjusts or regulates part or all of the project image information displayed above the image processing device according to the interaction control signals.
2. The image processing system of claim 1, wherein when the object interacts with the project image information, the image processing device generates a plurality of interaction control signals and provides the interaction control signals to the electronic device, and the electronic device regulates the project of the first program according to the interaction control signals.
3. The image processing system of claim 2, wherein the image processing device further displays at least one switch key image information, and wherein the image processing device generates a switch signal to display another three-dimensional input device image information by interaction of the at least one switch key image information with the object.
4. The image processing system of claim 3, wherein the three-dimensional input device image information comprises a mouse device image information, a keyboard image information, a touch pad device image information, or a tablet device image information.
5. The image processing system of claim 4, wherein the item image information includes a plurality of item sub-image information, and wherein the plurality of three-dimensional coordinate values of the object and the plurality of first three-dimensional coordinate values of one of the plurality of item sub-image information determine a plurality of second three-dimensional coordinate values of one of the plurality of item sub-image information.
6. The image processing system of claim 5, wherein the plurality of second three-dimensional coordinate values of one of the plurality of item sub-image information are different from the plurality of first three-dimensional coordinate values of one of the plurality of item sub-image information.
7. An image processing system, comprising:
an electronic device; and
an image processing device connected with the electronic device;
the image processing device is used for displaying floating three-dimensional input device image information, generating a plurality of control signals by interacting with an object through the three-dimensional input device image information, and transmitting the control signals to the electronic device;
the image processing device displays three-dimensional project image information of project image information on one side of the image processing device, the three-dimensional project image information rotates according to a preset angular velocity, and the object interacts with the rotating three-dimensional project image information to enable the three-dimensional project image information to deform and generate deformed three-dimensional project image information.
8. An image processing apparatus, comprising:
a controller;
a display module;
a sensor; and
a communication module electrically connected with the display module, the sensor, and the controller;
the sensor detects a plurality of coordinate values of at least one preset detection point of an object and records the coordinate values as a track;
the sensor detects and records a plurality of tracks of the object, and the controller displays the tracks of the object in a space at one side of the image processing device so as to form three-dimensional project image information;
the sensor detects an interaction action of the object and the three-dimensional project image information so as to regulate and control part or all of the three-dimensional project image information;
the image processing device is connected with an electronic device; the image processing device displays floating three-dimensional input device image information, interacts with the object through the three-dimensional input device image information to generate a plurality of control signals, and transmits the control signals to the electronic device, and the electronic device regulates and controls an item in a first program according to the control signals transmitted by the image processing device;
the electronic device displays the project image information of the project of the first program through the image processing device, wherein the project image information is floating three-dimensional image information, and the object interacts with the project image information to adjust a plurality of actions of the project image information;
when the object interacts with the project image information, the image processing device detects a plurality of coordinate values of at least one preset detection point of the object, records the coordinate values as the track, generates a plurality of interaction control signals according to interaction states of the tracks of the object and the project image information and provides the interaction control signals for the electronic device, and the electronic device adjusts or regulates part or all of the project image information displayed above the image processing device according to the interaction control signals.
9. The image processing apparatus according to claim 8, further comprising a storage module electrically connected to the controller, wherein a program is set in the storage module, the image processing apparatus displays a plurality of function buttons corresponding to the plurality of function programs of the program in a space on one side of the image processing apparatus, and the object interacts with the plurality of function buttons to adjust the three-dimensional project image information.
Application CN202011261381.0A, filed 2020-11-12: Image processing system and image processing apparatus; granted as CN114489315B (Active).

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202011261381.0A | 2020-11-12 | 2020-11-12 | Image processing system and image processing apparatus

Publications (2)

Publication Number | Publication Date
CN114489315A | 2022-05-13
CN114489315B | 2023-12-19

Family

ID: 81490858

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202011261381.0A | Image processing system and image processing apparatus | 2020-11-12 | 2020-11-12

Country Status (1)

Country | Link
CN | CN114489315B (en)

Citations (6)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
JPH06215150A * | 1993-01-18 | 1994-08-05 | Toshiba Corp | Three-dimensional image display device
CN102736728A * | 2011-04-11 | 2012-10-17 | 宏碁股份有限公司 (Acer Inc.) | Control method and system for three-dimensional virtual object and processing device for three-dimensional virtual object
US9411511B1 * | 2013-09-19 | 2016-08-09 | American Megatrends, Inc. | Three-dimensional display devices with out-of-screen virtual keyboards
CN106873760A * | 2015-12-14 | 2017-06-20 | 技嘉科技股份有限公司 (Gigabyte Technology Co., Ltd.) | Portable virtual reality system
WO2020012711A1 * | 2018-07-10 | 2020-01-16 | オムロン株式会社 (Omron Corp.) | Input device
CN111722769A * | 2020-07-16 | 2020-09-29 | 腾讯科技(深圳)有限公司 (Tencent Technology (Shenzhen) Co., Ltd.) | Interaction method, interaction device, display equipment and storage medium

Family Cites Families (1)

Publication number | Priority date | Publication date | Assignee | Title
JP5866177B2 * | 2011-11-10 | 2016-02-17 | ソニー株式会社 (Sony Corp.) | Image processing apparatus and image processing method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party

Title
一种基于视觉的手指与全息影像交互研究 (A vision-based study of finger interaction with holographic images); 于瀛洁, 李雨浪, 郑华东; 激光与红外 (Laser & Infrared), no. 4; full text *


Similar Documents

Publication Publication Date Title
US11099655B2 (en) System and method for gesture based data and command input via a wearable device
TWI492146B (en) Virtual hand based on combined data
Sridhar et al. Watchsense: On-and above-skin input sensing through a wearable depth sensor
US11922590B2 (en) Devices, methods, and graphical user interfaces for interacting with three-dimensional environments
CN114080585A (en) Virtual user interface using peripheral devices in an artificial reality environment
US20200310561A1 (en) Input device for use in 2d and 3d environments
KR101620777B1 (en) Enhanced virtual touchpad and touchscreen
US11853527B2 (en) Devices, methods, and graphical user interfaces for providing computer-generated experiences
US20100128112A1 (en) Immersive display system for interacting with three-dimensional content
EP2278823A2 (en) Stereo image interaction system
Tarun et al. Snaplet: using body shape to inform function in mobile flexible display devices
Menzner et al. Above surface interaction for multiscale navigation in mobile virtual reality
CN114489315B (en) Image processing system and image processing apparatus
TWI757941B (en) Image processing system and image processing device
JP6008904B2 (en) Display control apparatus, display control method, and program
Athira Touchless technology
Nakazato et al. A desktop 3D modeling system controllable by mid-air interactions
Issartel et al. Analysis of locally coupled 3d manipulation mappings based on mobile device motion
TW201913298A (en) Virtual reality system capable of showing real-time image of physical input device and controlling method thereof
CN114374831A (en) Interactive system
Ducher Interaction with augmented reality
Nguyen 3DTouch: Towards a Wearable 3D Input Device for 3D Applications
Cheng Direct interaction with large displays through monocular computer vision
Ren Designing for Effective Freehand Gestural Interaction
Herzner Organic User Interfaces

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant