CN102769733B - Portable equipment - Google Patents


Info

Publication number
CN102769733B
Authority
CN
China
Prior art keywords
face, data, image, camera, communication
Legal status
Expired - Fee Related
Application number
CN201210244282.0A
Other languages
Chinese (zh)
Other versions
CN102769733A (en)
Inventor
尾方利广
关川雄介
Current Assignee
Aozhixin Digital Technology Co ltd
Original Assignee
Olympus Imaging Corp
Application filed by Olympus Imaging Corp filed Critical Olympus Imaging Corp
Publication of CN102769733A publication Critical patent/CN102769733A/en
Application granted granted Critical
Publication of CN102769733B publication Critical patent/CN102769733B/en

Landscapes

  • Studio Devices (AREA)
  • Television Signal Processing For Recording (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention provides portable equipment comprising: an imaging portion (1) that photographs a subject and generates image data; a display portion (15) that displays the image data; a communication portion (19) that communicates with external equipment; a transmission processing portion (243) that performs processing to transmit the image data displayed on the display portion (15) to the external equipment via the communication portion (19); and a display processing portion (20) that acquires, via the communication portion (19), face-related information stored in the external equipment, and updates the display of the display portion (15) according to the acquired face-related information.

Description

Portable equipment
This application is a divisional application of Chinese invention patent application No. 201010121602.4 (filing date: March 11, 2010; title of invention: Portable Equipment).
Technical field
The present invention relates to portable equipment that transmits content data to an external device.
Background technology
Conventionally, image data captured by a portable device such as a camera-equipped mobile phone has commonly been sent to other external equipment such as another portable device or a personal computer. Japanese Patent No. 4077924 discloses an image data management device that stores, in advance, transmission source information in association with image data received through communication with external equipment, or transmission target information in association with image data sent to that external equipment. Based on the transmission source information and transmission target information associated with the image data, the image data management device of Japanese Patent No. 4077924 deletes image data that was received from external equipment or that was once sent to external equipment.
Consider, however, a case in which a portable device photographs an image showing many people, such as a group photo, and the captured image data is then sent to the portable devices held by the people shown in the image. If the device merely communicates with each person's portable device to send the image data, there is no way of knowing afterwards to whom the image data has been sent. The user must therefore remember the people it was sent to, and wasteful retransmissions occur when the user forgets.
Summary of the invention
A portable equipment according to one aspect of the present invention comprises: an imaging portion that photographs a subject containing a face and generates image data; a display portion that displays the image data generated by the imaging portion; a communication portion that communicates with external equipment; a transmission processing portion that performs processing to transmit the image data displayed on the display portion to the external equipment via the communication portion; and a display processing portion that performs processing to acquire, from the external equipment via the communication portion, information relating to the face of the user of the external equipment contained in the image data, and to update the display of the display portion according to the acquired face-related information.
The above and other objects, features, advantages, and technical and industrial significance of the present invention will be better understood by reading the following detailed description of the invention together with the accompanying drawings.
Accompanying drawing explanation
Fig. 1 is a schematic perspective view showing the front side of a digital camera to which the portable equipment of the present invention is applied.
Fig. 2 is a schematic perspective view showing the rear side of the digital camera.
Fig. 3 is a diagram showing the digital camera communicating with another digital camera.
Fig. 4 is a block diagram showing an example of the main internal configuration of the digital camera.
Fig. 5 is a diagram showing an example of the data structure of an image file.
Fig. 6 is a diagram showing an example of the data structure of communication history data.
Fig. 7 is a diagram showing an example of the data structure of photography character image transmission information.
Fig. 8 is a diagram outlining the communication between digital cameras.
Fig. 9A is another diagram outlining the communication between digital cameras.
Fig. 9B is another diagram outlining the communication between digital cameras.
Fig. 9C is another diagram outlining the communication between digital cameras.
Fig. 10 is a flowchart showing the basic processing sequence performed by the digital camera.
Fig. 11 is a flowchart showing the detailed processing sequence of photograph mode processing.
Fig. 12 is a flowchart showing the detailed processing sequence of communication history temporary recording processing.
Fig. 13 is a flowchart showing the detailed processing sequence of data transmission processing.
Fig. 14 is a flowchart showing the detailed processing sequence of transmission determination processing.
Fig. 15 is a flowchart showing the detailed processing sequence of face/equipment correspondence processing.
Fig. 16 is a flowchart showing the detailed processing sequence of data reception processing.
Fig. 17 is a flowchart showing the detailed processing sequence of reproduction mode processing.
Fig. 18 is a flowchart showing the detailed processing sequence of image reproduction processing.
Fig. 19 is a flowchart showing the detailed processing sequence of unified data transmission processing.
Fig. 20 is a diagram showing an example of the display screen of a photography character image in variation 1.
Fig. 21 is a diagram showing an example of the display screen of a photography character image in variation 2.
Fig. 22 is a flowchart showing the detailed processing sequence of data transmission processing in variation 3.
Fig. 23 is a diagram showing an example of the display screen when confirming display of image data in variation 3.
Fig. 24 is a diagram showing the digital camera communicating with a personal computer.
Embodiment
Preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings. The present invention is not limited to these embodiments. In the description of each drawing, identical parts are given the same reference numerals.
Fig. 1 is a schematic perspective view showing the front side of a digital camera 1 to which the portable equipment of the present invention is applied, and Fig. 2 is a schematic perspective view showing the rear side of the digital camera 1. As shown in Fig. 1, the digital camera 1 has a photographic lens 4 arranged on the front of the camera body 3, and inside the camera body 3 an imaging portion 11 (see Fig. 4) that photographs the subject image entering through the photographic lens 4. A release switch 5 used to indicate the photographing timing is arranged on the camera body 3. On the back of the camera body 3 are arranged a display portion 15 that displays various screens, and a plurality of button switches 7 assigned inherent functions for inputting various operations such as switching the power and selecting a mode.
When the power of the digital camera 1 is switched on (ON) and the photograph mode is selected, the digital camera 1 enters a state in which it can shoot. In photograph mode, the subject image entering through the photographic lens 4 is displayed on the display portion 15 in real time as a live view image, and the user presses the release switch 5 while watching the live view image to shoot a still image or a moving image. Alternatively, the operating mode of the digital camera 1 can be switched to reproduction mode, so that captured still images or moving images are displayed (reproduced) on the display portion 15 for viewing.
As shown in Fig. 1, the digital camera 1 also has a wireless communication portion 19 built in at a predetermined position inside the camera body 3 for performing short-range wireless communication with external equipment such as another digital camera 1, a personal computer, a mobile phone, or a television set. The wireless communication portion 19 is arranged at an appropriate position inside the camera body 3 with its antenna facing outward. In the example of Fig. 1, the wireless communication portion 19 is arranged at a corner on the bottom side of the camera body 3. By arranging the wireless communication portion 19 at a corner of the device case in this way, the user can easily align the camera with the communication partner device when communicating via the wireless communication portion 19, suppressing communication failures caused by positional deviation.
Fig. 3 is a diagram showing the digital camera 1 communicating with another digital camera 1. In Fig. 3, one camera is designated digital camera 1 and the other digital camera 1-2. As shown in Fig. 3, when the camera bodies 3 approach each other within the region in which their wireless communication portions 19 can communicate, communication is established between them. In the present embodiment, the case of exchanging data with another digital camera 1, as one example of external equipment, as the communication partner is described below, and this other digital camera 1 serving as the communication partner is referred to simply as the "partner camera 1".
Fig. 4 is a block diagram showing an example of the main internal configuration of the digital camera 1. As shown in Fig. 4, the digital camera 1 has an imaging portion 11, an image processing portion 12, a face detection portion 13, an operating portion 14, a display portion 15, a clock portion 16, a recording unit 17, a temporary recording portion 18, a wireless communication portion 19, a control portion 20, and so on.
The imaging portion 11 includes an imaging element such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor that photoelectrically converts the subject image entering through the photographic lens 4 and outputs an analog electrical signal; it converts the analog electrical signal from the imaging element into a digital electrical signal and outputs image data.
The image processing portion 12 applies various kinds of image processing to the image data output from the imaging portion 11 and converts it into image data suitable for recording or display. For example, when recording the image data of a photographed image or displaying recorded image data, it performs image data compression or decompression based on the JPEG (Joint Photographic Experts Group) scheme or the like. The image processing portion 12 also performs processing such as generating display image data (thumbnail image data) by shrinking the photographed image into a small image (thumbnail image), and cutting out a predetermined region of the image data.
The image processing portion 12 includes a person recognition portion 121 and a face image processing portion 123. The person recognition portion 121 checks the face detection result (face feature information) obtained from image data against the face feature information of the user of this device (the local user), recorded in the recording unit 17 as native data 171, and performs processing (person recognition processing) to determine whether a face in the image data is the face of the local user. In the present embodiment, the person recognition portion 121 performs person recognition processing on the image data of an image file received from the partner camera 1, and identifies the face of the local user shown in that image data. The face image processing portion 123 performs the following processing: when a photography character image is displayed, for each face in the image data whose corresponding person holds the external equipment (in the present embodiment, another digital camera 1) to which this image data (in fact, the image file 30) has already been sent (a "sent face"), it synthesizes near that face's position a transmission mark indicating that the image has been sent.
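The person recognition check described above can be sketched as follows. This is a minimal illustration that assumes face feature information is encoded as a flat list of ratios compared by a normalized distance threshold; the patent does not specify the actual encoding or comparison method.

```python
import math

def is_local_user(candidate, registered, threshold=0.15):
    """Compare a face feature vector extracted from received image data
    against the registered local-user features (hypothetical flat-list
    encoding). True when the normalized Euclidean distance is small."""
    if len(candidate) != len(registered):
        return False
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(candidate, registered)))
    return dist / math.sqrt(len(candidate)) < threshold

# Example: features as relative distance ratios between facial parts.
registered = [0.42, 0.31, 0.27, 0.55]
```

In this sketch, the person recognition portion 121 would run `is_local_user` over each face's feature data in a received image file to decide whether the local user appears in it.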
The face detection portion 13 performs known face detection processing on the image data of a photographed image, detects the region (face area) of each face shown in the image data, and then extracts the characteristic information (face feature information) of the face. For example, the face detection processing detects the face area in the image data by applying pattern matching, and, from the detection result of the face area, detects the individual facial parts such as the eyes, nose, lips, and eyebrows. Face feature information is then extracted from the face image of the person shown in the image, for example by calculating the relative sizes of and distances between elements such as the eyes, nose, lips, and eyebrows. Since these relative distances differ between a face viewed from the front and a face viewed from the side, the face feature information is extracted after the relative distances are corrected according to the inclination of the face. The inclination of the face can be detected, for example, from the center of the face or the shading of the eyes and nose; if the shape and color of each element are also considered, the inclination can be detected even more precisely. When the image data contains a plurality of faces, face feature information is obtained for each face.
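The feature-extraction step above can be illustrated with a small sketch. The landmark names, the two distance ratios, and the in-plane tilt correction are assumptions chosen for illustration, not the patent's actual measurements:

```python
import math

def extract_face_features(landmarks, tilt_deg=0.0):
    """Derive a simple feature vector from (x, y) facial landmarks
    (assumed keys: 'left_eye', 'right_eye', 'nose', 'mouth').
    Landmarks are de-rotated by the detected face tilt first, so a
    tilted face and an upright face yield comparable features."""
    t = math.radians(-tilt_deg)
    rot = {k: (x * math.cos(t) - y * math.sin(t),
               x * math.sin(t) + y * math.cos(t))
           for k, (x, y) in landmarks.items()}
    eye_span = abs(rot['right_eye'][0] - rot['left_eye'][0])  # size baseline
    eye_mid_y = (rot['left_eye'][1] + rot['right_eye'][1]) / 2
    nose_drop = rot['nose'][1] - eye_mid_y        # eyes-to-nose distance
    mouth_drop = rot['mouth'][1] - rot['nose'][1]  # nose-to-mouth distance
    return [round(nose_drop / eye_span, 3), round(mouth_drop / eye_span, 3)]
```

Because the measurements are normalized by the eye span and taken after de-rotation, the same person photographed upright or with a tilted head produces the same feature vector in this sketch.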
The operating portion 14 receives various user operations, such as the instruction of the photographing timing (shooting instruction), switching of the operating mode of the digital camera 1 between photograph mode and reproduction mode, the instruction to send data to the partner camera 1, and the setting of photographing conditions, and notifies the control portion 20 with an operation signal. The operating portion 14 includes the release switch 5 and the button switches 7 of Fig. 1.
In addition to displaying captured still images, moving images, and live view images, the display portion 15 displays the various setting information of the digital camera 1 and the like; it is realized by a display device such as an LCD (Liquid Crystal Display) or an EL display (Electroluminescence Display). In photograph mode, the display portion 15 continuously redraws the live view image, for example every frame, to display it dynamically, and in reproduction mode it displays the captured still images or moving images.
The clock portion 16 keeps the date and time. The date and time of shooting are determined from the current time kept by the clock portion 16, and the image data of the photographed image is recorded in the recording unit 17 together with the determined shooting time.
The recording unit 17 is realized by various semiconductor memories such as a rewritable flash memory or ROM, a built-in recording medium or one connected via a data communication terminal, such as a hard disk or a memory card, together with its read/write device; recording devices suited to the purpose may be adopted or combined as appropriate. The recording unit 17 records the various camera programs that make the digital camera 1 operate and realize the various functions of the digital camera 1, as well as the data used while these programs execute. The camera programs include programs that cause the control portion 20 to function as the photography character image registration processing portion 201, the partner device data acquisition processing portion 202, the communication history temporary recording processing portion 203, the data transmission/reception processing portion 204 (comprising the transmission determination portion 241, communication history list update portion 242, image file transmission processing portion 243, and local face determination portion 244), and the face/equipment correspondence processing portion 205, all described later.
In addition, native data 171 is recorded in the recording unit 17. The native data 171 includes device data and local face data. The device data identifies this device; for example, a device ID uniquely assigned to each device when the digital camera 1 is manufactured can be used. The device data is not limited to a device ID, and any information that can identify the digital camera 1 may be used as appropriate. The local face data is face feature information extracted from image data showing the face of the local user, and is registered by user operation. In order of procedure: the face image of the local user is specified, for example by actually photographing the local user's face or by selecting the face area of the local user in image data showing the local user. In response, the face detection portion 13 first performs face detection processing on the specified face image data and extracts the face feature information. The control portion 20 then records the obtained face feature information in the recording unit 17 as the local face data.
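The registration flow for the local face data might look like the following sketch, where the dictionary layout and the `detect_features` callback standing in for the face detection portion 13 are hypothetical:

```python
def register_local_face(recording_unit, face_image_data, detect_features):
    """Run face detection on the user-specified face image and store the
    extracted features as the local face data (structure is illustrative)."""
    features = detect_features(face_image_data)
    if features is None:
        raise ValueError("no face detected in the specified image")
    recording_unit["native_data"]["local_face_data"] = features
    return features

# Native data 171 sketch: a unique device ID plus the local face data slot.
native = {"native_data": {"device_id": "CAM-0001", "local_face_data": None}}
register_local_face(native, b"face-image-bytes", lambda img: [0.5, 0.3])
```

The device ID shown here is a placeholder; the patent only requires that the device data uniquely identify the camera.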
A plurality of image files 30, i.e. image information 173, is also recorded in the recording unit 17. In the present embodiment, the image data of a photographed image, one example of content data, is processed by the image processing portion 12 and recorded in the recording unit 17 as an image file 30 in a file format with an added file header; the file header contains, for example, the shooting time and EXIF (Exchangeable Image File) information obtained at shooting, such as GPS data, a title, annotations and comments, and personal information.
Fig. 5 is a diagram showing an example of the data structure of the image file 30. As shown in Fig. 5, the image file 30 contains an image ID 31, photographic information (EXIF information) 32 such as the shooting time, the image data 33 of the photographed image, thumbnail image data 34 of the photographed image, detected face information 35, and a communication history list 36. The image ID 31 is identification information uniquely assigned to each image when its image data is generated (at shooting), in order to identify each image. Besides the image files of photographed images taken by this device, the image files 30 composing the image information 173 may also include image files obtained from another digital camera 1 by communicating with it via the wireless communication portion 19.
The detected face information 35 is the result of performing face detection on the image data 33 and contains face data 350 for each detected face. For example, when three faces are detected in the image data 33 and their face feature information is extracted, three sets of face feature information, one per face, are set as the face data 350. Specifically, in each face data 350, the face feature information extracted for that face is set in association with a face number for identifying the face. The face feature information here includes the position and size of the face in the image data 33. The face number is, for example, a serial number assigned to the faces detected in the image data 33. In an image file 30 whose image data 33 contains no detected face, no detected face information 35 is set.
The communication history list 36 is the history of transfers of this image file 30 between devices up to now (in the present embodiment, between the digital camera 1 and the other digital camera 1-2 shown in Fig. 3). In the present embodiment, communication history data 360 is generated each time the image file 30 is transferred, and the generated communication history data 360 is appended to the communication history list 36. For example, if three items of communication history data 360 are set in the communication history list 36, the image file 30 has been transferred three times between the particular digital cameras 1 indicated in each item. Conversely, when no communication history data 360 is set in the communication history list 36, the image file 30 has not been transferred between digital cameras 1.
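Under the assumption that the Fig. 5 layout maps naturally onto nested records, the image file structure could be sketched like this; field names and types are illustrative, not the patent's on-disk format:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class FaceData:                 # one entry per detected face (face data 350)
    face_number: int            # serial number within the image
    features: List[float]       # extracted face feature information
    position: tuple             # (x, y) location of the face in the image
    size: tuple                 # (width, height) of the face area

@dataclass
class CommHistoryEntry:         # one transfer of this file (Fig. 6)
    source_device: str
    target_device: str
    timestamp: str

@dataclass
class ImageFile:                # overall layout sketched from Fig. 5
    image_id: str
    exif: dict                  # shooting time, GPS data, title, etc.
    image_data: bytes
    thumbnail: bytes
    detected_faces: List[FaceData] = field(default_factory=list)
    comm_history: List[CommHistoryEntry] = field(default_factory=list)
```

An image file with no detected faces simply has an empty `detected_faces` list, mirroring the rule that no detected face information 35 is set in that case.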
Fig. 6 is a diagram showing an example of the data structure of the communication history data 360. As shown in Fig. 6, the communication history data 360 contains transmission source device data 361, transmission target device data 363, and a transfer time 365. The transmission source device data 361 holds the device data of the side that sent the image file 30 (the source digital camera 1) in the transfer for which this communication history data 360 was generated. The transmission target device data 363 holds the device data of the side that received the image file 30 (the target digital camera 1) in that transfer. The transfer time 365 holds the current time at which this communication history data 360 was generated.
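Generating and appending one such history record each time the file is transferred could be sketched as follows; the dictionary keys and device IDs are illustrative:

```python
from datetime import datetime, timezone

def make_history_entry(source_id, target_id, now=None):
    """Build one communication history record (Fig. 6): the sending
    device, the receiving device, and the time of the transfer."""
    now = now or datetime.now(timezone.utc)
    return {"source_device": source_id,
            "target_device": target_id,
            "timestamp": now.isoformat()}

history = []  # the image file's communication history list (Fig. 5)
history.append(make_history_entry("CAM-A", "CAM-B",
                                  datetime(2010, 3, 11, tzinfo=timezone.utc)))
```

The length of `history` then directly answers the question the patent poses: how many times, and between which devices, this image file has been transferred.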
Further, as shown in Fig. 4, the recording unit 17 records photography character image transmission information 175 and a face/device correspondence table 177.
The photography character image transmission information 175 is a data table for managing whether each photographed image showing a face that was taken by this device (a photographed image whose image data contains a detected face, hereinafter called a "photography character image") has been sent to external equipment (in the present embodiment, another digital camera 1). More specifically, for each photography character image, the photography character image transmission information 175 records whether that photographed image has been sent to the device (another digital camera 1) registered as held by the person corresponding to each face shown in the image.
Fig. 7 is a diagram showing an example of the data structure of the photography character image transmission information 175. As shown in Fig. 7, the photography character image transmission information 175 is a data table that associates the image ID of each photography character image taken by this device with face numbers and image-sent flags. Each face number is the number assigned to a face detected in the image data of the corresponding image ID. Each image-sent flag is flag information indicating whether the image file 30 of the corresponding image ID has been sent to the other digital camera 1 held by the person corresponding to that face number: it is set to "ON (yes)" when the file has been sent, and to "OFF (no)" when it has not. For example, three faces assigned the face numbers "01" to "03" are detected in the image data 33 of the image file 30 whose image ID 31 is "ID00011". As shown in record R11, the image-sent flag of face number "01" is set to "ON": the image file 30 with image ID 31 "ID00011" has been sent to the other digital camera 1 held by the person corresponding to the face numbered "01". In contrast, the image-sent flag of face number "03" is set to "OFF" (record R13): this image file 30 has not yet been sent to the other digital camera 1 held by the person corresponding to the face numbered "03".
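An in-memory form of this table and the flag update could be sketched as follows; the dictionary layout and helper names are assumptions, not the patent's implementation:

```python
# Hypothetical form of the photography character image transmission
# information (Fig. 7): per image ID, one ON/OFF flag per face number.
send_info = {
    "ID00011": {"01": "ON", "02": "OFF", "03": "OFF"},
}

def mark_sent(table, image_id, face_number):
    """Set the image-sent flag to ON once the file has been sent to the
    device held by the person with this face number."""
    table.setdefault(image_id, {})[face_number] = "ON"

def unsent_faces(table, image_id):
    """Face numbers whose holders have not yet received this image."""
    return sorted(n for n, flag in table.get(image_id, {}).items()
                  if flag == "OFF")

mark_sent(send_info, "ID00011", "02")
```

Keeping the flag per face, rather than per transfer, is what lets the camera answer the background-section problem: it can tell at a glance which people in a group photo still have not received it.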
The face/device correspondence table 177 is a data table that sets the correspondence between face image data, obtained by cutting out a face area from the image data of a photography character image, and the device data obtained from the other digital camera 1 registered as the device held by the person corresponding to that face.
The temporary recording portion 18 is made up of a semiconductor memory such as a RAM and serves as the work memory of the control portion 20. The temporary recording portion 18 has a storage area that temporarily holds the programs executed by the control portion 20 and the data used while those programs execute; for example, it temporarily records the communication history temporary data 181. The communication history temporary data 181 has the same data structure as the communication history data 360 shown in Fig. 6, containing transmission source device data, transmission target device data, and a transfer time. The temporary recording portion 18 also temporarily records the image data output from the imaging portion 11, such as the image data of the images (live view images) taken in from the imaging portion 11 frame by frame and the image data of the images (photographed images) taken in from the imaging portion 11 at the photographing timing.
The wireless communication portion 19 performs short-range wireless communication with the partner camera 1 and is made up of a transmission circuit and the like, which exchanges radio signals between the antenna 191 and the wireless communication portion 19 of the partner camera 1, demodulating received signals and modulating transmission signals. At startup, the wireless communication portion 19 periodically transmits a communication signal announcing its presence; upon detecting the communication signal transmitted from the wireless communication portion 19 of the partner camera 1, it recovers from the halted or standby state and establishes communication with the partner camera 1.
Here, it is assumed that the wireless communication portion 19 can realize contactless short-range wireless communication with a transmission speed of about 100 Mbps at a communication distance of about a few centimeters. With the short-range wireless communication realized by this wireless communication portion 19, all or part of the data (image files 30) recorded in the digital camera 1 can be sent to the partner camera 1 almost instantaneously. The configuration is not limited to this, however, and the wireless communication portion 19 may be formed with a communication device of any communication standard that realizes the desired communication distance and transmission speed.
The control portion 20 is realized by a CPU and the like; according to operation signals from the operating portion 14 and so on, it reads the camera programs from the recording unit 17 and executes them, issuing instructions to and transferring data between the portions composing the digital camera 1, thereby controlling the operation of the digital camera 1 as a whole. The control portion 20 includes, as appropriate, a CPU and a DMA controller for controlling DMA transfers between the portions. The control portion 20 further includes a photography character image registration processing portion 201, a partner device data acquisition processing portion 202, a communication history temporary recording processing portion 203, a data transmission/reception processing portion 204, and a face/equipment correspondence processing portion 205.
When a photographed image is generated after the photographing timing is indicated, and the face detection performed on its image data has detected a face, the photography character image registration processing portion 201 performs processing to register this photographed image as a photography character image.
The partner device data acquisition processing portion 202 performs the following processing: it notifies the partner camera 1 of a device data transmission request, and acquires from the partner camera 1 its device data (the device data recorded in the recording unit of the partner camera 1).
The communication history temporary recording processing portion 203 generates communication history data and temporarily records it in the temporary recording portion 18 as the communication history temporary data 181. In the present embodiment, the communication history data 360 set in the communication history list 36 of an image file 30 is generated by the digital camera 1 on the side that sends the image file 30; when the transmission condition described later is satisfied, the temporarily recorded communication history temporary data 181 is appended to the communication history list 36 as communication history data 360. Specifically, the communication history temporary recording processing portion 203 uses the device data of the native data 171 recorded in the recording unit 17 as the transmission source device data, and uses the device data of the partner camera 1, acquired from it by the partner device data acquisition processing portion 202, as the transmission target device data. It further acquires the current time from the clock portion 16 as the transfer time, generates communication history data from these, and records it in the temporary recording portion 18 as the communication history temporary data 181.
The data transmission/reception processing portion 204 performs processing to send the image file 30 that is the sending object to the partner camera 1 via the wireless communication portion 19, and to receive image files sent from the partner camera 1. The data transmission/reception processing portion 204 includes a transmission determination portion 241, a communication history list update portion 242, an image file transmission processing portion 243, and a local face determination portion 244.
Before the image file 30 that is the sending object is sent to the partner camera 1, the transmission determination portion 241 judges whether the communication history temporary data 181 recorded in the temporary recording portion 18 by the communication history temporary recording processing portion 203 satisfies the transmission condition described later.
When the transmission determination portion 241 judges that the transmission condition is satisfied, the communication history list update portion 242 performs processing to newly append the communication history temporary data 181 to the communication history list 36 of the image file 30 that is the sending object, thereby updating the list.
When the transmission determination unit 241 judges that the transmission condition is satisfied, the image file transmission processing unit 243 attaches a face location request flag to the image file 30 to be sent and transmits the file to the counterpart camera 1. This face location request flag is set to "ON" or "OFF" depending on whether the image file 30 to be sent is registered as a photographed-person image.
When an image file with the face location request flag "ON" attached is received from the counterpart camera 1 and the person recognition unit 121 identifies the local user's face in its image data, the own-face determination unit 244 sends face location data, in which that face location is associated with the image ID, to the counterpart camera 1.
When face location data is received from the counterpart camera 1 after the image file transmission processing unit 243 has transmitted the image file 30, the face/device correspondence processing unit 205 determines the corresponding face number in the respective image data from the received face location data. The face/device correspondence processing unit 205 then sets "ON" in the image-sent flag corresponding to the determined face number, updating the photographed-person image transmission information 175. Further, the face/device correspondence processing unit 205 registers the person corresponding to the determined face number as the owner of the counterpart camera 1 with which this communication took place and to which the image file 30 was sent. That is, the face/device correspondence processing unit 205 associates the face area (face image data) assigned the determined face number in the corresponding image data 33 with the device data of the counterpart camera 1 acquired by the counterpart device data acquisition processing unit 202, and sets this association in the face/device correspondence table 177.
Fig. 8 and Figs. 9A, 9B, and 9C are schematic diagrams illustrating communication between the digital camera 1 of the present embodiment and other digital cameras 1 (counterpart cameras 1). Here, the case of transmitting a photographed-person image to another digital camera 1 immediately after shooting in photograph mode is illustrated, but the same processing is performed when a photographed-person image is sent in reproduction mode.
For example, as shown in Fig. 8, suppose the local user holding the digital camera 1 puts it into photograph mode and takes a group photo of persons Pa, Pb, and Pc, such as friends. In this case, the photographed-person image I1 obtained by shooting persons Pa, Pb, and Pc is displayed on the display unit 15 of the digital camera 1. The local user who took this group photo then makes the digital camera 1 communicate in turn with the external devices held by persons Pa, Pb, and Pc (other digital cameras 1 in the present embodiment) and sends each of them the group photo (photographed-person image) of persons Pa, Pb, and Pc. Here, suppose that the digital camera 1 held by person Pa in Fig. 8 is the digital camera 1-2a shown in Fig. 9A, that held by person Pb is the digital camera 1-2b shown in Fig. 9B, and that held by person Pc is the digital camera 1-2c shown in Fig. 9C.
First, as shown in Fig. 9A, the local user brings the digital camera 1 close to the digital camera 1-2a held by person Pa to communicate with it. The digital camera 1 thereby sends the image file 30 of the photographed-person image to the counterpart camera 1-2a (a1). In the present embodiment, a digital camera 1 that receives an image file attempts to identify the local user's face in its image data and sends the identified face location data back to the sender. That is, when the counterpart camera 1-2a identifies its local user's face in the image data of the received image file, it sends the data of that face location to the digital camera 1 (a3). On receiving the face location data, the digital camera 1 registers the person at the received face location in the image data as the owner of the counterpart camera 1-2a with which this communication took place and to which the image file 30 was sent. At this time, that face is set as a sent face, and the photographed-person image I11, with a sent mark M11 added to the sent face, is displayed on the display unit 15. Here the sent mark is illustrated as a dotted line around the face area, but the display is not limited to this; any display may be used that makes it possible to distinguish such a face from a face for which the external device (another digital camera 1) held by the corresponding person has not yet been determined and to which the image data (image file 30) has not been sent.
Similarly, as shown in Fig. 9B, the local user brings the digital camera 1 close to the digital camera 1-2b held by person Pb to communicate with it, and the digital camera 1 sends the image file 30 of the photographed-person image to the counterpart camera 1-2b (a5). When the data of a face location is then received from the counterpart camera 1-2b (a7), the digital camera 1 registers the person at the received face location in the image data as the owner of the counterpart camera 1-2b. At this time, that face is also set as a sent face, and the photographed-person image I13, with sent marks M11 and M13 added to the sent faces, is displayed on the display unit 15.
Further, as shown in Fig. 9C, the local user brings the digital camera 1 close to the digital camera 1-2c held by person Pc to communicate with it, and the digital camera 1 sends the image file 30 of the photographed-person image to the counterpart camera 1-2c (a9). When the data of a face location is then received from the counterpart camera 1-2c (a11), the digital camera 1 registers the person at the received face location in the image data as the owner of the counterpart camera 1-2c. At this time, that face is also set as a sent face, and the photographed-person image I15, with sent marks M11, M13, and M15 added to the sent faces, is displayed on the display unit 15.
Next, the processing sequence performed by the digital camera 1 is described. Figure 10 is a flowchart showing the basic processing sequence performed by the digital camera 1. When the power is turned on, the digital camera 1 performs processing corresponding to the mode selected by user operation. That is, as shown in Fig. 10, when the currently selected mode is photograph mode (step b1: Yes), the flow proceeds to photograph mode processing (step b3); when photograph mode processing ends, the flow proceeds to step b9. When the current mode is not photograph mode (step b1: No) but is reproduction mode (step b5: Yes), the flow proceeds to reproduction mode processing (step b7); when reproduction mode processing ends, the flow proceeds to step b9. In step b9, it is judged whether the basic processing is to end. For example, when power-off is instructed by operation of the switch 7, the present processing ends (step b9: Yes). Otherwise (step b9: No), the flow returns to step b1.
The photograph mode processing of step b3 and the reproduction mode processing of step b7 are now described in turn. First, photograph mode processing is described. Figure 11 is a flowchart showing the detailed processing sequence of the photograph mode processing.
As shown in Fig. 11, in photograph mode processing, the control unit 20 first starts (turns on) the wireless communication unit 19 (step c1). The control unit 20 then starts the imaging unit 11 to capture images and displays the captured images on the display unit 15 as a live view image (step c2). Through this processing, the subject image formed on the imaging element is captured and the live view image is updated frame by frame.
Next, when the release switch 5 is pressed to instruct shooting (step c3: Yes), photograph processing is performed and the image data of the photographed image is generated (step c5). The face detection unit 13 then performs face detection, detecting face areas in the image data of the photographed image and detecting each facial part within a face area (step c6). When a face is detected in the image data and its facial feature information is extracted by this processing, the control unit 20 assigns a face number to the detected face and generates detected face information. The control unit 20 then assigns an image ID to the generated photographed image, associates it with the processed image data, its thumbnail image data, the detected face information, the current time, and the like, and records these in the recording unit 17 as an image file 30 (step c7). No communication history data 360 is set at this point in the communication history list 36 of the recorded image file 30 (the list is empty). When a face was detected in the image data of the photographed image in step c6 (step c8: Yes), the photographed-person image registration processing unit 201 performs processing to register the photographed image as a photographed-person image (step c9). Specifically, the photographed-person image registration processing unit 201 adds to the photographed-person image information a record that associates the image ID of the generated photographed image and the face number assigned to each detected face with an image-sent flag set to "OFF".
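The shooting-time bookkeeping above (steps c5 to c9) can be sketched as follows. This is a minimal Python illustration, not the device's firmware; every name (`detect_faces`, `shoot_and_record`, the dict-based image file) is a hypothetical stand-in for the units and data structures described.

```python
def detect_faces(image_data):
    """Stand-in for face detection unit 13: returns face-area rectangles.

    Here two fixed faces are faked for illustration.
    """
    return [(10, 10, 40, 40), (60, 12, 90, 44)]

def shoot_and_record(image_id, image_data, now, recording_unit, person_image_info):
    """Record a new image file; register it as a photographed-person image
    when at least one face is detected (image-sent flags start as 'OFF')."""
    faces = detect_faces(image_data)
    # assign face numbers 1, 2, ... to each detected face (step c6)
    detected_face_info = {face_no: area for face_no, area in enumerate(faces, start=1)}
    image_file = {
        "image_id": image_id,
        "image_data": image_data,
        "detected_face_info": detected_face_info,
        "shot_time": now,
        "communication_history_list": [],  # empty until a transfer succeeds
    }
    recording_unit[image_id] = image_file  # step c7
    if detected_face_info:
        # step c9: one send flag per detected face, initially "OFF"
        person_image_info[image_id] = {no: "OFF" for no in detected_face_info}
    return image_file

recording_unit, person_image_info = {}, {}
f = shoot_and_record("IMG0001", b"...", "2012-07-13T10:00", recording_unit, person_image_info)
```

The empty communication history list mirrors the note above that no communication history data 360 is set at recording time.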
In the following step c10, a transmission instruction from the user is judged; when a transmission instruction has been input, for example, the image file 30 of the latest photographed image generated in the photograph processing of step c5 is sent, as the file to be sent, to a counterpart camera 1 with which communication has been established.
That is, when a transmission instruction has been input (step c10: Yes), the establishment of communication with the counterpart camera 1 is monitored in the following step c11. At this time, the wireless communication unit 19 starts transmitting radio waves, and when the wireless communication unit of a counterpart camera 1 receives them, communication with that counterpart camera 1 is established.
When communication is still not established, for example after a preset fixed time has elapsed (step c11: No), the flow proceeds to step c13, where the control unit 20 performs warning notification processing, and then proceeds to step c25. In step c13, for example, the control unit 20 displays a message on the display unit 15 indicating that communication could not be established, as the warning notification processing. The warning notification may instead be performed by outputting a warning sound from a speaker (not shown), or by a combination of these processes.
On the other hand, when communication with the counterpart camera 1 has been established (step c11: Yes), the flow proceeds to communication history temporary recording processing (step c15). Figure 12 is a flowchart showing the detailed processing sequence of the communication history temporary recording processing.
As shown in Fig. 12, in the communication history temporary recording processing, the counterpart device data acquisition processing unit 202 first notifies the counterpart camera 1 of a device data transmission request via the wireless communication unit 19 (step d1). At this time, the counterpart device data acquisition processing unit 202 sends the device data transmission request to the counterpart camera 1 together with the device data of the own-device data 171 recorded in the recording unit 17.
The counterpart device data acquisition processing unit 202 then enters a standby state until it receives device data from the counterpart camera 1 via the wireless communication unit 19 (step d3). When reception succeeds (step d5: Yes), the flow proceeds to step d7, where the communication history temporary recording processing unit 203 obtains the current time from the clock unit 16 (step d7). The communication history temporary recording processing unit 203 then associates the device data of the own-device data 171 recorded in the recording unit 17, the device data of the counterpart camera 1 obtained in step d3, and the current time obtained in step d7 with one another to generate communication history data, and temporarily records it in the temporary recording unit 18 as communication history temporary data 181 (step d9). The communication history temporary recording processing unit 203 further sets the communication history generation flag to "ON (success)" (step d11). The flow then returns to step c15 of Fig. 11 and proceeds to step c17.
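A minimal sketch of the success path of this temporary recording (steps d7 to d11), assuming dict-based records; the field names mirror the description (source device, target device, transmission/reception time) and are illustrative, not an actual on-device format.

```python
def record_history_temp(own_device_id, counterpart_device_id, now, temp_recording_unit):
    """Build communication history data and hold it in the temporary recording unit."""
    entry = {
        "source_device": own_device_id,          # transmission source device data
        "target_device": counterpart_device_id,  # transmission target device data
        "txrx_time": now,                        # transmission/reception time
    }
    temp_recording_unit["communication_history_temp"] = entry  # step d9
    history_generated_flag = "ON"  # step d11: success
    return history_generated_flag

temp = {}
flag = record_history_temp("CAM-001", "CAM-2A", "2012-07-13T10:05", temp)
```

On the failure path (step d15) the flag would instead be set to "OFF" and no entry recorded.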
On the other hand, when reception fails in step d5 of Fig. 12 (step d5: No), the communication history temporary recording processing unit 203 performs warning notification processing (step d13). For example, the communication history temporary recording processing unit 203 displays a message on the display unit 15 indicating that the device data could not be received from the counterpart camera 1 and no communication history data was generated. Reception can fail, for example, when the distance between the own digital camera 1 and the counterpart camera 1 becomes too large partway through and the communication is disconnected. A configuration that outputs a warning sound, or one that performs these processes in combination, may also be used. The communication history temporary recording processing unit 203 then sets the communication history generation flag to "OFF (failure)" (step d15). The flow then returns to step c15 of Fig. 11 and proceeds to step c17.
Next, in step c17, the data transmission/reception processing unit 204 judges whether communication history data was generated in the communication history temporary recording processing of step c15. For example, when the communication history generation flag is "OFF", the data transmission/reception processing unit 204 judges that no communication history data was generated (step c17: No) and proceeds to step c25 without performing data transmission processing (the image file 30 of the latest photographed image generated in the photograph processing of step c5 is not sent to the counterpart camera 1).
On the other hand, when the communication history generation flag is "ON", the data transmission/reception processing unit 204 judges that communication history data was generated (step c17: Yes) and performs data transmission processing (step c19). Figure 13 is a flowchart showing the detailed processing sequence of the data transmission processing.
As shown in Fig. 13, the data transmission processing first proceeds to transmission determination processing (step e1). Figure 14 is a flowchart showing the detailed processing sequence of the transmission determination processing.
As shown in Fig. 14, in the transmission determination processing, the transmission determination unit 241 first refers to the communication history list 36 of the image file 30 to be sent. The transmission determination unit 241 then searches the set communication history data 360 for the transmission target device data of the communication history temporary data 181 (that is, the device data obtained from the counterpart camera 1 of this communication in step d3 of Fig. 12) (step f1).
The transmission determination unit 241 then determines from this search result whether the transmission condition is satisfied. That is, when the transmission target device data of the communication history temporary data 181 matches the transmission source device data 361 or the transmission target device data 363 of any communication history data 360, and a preset specified period (for example, "6 months") has not yet elapsed since the transmission/reception time 365 of that communication history data 360, the transmission determination unit 241 judges that the transmission condition is not satisfied. When the specified period has elapsed since the transmission/reception time 365 of the matching communication history data 360, the transmission condition is judged to be satisfied. The transmission condition is also judged to be satisfied when the transmission target device data of the communication history temporary data 181 matches neither the transmission source device data 361 nor the transmission target device data 363 of any communication history data 360.
The reason is as follows: when the transmission target device data of the communication history temporary data 181 (the counterpart camera 1 that is the transmission target of this communication) was once the transmission source or the transmission target of this image file 30 in the past, the image file 30 can be assumed to be recorded in the recording unit of that counterpart camera 1; on the other hand, the user of the counterpart camera 1 may also have deleted the image file 30 since then. Therefore, in the present embodiment, when a certain time (the specified period) has elapsed since the image file 30 was exchanged, it is sent again.
Specifically, as shown in Fig. 14, when the transmission target device data of the communication history temporary data 181 matches the transmission source device data 361 of any set communication history data 360 (step f3: Yes), the transmission determination unit 241 reads the transmission/reception time 365 of that communication history data 360 (step f5). The transmission determination unit 241 then compares the read transmission/reception time 365 with the transmission/reception time (current time) of the communication history temporary data 181 and determines whether the specified period has elapsed. If the specified period has not elapsed (step f7: No), the flow proceeds to step f15.
On the other hand, when the transmission target device data of the communication history temporary data 181 does not match the transmission source device data 361 of any set communication history data 360 (step f3: No), or when the specified period has elapsed since the transmission/reception time 365 read in step f5 (step f7: Yes), the transmission determination unit 241 judges whether the transmission target device data of the communication history temporary data 181 matches the transmission target device data 363 of any set communication history data 360. When they match (step f9: Yes), it likewise reads the transmission/reception time 365 of that communication history data 360 (step f11), compares it with the transmission/reception time (current time) of the communication history temporary data 181, and determines whether the specified period has elapsed. If the specified period has not elapsed (step f13: No), the flow proceeds to step f15.
Then, in step f15, the transmission determination unit 241 judges that the transmission condition is not satisfied, and warning notification processing is performed (step f17). For example, the transmission determination unit 241 displays a message on the display unit 15 indicating that the image file 30 to be sent is already recorded in the counterpart camera 1. A configuration that outputs a warning sound, or one that performs these processes in combination, may also be used. At this time, the communication history temporary data 181 recorded in the temporary recording unit 18 is deleted. The flow then returns to step e1 of Fig. 13 and proceeds to step e3.
On the other hand, when the transmission target device data of the communication history temporary data 181 does not match the transmission target device data 363 of any set communication history data 360 (step f9: No), or when the specified period has elapsed since the transmission/reception time 365 read in step f11 (step f13: Yes), the transmission determination unit 241 judges that the transmission condition is satisfied (step f19). The communication history list update unit 242 then appends the communication history temporary data 181 temporarily recorded in the temporary recording unit 18 to the communication history list 36, updating the image file 30 to be sent (step f21). At this time, the communication history temporary data 181 in the temporary recording unit 18 is deleted. The flow then returns to step e1 of Fig. 13 and proceeds to step e3.
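The resend rule of steps f1 to f19 can be summarized in a short sketch: transmission is allowed unless the counterpart already appears in the file's history (as source or target) with an exchange more recent than the specified period. Times are plain numbers (e.g., epoch seconds) and the six-month constant is one possible value of the specified period, as the text's example suggests; both are illustrative assumptions.

```python
SPECIFIED_PERIOD = 6 * 30 * 24 * 3600  # e.g. "6 months", expressed in seconds

def transmission_allowed(temp_entry, history_list, period=SPECIFIED_PERIOD):
    """Return True when the transmission condition is satisfied."""
    target = temp_entry["target_device"]
    now = temp_entry["txrx_time"]
    for rec in history_list:
        # steps f3/f9: counterpart matches a past source or target device
        if target in (rec["source_device"], rec["target_device"]):
            # steps f7/f13: recent exchange -> condition not satisfied
            if now - rec["txrx_time"] < period:
                return False
    return True  # step f19: no match, or only stale matches

history = [{"source_device": "CAM-2A", "target_device": "CAM-001",
            "txrx_time": 1000.0}]
temp_recent = {"target_device": "CAM-2A", "txrx_time": 2000.0}
temp_old = {"target_device": "CAM-2A", "txrx_time": 1000.0 + SPECIFIED_PERIOD + 1}
temp_new_peer = {"target_device": "CAM-2B", "txrx_time": 2000.0}
```

`temp_recent` is refused (the counterpart recently received the file), while `temp_old` and `temp_new_peer` pass, matching the three cases described above.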
Then, in step e3, the data transmission/reception processing unit 204 determines from the result of the transmission determination processing of step e1 whether to send the image file 30. That is, when the transmission condition was judged not to be satisfied in the transmission determination processing, the data transmission/reception processing unit 204 decides not to send the image file 30 (step e3: No), returns to step c19 of Fig. 11, and proceeds to step c25.
On the other hand, when the transmission condition was judged to be satisfied, the data transmission/reception processing unit 204 decides to send the image file 30 (step e3: Yes), and the image file transmission processing unit 243 then judges whether the image file 30 is registered as a photographed-person image. Specifically, the image file transmission processing unit 243 judges that it is registered when the image ID 31 of the image file 30 to be sent is set in the photographed-person image transmission information 175. When it is registered (step e4: Yes), the image file transmission processing unit 243 attaches the face location request flag "ON", which requests transmission of face location data, to the image file 30 to be sent and sends it to the counterpart camera 1 (step e5). When it is not registered (step e4: No), the image file transmission processing unit 243 attaches the face location request flag "OFF", which does not request transmission of face location data, to the image file 30 to be sent and sends it to the counterpart camera 1 (step e6). The data transmission/reception processing unit 204 then enters a standby state until a reception end notification is received.
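Steps e4 to e6 reduce to a single lookup: the flag is "ON" only when the outgoing file is registered as a photographed-person image. The sketch below assumes the same dict-shaped structures as the earlier illustrations; `prepare_send_payload` is a hypothetical name.

```python
def prepare_send_payload(image_file, person_image_info):
    """Attach the face location request flag to the file to be sent
    (steps e4-e6): "ON" when registered as a photographed-person image."""
    registered = image_file["image_id"] in person_image_info
    return {
        "image_file": image_file,
        "face_location_request_flag": "ON" if registered else "OFF",
    }

info = {"IMG0001": {1: "OFF"}}  # IMG0001 is registered; IMG0002 is not
p1 = prepare_send_payload({"image_id": "IMG0001"}, info)
p2 = prepare_send_payload({"image_id": "IMG0002"}, info)
```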
When, for example, a preset fixed time elapses without a reception end notification being received from the counterpart camera 1 (step e7: No), the data transmission/reception processing unit 204 judges that transmission of the image file 30 has failed and performs warning notification processing (step e9). As with reception failure, transmission can fail, for example, when the distance between the own digital camera 1 and the counterpart camera 1 becomes too large partway through and the communication is disconnected. The data transmission/reception processing unit 204 displays a message on the display unit 15 indicating that transmission of the image file 30 to the counterpart camera 1 has failed. A configuration that outputs a warning sound, or one that performs these processes in combination, may also be used.
In addition, the communication history list update unit 242 deletes the communication history data 360 added this time in step f21 of Fig. 14 from the communication history list 36 of the image file 30 to be sent, updating the image file 30 (step e11). This is because the image file 30 was not actually delivered to the counterpart camera 1, so transmission processing should be performed for this image file 30 the next time a transmission instruction to the same counterpart camera 1 is input. The flow then returns to step c19 of Fig. 11 and proceeds to step c25.
On the other hand, when the reception end notification is received from the counterpart camera 1 (step e7: Yes), the unit further waits to receive face location data from the counterpart camera 1. When, for example, a preset fixed time elapses without face location data being received from the counterpart camera 1 (step e13: No), the flow returns to step c19 of Fig. 11 and proceeds to step c25.
Then, when face location data is received (step e13: Yes), the flow proceeds to face/device correspondence processing (step e15). The received face location data carries the position information of the face that the counterpart camera 1 identified as its local user's face (that is, the face of the user of the counterpart camera 1) in the image data 33 of the image file 30 transmitted in step e5. Figure 15 is a flowchart showing the detailed processing sequence of the face/device correspondence processing.
As shown in Fig. 15, in the face/device correspondence processing, the face/device correspondence processing unit 205 first refers to the image file 30 corresponding to the image ID set in the received face location data, selects from the face data 350 of the detected face information 35 the face data 350 of the corresponding face location, and determines the face number (step g1). Specifically, the face/device correspondence processing unit 205 compares the received face location data with the face locations of the face data 350 set in the detected face information 35, selects the face data 350 whose face location matches, and determines its face number. The face/device correspondence processing unit 205 then sets the image-sent flag corresponding to the image ID of the received face location data and the determined face number to "ON", updating the photographed-person image transmission information 175 (step g3).
In addition, the face/device correspondence processing unit 205 registers the person corresponding to the determined face number as the owner of the counterpart camera 1 with which this communication took place and to which the image file 30 was sent (step g5). Specifically, under the control of the face/device correspondence processing unit 205, the image processing unit 12 cuts out, from the image data 33 corresponding to the image ID 31, the face area of the face number determined in step g1. The face/device correspondence processing unit 205 then associates the cut-out image data (face image data) with the device data received from the counterpart camera 1 of this communication in step d3 of Fig. 12, and sets this association in the face/device correspondence table 177.
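Steps g1 to g5 can be sketched as follows: match the received face location against the file's detected face information, flip that face's send flag to "ON", and map the face to the counterpart's device data. Exact-match comparison of locations is an assumption made for brevity, and the names are illustrative.

```python
def correspond_face_to_device(face_location, image_file, person_image_info,
                              device_id, face_device_table):
    """Determine the face number for a received face location (step g1),
    set its image-sent flag to 'ON' (step g3), and record the face-to-device
    association (step g5). Returns the matched face number, or None."""
    for face_no, area in image_file["detected_face_info"].items():
        if area == face_location:
            person_image_info[image_file["image_id"]][face_no] = "ON"
            face_device_table[(image_file["image_id"], face_no)] = device_id
            return face_no
    return None

image_file = {"image_id": "IMG0001",
              "detected_face_info": {1: (10, 10, 40, 40), 2: (60, 12, 90, 44)}}
info = {"IMG0001": {1: "OFF", 2: "OFF"}}
table = {}
matched = correspond_face_to_device((60, 12, 90, 44), image_file, info, "CAM-2A", table)
```

The table entry plays the role of the face/device correspondence table 177: the person at that face area is now recorded as the owner of device "CAM-2A".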
Then, the control unit 20 performs processing to add a sent mark to each sent face in the image data 33 and display it for confirmation (step g7). Through this processing, the control unit 20, acting as a display processing unit, performs display processing that makes sent faces and unsent faces in the image data distinguishable. Specifically, the control unit 20 first reads, from the corresponding face data 350, the face locations of the face numbers whose image-sent flags are set to "ON" in the photographed-person image transmission information 175 for the image corresponding to the image ID of the received face location data, that is, the face number whose image-sent flag was set to "ON" in step g3 and any face numbers whose image-sent flags were already "ON" for that image ID. Then, the face image processing unit 123 synthesizes a sent mark at a position near each face area according to the read face locations, and the control unit 20 displays the synthesized image data on the display unit 15. Although the sent mark is synthesized into the image here, the sent mark may instead be displayed superimposed at a position near the corresponding face area (OSD: On Screen Display).
In Fig. 11, when no shooting instruction is input in step c3 (step c3: No), the control unit 20 determines whether data can be received from a counterpart camera 1. For example, when communication with a counterpart camera 1 has been established, the control unit 20 judges that reception is possible. If reception is not possible (step c21: No), the flow proceeds to step c25. When it is judged that reception is possible (step c21: Yes), the data transmission/reception processing unit 204 performs data reception processing (step c23). Figure 16 is a flowchart showing the detailed processing sequence of the data reception processing.
As shown in Fig. 16, the data reception processing first enters a standby state until a device data transmission request is received from the counterpart camera 1 with which communication has been established. Then, for example, when a device data transmission request is received from the counterpart camera 1 before a preset fixed time elapses (step h1: Yes), the data transmission/reception processing unit 204 sends the device data (the device ID of the own device) of the own-device data 171 recorded in the recording unit 17 to the counterpart camera 1 via the wireless communication unit 19 (step h3). When this transmission succeeds (step h5: Yes), the data transmission/reception processing unit 204 goes on to receive the image file from the counterpart camera 1 (step h7), and when reception succeeds (step h9: Yes), it sends a reception end notification to the counterpart camera 1 (step h11).
The own-face determination unit 244 then judges the face location request flag of each received image file. If there is no image file with the face location request flag "ON" attached (step h13: No), the flow proceeds to step h27.
On the other hand, when there is an image file with the face location request flag "ON" attached (step h13: Yes), the own-face determination unit 244 sets each image file with the face location request flag "ON" as a processing target (step h15). The processing of loop A (steps h17 to h25) is then performed for each image file that is a processing target. By this processing, when multiple image files with the face location request flag "ON" were received in step h7, the processing of loop A is performed for each of them.
That is, in loop A, the person recognition unit 121 first reads the detected face information from the image file being processed and performs person recognition processing by checking the face data set in the read detected face information against the own-face data of the own-device data 171 (step h19). When multiple face data are set in the detected face information, each face data is checked against the own-face data in turn. If face data with a high similarity to the own-face data exists, the person recognition unit 121 identifies its face area as the local user's face and outputs its face number as the recognition result.
Then, when the person recognition unit 121 has identified the local user's face (step h21: Yes), the own-face determination unit 244 reads the image ID from the image file being processed and sends the face location data corresponding to the face number to the counterpart camera 1 (step h23). When the processing of loop A has been performed for all image files to be processed, the flow proceeds to step h27.
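The receiving side's loop A (steps h17 to h25) can be sketched as below: compare each detected face's feature data against the own-face data and reply with the identifier of any sufficiently similar face. The `similarity` function is a toy stand-in for the real feature matcher, and the 0.8 threshold is an arbitrary illustrative value.

```python
def similarity(a, b):
    """Toy feature comparison: fraction of matching feature values."""
    hits = sum(1 for x, y in zip(a, b) if x == y)
    return hits / max(len(a), 1)

def find_own_face(image_file, own_face_features, threshold=0.8):
    """Step h19/h21: return (image_id, face_no) of the local user's face,
    or None when no detected face is similar enough to the own-face data."""
    for face_no, features in image_file["detected_face_info"].items():
        if similarity(features, own_face_features) >= threshold:
            return (image_file["image_id"], face_no)
    return None

# Detected face info here holds per-face feature vectors rather than areas,
# purely for illustration of the matching step.
received = {"image_id": "IMG0001",
            "detected_face_info": {1: [0.1, 0.9, 0.3], 2: [0.5, 0.5, 0.2]}}
reply = find_own_face(received, [0.1, 0.9, 0.3])
```

A non-`None` reply corresponds to step h23, where the face location data paired with the image ID is sent back to the counterpart camera.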
Then, in step h27, the data transmission/reception processing unit 204 records the image file received in step h7 in the recording unit 17 as an image file 30 and appends it to the image information 173 (step h27). The flow then returns to step c23 of Fig. 11 and proceeds to step c25.
On the other hand, when no device data transmission request is received in step h1 of Fig. 16 (step h1: No), when transmission fails in step h5 (step h5: No), or when reception fails in step h9 (step h9: No), the flow proceeds to step h29, where the data transmission/reception processing unit 204 performs warning notification processing. For example, the data transmission/reception processing unit 204 displays a message on the display unit 15 indicating that reception of the image file 30 from the counterpart camera 1 has failed. A configuration that outputs a warning sound, or one that performs these processes in combination, may also be used. The flow then returns to step c23 of Fig. 11 and proceeds to step c25.
Then, in step c25 of Fig. 11, the control unit 20 judges whether photograph mode is to end; when it is to end (step c25: Yes), the flow proceeds to step c27. When photograph mode is not to end (step c25: No), the flow returns to step c1.
In step c27, the control unit 20 stops (turns off) the wireless communication unit 19. If the wireless communication unit 19 is not communicating at this time, no particular processing is needed. Photograph mode processing then ends, and the flow returns to step b3 of Fig. 10 and proceeds to step b9.
Next, the reproduction mode processing is described. Figure 17 is a flowchart showing the detailed processing sequence of the reproduction mode processing. As shown in Figure 17, in the reproduction mode processing the control portion 20 first starts (turns on) the wireless communication portion 19 (step i1). The control portion 20 then selects the image to be reproduced (the reproduced image) from the images recorded in the recording unit 17 as image files 30 (step i2). The image selection performed here may, for example, be a structure that selects the images recorded in the recording unit 17 one at a time in recording order, or a structure that reads the thumbnail image data of multiple images, displays them as a list, and selects one image from the list according to a user operation. The process then moves to the image reproduction processing (step i3). Figure 18 is a flowchart showing the detailed processing sequence of the image reproduction processing.
As shown in Figure 18, in the image reproduction processing the control portion 20 first judges whether the reproduced image selected in step i2 of Figure 17 is a photograph registered as a photography character image. Specifically, when the image ID 31 of the selected image file 30 is set in the photography character image transmission information 175, the control portion 20 judges that it is registered. If it is not registered (step j1: no), the process proceeds to step j9. On the other hand, if it is registered (step j1: yes), the control portion 20 determines whether any face number has its corresponding image transmission flag set to "ON". If no face number has an image transmission flag of "ON" (step j3: no), the process proceeds to step j9.
In step j9, the control portion 20 reads the image data 33 of the reproduced image and reproduces (displays) it on the display portion 15. The process then returns to step i3 of Figure 17 and proceeds to step i5. Note that the image data reproduced here includes not only the data photographed and recorded in the recording unit 17 by the device itself, but also the image data of image files 30 received from the counterpart camera 1 in the data reception processing of Figure 16.
On the other hand, when a face number with an image transmission flag of "ON" exists (step j3: yes), the control portion 20 refers to the detected face information 35 and, for each face number whose image transmission flag is "ON", reads the face location from the corresponding face data 350 (step j5). According to the read face location, the control portion 20 then performs processing to add a "sent" mark to the sent face in the image data 33 and to reproduce and display the result (step j7). Specifically, the face image processing portion 123 first synthesizes a sent mark at a position near the face area, according to the face location read by the control portion 20 in step j5. The control portion 20 then displays the synthesized image data on the display portion 15. The process then returns to step i3 of Figure 17 and proceeds to step i5.
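The mark synthesis of steps j5 to j7 amounts to: for every face number whose image transmission flag is "ON", look up the face location and place a "sent" mark near the face area. A minimal sketch follows; the record layout and the margin value are invented for illustration and are not from the patent.

```python
def sent_mark_positions(face_data, margin=4):
    """Return one mark position per face whose image transmission flag is ON.
    The mark is placed just above the top-left corner of the face rectangle,
    approximating 'a position near the face area'."""
    positions = []
    for face in face_data:  # {'number': n, 'sent': bool, 'location': (x, y, w, h)}
        if face['sent']:
            x, y, w, h = face['location']
            positions.append((face['number'], (x, y - margin)))
    return positions
```

A renderer would then draw the mark glyph at each returned position before handing the frame to the display portion.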
In step i5, a switching instruction for the reproduced image is accepted; when a switching instruction is input (step i5: yes), the control portion 20 returns to step i2, selects a reproduced image again, and performs the image reproduction processing (step i3).
In this reproduction mode processing, a transmission instruction from the user is judged in step i7; when a transmission instruction is input, for example, the image file 30 of an image recorded in the recording unit 17 is transmitted to the counterpart camera 1 with which communication has been established. When no transmission instruction is input (step i7: no), the process proceeds to step i25.
When a transmission instruction is input (step i7: yes), the control portion 20 first judges which transmission instruction was input. The digital camera 1 of the present embodiment is configured so that the transmission instructions available in the reproduction mode include, for example, an instruction to send all images recorded in the recording unit 17 at once, an instruction to send the images recorded in the recording unit 17 one by one, and an instruction to send multiple images recorded in the recording unit 17 that were selected by a user operation. The control portion 20 selects the image files 30 to be sent according to the input transmission instruction (step i9).
In the following step i11, establishment of communication with the counterpart camera 1 is monitored. At this time, the wireless communication portion 19 starts transmitting radio waves, and when the wireless communication portion of the counterpart camera 1 receives them, a connection is established with the counterpart camera 1.
If communication with the counterpart camera 1 has not been established by the time a preset period has elapsed (step i11: no), the control portion 20 performs warning notification processing (step i13) and then proceeds to step i25. For example, as the warning notification processing, the control portion 20 displays on the display portion 15 a message indicating that communication has not been established. A structure that outputs a warning tone, or one that combines these processes, may also be adopted.
On the other hand, when communication with the counterpart camera 1 has been established (step i11: yes), the process moves to the communication history temporary recording processing (step i15). This communication history temporary recording processing is performed in the same manner as the processing sequence shown in Figure 12.
In the following step i17, the data transmission/reception processing portion 204 judges whether communication history data was generated in the communication history temporary recording processing of step i15. Specifically, as in step c17 of Figure 11, if the communication history flag set as a result of step i15 is "OFF", the data transmission/reception processing portion 204 judges that no communication history data was generated (step i17: no). In that case, the process proceeds to step i25 without performing the data transmission processing (the image files 30 selected in step i9 as transmission targets according to the transmission instruction of step i7 are not sent to the counterpart camera 1).
When the communication history generation flag is "ON", the data transmission/reception processing portion 204 judges that communication history data was generated (step i17: yes) and then judges the number of image files 30 selected as transmission targets in step i9. If only one image file 30 was selected in step i9 (when transmission of a single image was instructed in step i7) (step i19: no), the data transmission/reception processing portion 204 performs the data transmission processing (step i21). This data transmission processing is performed in the same manner as the processing sequence shown in Figure 13.
On the other hand, when multiple image files 30 are transmission targets (when batch transmission was instructed in step i7, or when multiple images were selected for transmission) (step i19: yes), the data transmission/reception processing portion 204 performs the unified data transmission processing (step i23). Figure 19 is a flowchart showing the detailed processing sequence of the unified data transmission processing.
As shown in Figure 19, in the unified data transmission processing the multiple image files 30 to be sent are first processed one after another in loop B (steps k1 to k5). That is, the transmission determination processing (step k3) is performed for each image file 30 to be sent. This transmission determination processing is performed in the same manner as the processing sequence shown in Figure 14. Once loop B (i.e., the transmission determination processing) has been performed for all image files 30 to be sent, the process proceeds to step k7.
In step k7, the data transmission/reception processing portion 204 determines, according to the results of the transmission determination processing of steps k1 to k5 for all target image files 30, whether any image file 30 is to be sent. That is, when none of the multiple target image files 30 was judged to satisfy the transmission condition in steps k1 to k5, the data transmission/reception processing portion 204 judges that there is no image file 30 to send (step k7: no). The process then returns to step i23 of Figure 17 and proceeds to step i25.
On the other hand, if any image file 30 satisfies the transmission condition, the data transmission/reception processing portion 204 judges that an image file 30 to send exists (step k7: yes). In that case, the image file transmission processing portion 243 attaches a face location request flag "ON" to those image files 30 to be sent that are registered as photography character images, i.e., those whose image ID 31 is set in the photography character image transmission information 175 (step k8). It attaches a face location request flag "OFF" to those image files 30 to be sent that are not registered as photography character images, i.e., those whose image ID 31 is not set in the photography character image transmission information 175 (step k9). The image file transmission processing portion 243 then sends the image files 30 to be sent to the counterpart camera 1 at once via the wireless communication portion 19 (step k10). The data transmission/reception processing portion 204 then stands by until a reception end notification is received.
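Steps k8 to k10 tag every outgoing file with a face location request flag before the unified send. A sketch of that tagging step follows; the dictionary layout of a file record is a hypothetical stand-in for the image file 30 structure.

```python
def prepare_unified_send(files_to_send, portrait_image_ids):
    """Attach a face location request flag to each outgoing image file:
    'ON' when the file's image ID is registered in the photography character
    image transmission information (step k8), 'OFF' otherwise (step k9)."""
    tagged = []
    for f in files_to_send:  # f: {'image_id': ...}
        flag = 'ON' if f['image_id'] in portrait_image_ids else 'OFF'
        tagged.append({**f, 'face_location_request': flag})
    return tagged
```

The receiving camera would then run its person recognition only on files whose flag is "ON", which is what makes the flag worth attaching before the batch send.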
If no reception end notification is received from the counterpart camera 1 by the time a preset period has elapsed (step k11: no), the data transmission/reception processing portion 204 judges that transmission of the image files 30 has failed and performs warning notification processing (step k13). For example, as in step e9 of Figure 13, the data transmission/reception processing portion 204 displays on the display portion 15 a message indicating that transmission of the image files 30 to the counterpart camera 1 has failed. A structure that outputs a warning tone, or one that combines these processes, may also be adopted.
In addition, the communication history list update portion 242 deletes the communication history data 360 added this time in the transmission determination processing of step k3 from the communication history list 36 of each image file 30 that was a transmission target, and updates the image files 30 (step k15). When communication history data 360 was added to multiple image files 30, the data added this time is deleted from each of them. The process then returns to step i23 of Figure 17 and proceeds to step i25.
On the other hand, when the reception end notification is received from the counterpart camera 1 (step k11: yes), reception of face location data from the counterpart camera 1 is further awaited. If no face location data is received from the counterpart camera 1 by the time a preset period has elapsed (step k17: no), the process returns to step i23 of Figure 17 and proceeds to step i25.
When face location data is received (step k17: yes), loop C (steps k19 to k23) is performed for each received item of face location data. That is, the face/equipment alignment processing (step k21) is performed for each item of face location data. This face/equipment alignment processing is performed in the same manner as the processing sequence shown in Figure 15. Once loop C (i.e., the face/equipment alignment processing) has been performed for all received face location data, the process returns to step i23 of Figure 17 and proceeds to step i25.
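Loop C associates each received item of face location data with the counterpart device. The face/equipment alignment processing itself is detailed in Figure 15 (not reproduced in this excerpt); the sketch below only shows the kind of table update one might expect, with the face/device correspondence table 177 modeled as a plain dictionary. This is an assumption-laden illustration, not the patent's data structure.

```python
def update_face_device_table(table, device_data, face_locations):
    """Associate each received face location with the counterpart device's
    device data, as a toy stand-in for the face/device correspondence
    table 177 updated by loop C."""
    for location in face_locations:
        table.setdefault(device_data, []).append(location)
    return table
```

Registering the association this way is what later allows the sender to treat the person at that face location as the owner of the counterpart camera.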
In step i25, the control portion 20 judges whether the reproduction mode has ended; when it has ended (step i25: yes), the process proceeds to step i27. When the reproduction mode has not ended (step i25: no), the process returns to step i5.
In step i27, the control portion 20 puts the wireless communication portion 19 into a halted state (off). If no communication by the wireless communication portion 19 is in progress at this time, no particular processing is performed. The reproduction mode processing then ends, the process returns to step b7 of Figure 10, and proceeds to step b9.
As described above, according to the present embodiment, the device that sent an image file 30 of a photography character image showing a person's face to the counterpart camera 1 can, upon receiving face location data from the counterpart camera 1, determine the corresponding face in the sent photography character image from the received face location data, and can register the person corresponding to that face as the owner of the counterpart camera 1. Here, the face location data contains the face location at which the counterpart camera 1 that received the image file 30 recognized its own local user's face (i.e., the face of the user of the counterpart camera 1) in the image data.
In addition, when the image data showing a photography character image is displayed for confirmation or reproduced, processing can be performed that synthesizes a "sent" mark, indicating that transmission has taken place, at a position near each sent face (that is, each face in the image data 33 for which this image data, in practice the image file 30, has been sent to the external equipment (another digital camera 1) held by the person corresponding to that face), and the result can be displayed on the display portion 15.
On the other hand, when an image file attached with a face location request flag "ON" is received from the counterpart camera 1 and the person recognition portion 121 recognizes the local user's face in its image data, face location data in which that face location is associated with the image ID can be sent to the counterpart camera 1.
Accordingly, the user can confirm whether this image data has been sent to the other camera 1 held by the person whose face appears in the image data. This has the effect of reducing the waste of repeatedly sending data of identical content to the same external equipment.
In the above embodiment, a case was described in which, for faces shown in a photography character image, a "sent" mark is attached near the position of each sent face (a face whose image data has been sent to the external equipment held by that person) so that sent faces can be identified in the display; however, the present invention is not limited to this.
Figure 20 is a diagram showing a display screen example of a photography character image in variation 1. In variation 1, as shown in Figure 20, when a photography character image is displayed for confirmation or reproduced, the display screen is divided into two parts. The image display screen W21 on the upper side displays the photography character image. The thumbnail display screen W23 on the lower side displays thumbnail images S21 and S23 of faces; the faces shown are those among the faces in the photography character image displayed on the image display screen W21 whose image data has been sent. As the internal processing in this case, for example, the control portion 20 first reads, from among the face numbers corresponding to the image ID of the photography character image in the photography character image transmission information 175, the face numbers whose image transmission flags are "ON", and reads the face locations from the corresponding face data 350. The image processing portion 12 then cuts the face areas out of the image data 33 according to the read face locations. Finally, the control portion 20 displays the face areas as thumbnails on the thumbnail display screen W23.
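The internal processing of variation 1 (select the face numbers whose image transmission flags are "ON", read their face locations, and cut the face areas out of the image data) can be sketched with the image modeled as a 2-D list of pixel rows. The record layout is an assumption made for the sake of a runnable example.

```python
def crop_sent_face_thumbnails(image, face_records):
    """Cut the face area out of `image` for every face whose image
    transmission flag is ON, yielding the thumbnails shown on the
    thumbnail display screen (W23 in Figure 20)."""
    thumbnails = []
    for record in face_records:  # {'send_flag': 'ON'/'OFF', 'location': (x, y, w, h)}
        if record['send_flag'] == 'ON':
            x, y, w, h = record['location']
            thumbnails.append([row[x:x + w] for row in image[y:y + h]])
    return thumbnails
```

Variation 2, described below, is the same selection with the flag test inverted: faces whose flag is "OFF" (not yet sent) are cropped and shown instead.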
Further, a structure may be adopted in which, when the counterpart device data acquisition processing portion 202 acquires the device data from the counterpart camera 1, owner data such as the name of the owner of the counterpart camera 1 is acquired together with the device data. Alternatively, a structure may be adopted in which owner data for the device itself is also recorded as part of the native data 171 and is sent to the counterpart camera 1 together with the device data in response to the device data transmission request notified from the counterpart camera 1. In this case, when the face area of a sent face is displayed as a thumbnail on the thumbnail display screen W23, processing to display the owner information as text is performed at the same time.
In variation 1, sent faces are displayed as thumbnails on the thumbnail display screen W23. Conversely, the faces that have not been sent may be displayed as thumbnails instead. Figure 21 is a diagram showing a display screen example of a photography character image in variation 2. In variation 2, as shown in Figure 21, when a photography character image is displayed for confirmation or reproduced, the display screen is divided into two parts in the same manner as the display shown in Figure 20. The image display screen W31 on the upper side displays the photography character image. The thumbnail display screen W33 on the lower side displays a thumbnail image S31 of a face; the face shown is one among the faces in the photography character image displayed on the image display screen W31 whose image data has not been sent. As the internal processing in this case, for example, the control portion 20 first reads, from among the face numbers corresponding to the image ID of the photography character image in the photography character image transmission information 175, the face numbers whose image transmission flags are "OFF", and reads the face locations from the corresponding face data 350. The image processing portion 12 then cuts the face areas out of the image data 33 according to the read face locations. Finally, the control portion 20 displays the face areas as thumbnails on the thumbnail display screen W33.
Alternatively, when no face location data is received from the counterpart camera 1, such as when an image file 30 of an image other than a photography character image (one containing no faces) is sent to the counterpart camera 1, and the device data acquired from that counterpart camera 1 is set in the face/device correspondence table 177, the corresponding face image data may be presented as the face of the owner of the external equipment to which the image file 30 was sent.
Figure 22 is a flowchart showing the detailed processing sequence of the data transmission processing in variation 3. In Figure 22, the same labels are given to processing steps identical to those of the above embodiment. In variation 3, as shown in Figure 22, when no face location data is received from the counterpart camera 1 (step e13: no), the control portion 20 then judges whether the device data of the counterpart camera 1 is set in the face/device correspondence table 177. Here, device data of the counterpart camera 1 being set in the face/device correspondence table 177 means that a photography character image was sent to the counterpart camera 1 in the past, face location data was received from the counterpart camera 1 in response, and the face of the owner of the counterpart camera 1 has been registered in the digital camera 1. Therefore, when the device data of the counterpart camera 1 is set in the face/device correspondence table 177 (step l17: yes), the control portion 20 displays for confirmation on the display portion 15 the face image data set in the face/device correspondence table 177 in association with the device data of the counterpart camera 1, together with the sent image data 33 (step l19). On the other hand, when the device data of the counterpart camera 1 is not set in the face/device correspondence table 177 (step l17: no), the control portion 20 displays the sent image data 33 for confirmation on the display portion 15 (step l21).
Figure 23 is a diagram showing a display screen example of the confirmation display of the image data 33 performed in step l19. In variation 3, as shown in Figure 23, when an image sent this time to the counterpart camera 1 is displayed for confirmation, the display screen is divided into two parts in the same manner as the display screens illustrated in Figures 20 and 21. The image display screen W41 on the upper side displays the sent image; in Figure 23, an image of a landscape containing no people is displayed. The thumbnail display screen W43 on the lower side displays a thumbnail S41 of the face of the owner of the counterpart camera 1 that is the transmission target (i.e., the face image data set in the face/device correspondence table 177 in association with the device data of the counterpart camera 1).
In this way, according to variation 3, once face image data has been registered in the face/device correspondence table 177 as the owner of an item of external equipment (another digital camera 1), then whenever communication is subsequently performed with that digital camera 1 as the counterpart camera 1 and an image file 30 is sent, the face of the user of the counterpart camera 1 can be shown on the screen.
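The branch introduced by variation 3 (steps e13 and l17 to l21) reduces to a small decision: if no face location data arrived but the counterpart's device data is already registered in the face/device correspondence table 177, show the registered owner face alongside the sent image. A hedged sketch, with the table modeled as a dictionary from device data to stored face thumbnails (names and shapes are assumptions):

```python
def faces_for_confirmation(device_data, face_location_received, face_device_table):
    """Choose which owner-face thumbnails to show with the confirmation
    display of a just-sent image (variation 3 of the embodiment)."""
    if face_location_received:
        return []  # normal path (step e13: yes): faces are handled via loop C instead
    # fallback (step l17): show the owner face registered for this device, if any
    return face_device_table.get(device_data, [])
```

An empty return corresponds to step l21, where only the sent image data 33 is displayed for confirmation.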
In the above embodiment, the case in which image files 30 are exchanged with another digital camera 1 as the communication counterpart was described, but the external equipment serving as the communication counterpart is not particularly limited. Figure 24 is a diagram showing a situation in which the digital camera 1 communicates with a personal computer (notebook) 90 as the communication counterpart. Here, the personal computer 90 has a built-in wireless communication portion 901 capable of communicating with the wireless communication portion 19 of the digital camera 1; this wireless communication portion 901 is configured so that its antenna is located on the keyboard-face side. When the digital camera 1 and the personal computer 90 approach each other within the region in which the wireless communication portion 19 and the wireless communication portion 901 can communicate, communication is established between them. When external equipment such as a personal computer is the communication counterpart (the transmission target) in this way, that external equipment need only perform the data reception processing of Figure 16 in response to the notification of the device data transmission request from the digital camera 1.
In the above embodiment, a digital camera was described as an example of the portable equipment of the present invention, but the invention is also applicable to other portable equipment capable of exchanging data with external equipment, such as camera-equipped mobile phones, game machines, music players, recording devices, and notebook computers.
According to the present invention, information relating to the face of the user of the external equipment contained in the sent image data can be acquired from the external equipment to which the image data of a subject containing a face was sent. Further, the display of the display portion can be updated according to the acquired face-related information. Therefore, the user can confirm which face among the faces in the image data is that of the user of the external equipment to which the image data was sent. This has the effect of reducing the waste of repeatedly sending data of identical content to the same external equipment.

Claims (4)

1. A piece of portable equipment, characterized in that the portable equipment comprises:
a display portion that displays image data generated by an imaging portion;
a communication portion that communicates with external equipment;
a recording unit that records a table in which device data acquired from the external equipment and face image data corresponding to that device data are associated with each other; and
a display processing portion that, when the device data of the external equipment to which the image data was sent is registered in the table, causes the display portion to display the face image data corresponding to that device data, even if the communication portion has not received face location data from the external equipment despite the image data having been sent to the external equipment.
2. The portable equipment according to claim 1, characterized in that
the face data stored in the table corresponds to a thumbnail image of a face, and
the display processing portion displays the sent image data and the thumbnail image on the display screen of the display portion.
3. The portable equipment according to claim 1, characterized in that
the portable equipment further comprises a face/equipment alignment processing portion that performs alignment processing on the table such that, when the communication portion receives face location data from the external equipment, the received face location data is associated with the device data.
4. The portable equipment according to claim 1, characterized in that
the display processing portion further performs, according to photography character image transmission information for images that the communication portion has sent to the external equipment, processing of displaying on the display portion a thumbnail image of the face of a user to whom the image data was successfully sent, or a thumbnail image of the face of a user to whom it was not sent.
CN201210244282.0A 2009-03-24 2010-03-11 Portable equipment Expired - Fee Related CN102769733B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2009-072415 2009-03-24
JP2009072415A JP5331532B2 (en) 2009-03-24 2009-03-24 Portable device
CN2010101216024A CN101848324B (en) 2009-03-24 2010-03-11 Portable device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN2010101216024A Division CN101848324B (en) 2009-03-24 2010-03-11 Portable device

Publications (2)

Publication Number Publication Date
CN102769733A CN102769733A (en) 2012-11-07
CN102769733B true CN102769733B (en) 2015-07-08

Family

ID=42772767

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201210244282.0A Expired - Fee Related CN102769733B (en) 2009-03-24 2010-03-11 Portable equipment
CN201210244285.4A Expired - Fee Related CN102769734B (en) 2009-03-24 2010-03-11 Portable equipment
CN2010101216024A Expired - Fee Related CN101848324B (en) 2009-03-24 2010-03-11 Portable device

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN201210244285.4A Expired - Fee Related CN102769734B (en) 2009-03-24 2010-03-11 Portable equipment
CN2010101216024A Expired - Fee Related CN101848324B (en) 2009-03-24 2010-03-11 Portable device

Country Status (2)

Country Link
JP (1) JP5331532B2 (en)
CN (3) CN102769733B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5769434B2 (en) * 2011-02-03 2015-08-26 キヤノン株式会社 Movie recording device, information processing device
JP5717459B2 (en) * 2011-02-04 2015-05-13 キヤノン株式会社 Image recording apparatus, information processing apparatus, control method thereof, and program thereof
CN102982465B (en) * 2011-07-05 2016-08-03 宏达国际电子股份有限公司 Wireless service provider method
CN103591894B (en) * 2013-11-05 2017-07-11 广东欧珀移动通信有限公司 The method and apparatus of object length is measured by camera

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1574957A (en) * 2003-05-29 2005-02-02 卡西欧计算机株式会社 Photographed image transmitting apparatus
CN1836439A (en) * 2003-09-01 2006-09-20 松下电器产业株式会社 Camera having transmission function, mobile telephone device, and image data acquiring/transmitting program
CN101010941A (en) * 2004-09-01 2007-08-01 株式会社尼康 Electronic camera system, phtographing ordering device and phtographing system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004201191A (en) * 2002-12-20 2004-07-15 Nec Corp Image processing and transmitting system, cellular phone, and method and program for image processing and transmission
JP4522344B2 (en) * 2004-11-09 2010-08-11 キヤノン株式会社 Imaging apparatus, control method thereof, and program thereof
JP2006293912A (en) * 2005-04-14 2006-10-26 Toshiba Corp Information display system, information display method and portable terminal device
JP4315148B2 (en) * 2005-11-25 2009-08-19 株式会社ニコン Electronic camera
JP5273998B2 (en) * 2007-12-07 2013-08-28 キヤノン株式会社 Imaging apparatus, control method thereof, and program

Also Published As

Publication number Publication date
CN102769733A (en) 2012-11-07
JP2010226498A (en) 2010-10-07
CN101848324B (en) 2012-09-05
CN102769734B (en) 2015-04-15
CN102769734A (en) 2012-11-07
JP5331532B2 (en) 2013-10-30
CN101848324A (en) 2010-09-29


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20151207

Address after: Tokyo, Japan

Patentee after: OLYMPUS Corp.

Address before: Tokyo, Japan

Patentee before: Olympus Imaging Corp.

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20211208

Address after: Tokyo, Japan

Patentee after: Aozhixin Digital Technology Co.,Ltd.

Address before: Tokyo, Japan

Patentee before: OLYMPUS Corp.

CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150708