CN101848324B - Portable device - Google Patents

Portable device

Info

Publication number
CN101848324B
CN101848324B · CN2010101216024A · CN201010121602A
Authority
CN
China
Prior art keywords
face
data
image
camera
processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN2010101216024A
Other languages
Chinese (zh)
Other versions
CN101848324A (en)
Inventor
尾方利广
关川雄介
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Imaging Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Imaging Corp
Priority to CN201210244282.0A (CN102769733B)
Publication of CN101848324A
Application granted
Publication of CN101848324B

Landscapes

  • Studio Devices (AREA)
  • Television Signal Processing For Recording (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention provides a portable device comprising: an imaging unit (1) that photographs a subject and generates image data; a display unit (15) that displays the image data; a communication unit (19) that communicates with an external device; a transmission processing unit (243) that transmits the image data displayed on the display unit (15) to the external device via the communication unit (19); and a display processing unit (20) that acquires, via the communication unit (19), face-related information stored in the external device and updates the display of the display unit (15) in accordance with the acquired face-related information.

Description

Portable device
Technical field
The present invention relates to a portable device that transmits content data to an external device.
Background Art
Conventionally, image data captured by a portable device such as a camera-equipped mobile phone is often sent to external devices such as other portable devices or personal computers. Japanese Patent No. 4077924 discloses an image data management apparatus that stores image data received through communication with an external device in association with its transmission source information, or stores image data transmitted to an external device in association with its transmission target information. Based on the transmission source information and transmission target information associated with the image data, the image data management apparatus of Japanese Patent No. 4077924 deletes image data received from the external device, or image data that has already been transmitted to that external device in the past.
Now consider a case in which a group photograph showing many people is taken with a portable device and the captured image data is to be sent to the portable devices held by the people shown in the image. If the image data is sent simply by communicating with each person's portable device, there is afterwards no way of knowing to whom the data has already been sent. The user therefore has to remember the recipients, and wasteful retransmissions can occur because of forgetting.
Summary of the invention
A portable device according to one aspect of the present invention comprises: an imaging unit that photographs a subject including a face and generates image data; a display unit that displays the image data generated by the imaging unit; a communication unit that communicates with an external device; a transmission processing unit that performs processing to transmit the image data displayed on the display unit to the external device via the communication unit; and a display processing unit that performs processing to acquire from the external device, via the communication unit, information relating to the face of the user of the external device contained in the image data, and to update the display of the display unit in accordance with the acquired face-related information.
The above and other objects, features, advantages and technical and industrial significance of the present invention will be better understood by reading the following detailed description of the invention in conjunction with the accompanying drawings.
Description of drawings
Fig. 1 is a schematic perspective view showing the front side of a digital camera to which the portable device of the present invention is applied.
Fig. 2 is a schematic perspective view showing the rear side of the digital camera.
Fig. 3 is a diagram showing a situation in which the digital camera communicates with another digital camera.
Fig. 4 is a block diagram showing an example of the main internal configuration of the digital camera.
Fig. 5 is a diagram showing an example of the data structure of an image file.
Fig. 6 is a diagram showing an example of the data structure of communication history data.
Fig. 7 is a diagram showing an example of the data structure of photographed-person image transmission information.
Fig. 8 is a diagram explaining an outline of the communication between the digital cameras.
Fig. 9A is another diagram explaining the outline of the communication between the digital cameras.
Fig. 9B is another diagram explaining the outline of the communication between the digital cameras.
Fig. 9C is another diagram explaining the outline of the communication between the digital cameras.
Fig. 10 is a flowchart showing the basic processing sequence performed by the digital camera.
Fig. 11 is a flowchart showing the detailed processing sequence of the shooting mode processing.
Fig. 12 is a flowchart showing the detailed processing sequence of the communication history temporary recording processing.
Fig. 13 is a flowchart showing the detailed processing sequence of the data transmission processing.
Fig. 14 is a flowchart showing the detailed processing sequence of the transmission determination processing.
Fig. 15 is a flowchart showing the detailed processing sequence of the face/device association processing.
Fig. 16 is a flowchart showing the detailed processing sequence of the data reception processing.
Fig. 17 is a flowchart showing the detailed processing sequence of the reproduction mode processing.
Fig. 18 is a flowchart showing the detailed processing sequence of the image reproduction processing.
Fig. 19 is a flowchart showing the detailed processing sequence of the batch data transmission processing.
Fig. 20 is a diagram showing an example of the display screen of a photographed-person image in Modification 1.
Fig. 21 is a diagram showing an example of the display screen of a photographed-person image in Modification 2.
Fig. 22 is a flowchart showing the detailed processing sequence of the data transmission processing in Modification 3.
Fig. 23 is a diagram showing an example of the display screen when image data is displayed for confirmation in Modification 3.
Fig. 24 is a diagram showing a situation in which the digital camera communicates with a personal computer.
Embodiment
Preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings. The present invention is not limited to these embodiments. In the description of the drawings, identical parts are denoted by the same reference numerals.
Fig. 1 is a schematic perspective view showing the front side of a digital camera 1 to which the portable device of the present invention is applied, and Fig. 2 is a schematic perspective view showing the rear side of the digital camera 1. As shown in Fig. 1, the digital camera 1 has a photographing lens 4 arranged on the front of a camera body 3, and, inside the camera body 3, an imaging unit 11 (see Fig. 4) that photographs the subject image entering through the photographing lens 4. A release button 5 for indicating the shooting timing is also arranged on the camera body 3. On the rear of the camera body 3 are arranged a display unit 15 that displays various screens, and a plurality of button switches 7 to which functions such as power-on and mode selection are assigned, for inputting various operations.
When the power of the digital camera 1 is switched on (ON) and the shooting mode is selected, the digital camera 1 becomes ready to shoot. In the shooting mode, the subject image entering through the photographing lens 4 is displayed on the display unit 15 in real time as a live view image, and the user presses the release button 5 while watching the live view image to shoot a still image or a moving image. Alternatively, the operation mode of the digital camera 1 can be switched to the reproduction mode, in which captured still images or moving images are displayed (reproduced) on the display unit 15 for viewing.
As shown in Fig. 1, the digital camera 1 also has, at a predetermined position inside the camera body 3, a built-in wireless communication unit 19 for performing short-range wireless communication with external devices such as another digital camera 1, a personal computer, a mobile phone or a television set. The wireless communication unit 19 is arranged at an appropriate position inside the camera body 3 with its antenna facing outward. In the example of Fig. 1, the wireless communication unit 19 is arranged in one corner on the bottom side of the camera body 3. By placing the wireless communication unit 19 in a corner of the device housing in this way, the user can easily align it with the counterpart device when communicating via the wireless communication unit 19, which suppresses communication failures caused by positional deviation.
Fig. 3 is a diagram showing a situation in which the digital camera 1 communicates with another digital camera 1. In Fig. 3, one camera is denoted digital camera 1 and the other digital camera 1-2. As shown in Fig. 3, when the camera bodies 3 are brought close enough that their wireless communication units 19 are within each other's communicable range, communication between them is established. In the following description of this embodiment, the case of exchanging data with another digital camera 1 as one example of an external device is illustrated, and the other digital camera serving as the communication counterpart is referred to simply as the "counterpart camera".
Fig. 4 is a block diagram showing an example of the main internal configuration of the digital camera 1. As shown in Fig. 4, the digital camera 1 has an imaging unit 11, an image processing unit 12, a face detection unit 13, an operation unit 14, a display unit 15, a clock unit 16, a recording unit 17, a temporary recording unit 18, a wireless communication unit 19, a control unit 20 and so on.
The imaging unit 11 includes an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor that photoelectrically converts the subject image entering through the photographing lens 4 and outputs an analog electrical signal; the analog electrical signal from the image sensor is converted into a digital electrical signal and output as image data.
The image processing unit 12 applies various kinds of image processing to the image data output from the imaging unit 11 and converts it into image data suitable for recording or for display. For example, when recording the image data of a captured image or displaying recorded image data, it performs image data compression or decompression based on the JPEG (Joint Photographic Experts Group) format or the like. The image processing unit 12 also generates image data of a reduced version of the captured image (thumbnail image data) for display, and cuts out predetermined areas from the image data.
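Purely as an illustration of the kind of conversion the image processing unit 12 performs (the camera itself does this in firmware, not Python), the following sketch assumes the Pillow library and shows the three operations named above: JPEG compression for recording, thumbnail generation for display, and cropping of a predetermined area.

```python
# Illustrative only; Pillow is an assumption of this example, not the patent's implementation.
from PIL import Image

def process_for_recording(src_path: str, jpeg_path: str, thumb_size=(160, 120)):
    img = Image.open(src_path)
    img.save(jpeg_path, format="JPEG", quality=90)   # compression for recording use
    thumb = img.copy()
    thumb.thumbnail(thumb_size)                      # reduced image (thumbnail) for display use
    return thumb

def cut_region(img: Image.Image, box: tuple) -> Image.Image:
    # box = (left, upper, right, lower): cut out a predetermined area, e.g. a face area
    return img.crop(box)
```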
The image processing unit 12 includes a person recognition unit 121 and a face image processing unit 123. The person recognition unit 121 compares the face detection result (face feature information) obtained for image data with the face feature information of the user of this camera (the "own user") registered (recorded) in the recording unit 17 as part of the own-device data 171, and performs processing (person recognition processing) to determine whether a face in the image data is the own user's face. In this embodiment, the person recognition unit 121 performs person recognition processing on the image data of an image file received from the counterpart camera and identifies the own user's face shown in that image data. The face image processing unit 123 performs the following processing: when a photographed-person image is displayed, for each face in the image data whose corresponding person's external device (in this embodiment, another digital camera 1) has already been sent this image data (in practice, the image file 30) — a "sent face" — it synthesizes a "sent" mark near the position of that face.
The face detection unit 13 applies known face detection processing to the image data of a captured image, detects the area of each face shown in the image data (face area), and then extracts face feature information. Here, the face detection processing detects face areas in the image data by, for example, applying pattern matching, and detects the individual facial parts such as the eyes, nose, lips and eyebrows based on the detection result for the face area. Face feature information is then extracted from the face image of a person shown in the image by, for example, calculating the relative sizes of, and distances between, elements such as the eyes, nose, lips and eyebrows. Because these relative distances differ between a frontal face and a face seen from the side, the relative distances are corrected for the inclination of the face before the face feature information is extracted. The inclination of the face can be detected from, for example, the center of the face or the shading of the eyes or nose; taking the shape and color of each element into account allows the inclination to be detected more accurately. When a plurality of faces are contained in the image data, face feature information can be obtained for each face.
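The following is a minimal sketch, not the patent's algorithm, of how face feature information based on relative distances between facial parts might be represented and compared against the registered own-user face data; the landmark names, the normalization by the inter-eye distance and the tolerance threshold are all assumptions.

```python
from math import dist

def extract_face_features(landmarks: dict) -> list:
    """landmarks: e.g. {"left_eye": (x, y), "right_eye": ..., "nose": ..., "lips": ...,
    "left_brow": ..., "right_brow": ...}, taken after tilt correction."""
    eye_span = dist(landmarks["left_eye"], landmarks["right_eye"])
    pairs = [("left_eye", "nose"), ("right_eye", "nose"), ("nose", "lips"),
             ("left_brow", "left_eye"), ("right_brow", "right_eye")]
    # Dividing by the eye span makes the features independent of the face size.
    return [dist(landmarks[a], landmarks[b]) / eye_span for a, b in pairs]

def is_own_user(features: list, registered: list, tol: float = 0.15) -> bool:
    """True when every relative distance is close to the registered own-user face data."""
    return all(abs(f - r) <= tol * max(r, 1e-6) for f, r in zip(features, registered))
```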
The operation unit 14 receives various user operations on the digital camera 1, such as indicating the shooting timing (shooting instruction), switching the operation mode between the shooting mode and the reproduction mode, instructing data transmission to the counterpart camera, and setting shooting conditions, and notifies the control unit 20 of the corresponding operation signals. The operation unit 14 includes the release button 5 and the button switches 7 of Fig. 1.
The display unit 15 displays captured still images, moving images and the live view image, as well as various setting information of the digital camera 1 and the like, and is realized by a display device such as an LCD (Liquid Crystal Display) or an EL display (Electroluminescence Display). In the shooting mode the display unit 15 continuously redraws the live view image, for example every frame, to display it dynamically, and in the reproduction mode it displays captured still images or moving images.
The clock unit 16 keeps the date and time. The shooting date and time are determined from the current time kept by the clock unit 16, and the image data of a captured image is recorded in the recording unit 17 together with the determined shooting time.
The recording unit 17 is realized by various updatable semiconductor memories such as flash memory or ROM, a recording medium such as a built-in hard disk or one connected via a data communication terminal, a memory card together with its reader/writer, and the like; a recording device appropriate to the purpose may be adopted, or such devices may be combined. The recording unit 17 records the camera programs for operating the digital camera 1 and realizing its various functions, as well as the data used while these programs are executed. Here, the camera programs include programs that cause the control unit 20 to function as a photographed-person image registration processing unit 201, a counterpart device data acquisition processing unit 202, a communication history temporary recording processing unit 203, a data transmission/reception processing unit 204, a transmission determination unit 241, a communication history list updating unit 242, an image file transmission processing unit 243, an own-user face determination unit 244 and a face/device association processing unit 205, which are described later.
The recording unit 17 also records own-device data 171. The own-device data 171 includes device data and own-user face data. The device data identifies this camera; for example, a device ID uniquely assigned to each unit when the digital camera 1 is manufactured can be used. The device data is not limited to a device ID, and any information capable of identifying the digital camera 1 may be used. The own-user face data is face feature information extracted from image data showing the own user's face, and is registered by a user operation. In terms of procedure, the own user's face is, for example, actually photographed, or a region is selected within image data showing the own user to designate the face area, thereby specifying the own user's face image. In response, the face detection unit 13 first applies face detection processing to the designated face image data and extracts face feature information, and the control unit 20 then records the obtained face feature information in the recording unit 17 as the own-user face data.
The recording unit 17 also records a plurality of image files 30 as image information 173. In this embodiment, as one example of content data recorded after processing by the image processing unit 12, the image data of captured images is recorded in the recording unit 17 as image files 30 in a file format with an added file header; this header includes, for example, EXIF (Exchangeable Image File) information such as the shooting time, GPS data obtained at the time of shooting, a title, annotations and personal information.
Fig. 5 is a diagram showing an example of the data structure of an image file 30. As shown in Fig. 5, the image file 30 contains an image ID 31, photographing information (EXIF information) 32 such as the shooting time, image data 33 of the captured image, thumbnail image data 34, detected face information 35 and a communication history list 36. The image ID 31 is identification information assigned uniquely, for example when the image data is generated (at the time of shooting), in order to distinguish individual images. Besides the image files of images captured by this camera, the image files 30 constituting the image information 173 may also include image files obtained from another digital camera 1 by communicating with it via the wireless communication unit 19.
The detected face information 35 is the result of face detection performed on the image data 33 and contains face data 350 for each detected face. For example, when three faces are detected in the image data 33 and face feature information is extracted for each, the three pieces of face feature information, one per face, are set as face data 350. Specifically, in each face data 350, the face feature information extracted for a face is set in association with a face number used to identify that face. The face feature information here includes the position and size of the face within the image data 33. The face number is, for example, a serial number assigned to each face detected in the image data 33. In an image file 30 whose image data 33 contains no detected face, no detected face information 35 is set.
The communication history list 36 is history information about the exchanges of this image file 30 between devices so far (in this embodiment, between the digital camera 1 of Fig. 3 and the other digital camera 1-2). In this embodiment, communication history data 360 is generated whenever the image file 30 is exchanged, and the generated communication history data 360 is appended to the communication history list 36. For example, if three pieces of communication history data 360 are set in the communication history list 36, this image file 30 has been exchanged three times, each time between the particular digital cameras 1 indicated by the respective communication history data 360. Conversely, when no communication history data 360 is set in the communication history list 36, this image file 30 has never been exchanged between digital cameras 1.
Fig. 6 is a diagram showing an example of the data structure of the communication history data 360. As shown in Fig. 6, the communication history data 360 contains transmission source device data 361, transmission target device data 363 and a transmission/reception time 365. The transmission source device data 361 holds the device data of the side that transmitted the image file 30 (the transmission-source digital camera 1) in the exchange for which this communication history data 360 was generated. The transmission target device data 363 holds the device data of the side that received the image file 30 (the transmission-target digital camera 1) in that exchange. The transmission/reception time 365 holds the current time at which this communication history data 360 was generated.
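As a reading aid, the record layouts of Figs. 5 and 6 described above can be summarized as follows; the Python dataclasses, types and defaults are illustrative assumptions, and only the field meanings come from the text.

```python
from dataclasses import dataclass, field

@dataclass
class FaceData:                       # one entry of the detected face information 35
    face_number: int                  # serial number assigned to the detected face
    features: list                    # face feature information
    position: tuple                   # face position within the image data 33
    size: tuple                       # face size within the image data 33

@dataclass
class CommHistoryRecord:              # communication history data 360
    source_device: str                # transmission source device data 361
    target_device: str                # transmission target device data 363
    exchanged_at: float               # transmission/reception time 365 (e.g. epoch seconds)

@dataclass
class ImageFile:                      # image file 30
    image_id: str                     # image ID 31
    exif: dict                        # photographing information 32
    image_data: bytes                 # image data 33
    thumbnail: bytes                  # thumbnail image data 34
    faces: list = field(default_factory=list)     # detected face information 35 (empty if no face)
    history: list = field(default_factory=list)   # communication history list 36 (empty until exchanged)
```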
As shown in Fig. 4, the recording unit 17 further records photographed-person image transmission information 175 and a face/device correspondence table 177.
The photographed-person image transmission information 175 is a data table for managing whether a captured image taken by this camera that shows a face (a captured image in whose image data a face has been detected, hereinafter simply called a "photographed-person image") has been sent to an external device (in this embodiment, another digital camera 1). More specifically, for each photographed-person image, the photographed-person image transmission information 175 records whether the image has been sent to the other digital camera 1 registered as the device held by the person whose face is shown in that photographed-person image.
Fig. 7 is a diagram showing an example of the data structure of the photographed-person image transmission information 175. As shown in Fig. 7, the photographed-person image transmission information 175 is a data table in which a face number and an image transmission flag are set in association with the image ID of each photographed-person image taken by this camera. The face number field holds the face number assigned to a face detected in the image data of the corresponding image ID. The image transmission flag indicates whether the image file 30 of the corresponding image ID has been sent to the other digital camera 1 held by the person corresponding to that face number; it is set to "ON" when the file has been sent and to "OFF" when it has not. For example, in the image data 33 of the image file 30 whose image ID 31 is "ID00011", three faces assigned the face numbers "01" to "03" have been detected. As shown in record R11, the image transmission flag of face number "01" is set to "ON", meaning that the image file 30 with image ID 31 "ID00011" has been sent to the other digital camera 1 held by the person corresponding to the face of face number "01". On the other hand, the image transmission flag of face number "03" is set to "OFF" (record R13), meaning that this image file 30 has not yet been sent to the other digital camera 1 held by the person corresponding to the face of face number "03".
The face/device correspondence table 177 is a data table that associates face image data, obtained by cutting a face area out of the image data of a photographed-person image, with the device data obtained from the other digital camera 1 registered as the device held by the person corresponding to that face.
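A hedged sketch of how the photographed-person image transmission information 175 and the face/device correspondence table 177 might be kept and updated; the dictionary/list layout and the function names are assumptions, not the patent's implementation.

```python
transmission_info = {}      # 175: {image_id: {face_number: True ("ON") / False ("OFF")}}
face_device_table = []      # 177: [(face_image_data, device_data), ...]

def register_photographed_person_image(image_id, face_numbers):
    """Register a newly shot image containing faces; every face starts as OFF (not sent)."""
    transmission_info[image_id] = {n: False for n in face_numbers}

def mark_face_sent(image_id, face_number, face_image_data, device_data):
    """Set the image transmission flag to ON and associate the cropped face with the device."""
    transmission_info[image_id][face_number] = True
    face_device_table.append((face_image_data, device_data))
```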
The temporary recording unit 18 is composed of, for example, a semiconductor memory such as a RAM, and is used as working memory for the control unit 20. The temporary recording unit 18 has a storage area for temporarily holding the programs executed by the control unit 20 and the data used while those programs are executed; for example, it temporarily records temporary communication history data 181. The temporary communication history data 181 has the same data structure as the communication history data 360 shown in Fig. 6, containing transmission source device data, transmission target device data and a transmission/reception time. The temporary recording unit 18 is also used for temporarily recording the image data output from the imaging unit 11, such as the image data of the image taken in from the imaging unit 11 every frame (the live view image) and the image data of the image taken in from the imaging unit 11 at the shooting timing (the captured image).
The wireless communication unit 19 performs short-range wireless communication with the counterpart camera. It is composed of a transmission/reception circuit and the like, exchanges radio signals through an antenna 191 with the wireless communication unit 19 of the counterpart camera, demodulates received signals and modulates signals to be transmitted. When started, the wireless communication unit 19 periodically transmits a communication signal announcing its presence, and when it detects a communication signal sent from the wireless communication unit 19 of the counterpart camera, it recovers from the stopped or standby state and establishes communication with the counterpart camera.
Here, the wireless communication unit 19 is assumed to realize, for example, contactless short-range wireless communication at a transmission speed of roughly 100 Mbps over a communication distance of roughly several centimeters. With the short-range wireless communication realized by the wireless communication unit 19, all or part of the data (image files 30) recorded in the digital camera 1 can be sent to the counterpart camera almost instantaneously. However, this is not limiting; the wireless communication unit 19 may be constructed with a communication device of any communication standard that achieves the desired communication distance and transmission speed.
The control unit 20 is realized by a CPU or the like. In accordance with operation signals from the operation unit 14 and the like, it reads and executes the camera programs from the recording unit 17, issues instructions to and transfers data between the units constituting the digital camera 1, and thereby controls the operation of the digital camera 1 as a whole. The control unit 20 suitably includes a CPU and a DMA controller for controlling DMA transfers between the units. The control unit 20 includes the photographed-person image registration processing unit 201, the counterpart device data acquisition processing unit 202, the communication history temporary recording processing unit 203, the data transmission/reception processing unit 204 and the face/device association processing unit 205.
When a captured image has been generated after the shooting timing was indicated and a face has been detected in the face detection result for its image data, the photographed-person image registration processing unit 201 registers the captured image as a photographed-person image.
The counterpart device data acquisition processing unit 202 notifies the counterpart camera of a device data transmission request and obtains from the counterpart camera its device data (the device data recorded in the counterpart camera's recording unit).
The communication history temporary recording processing unit 203 generates communication history data and temporarily records it in the temporary recording unit 18 as temporary communication history data 181. In this embodiment, the communication history data 360 set in the communication history list 36 of an image file 30 is generated by the side that transmits the image file 30; when the transmission condition described later is satisfied, the temporarily recorded temporary communication history data 181 is appended to the communication history list 36 as communication history data 360. Specifically, the communication history temporary recording processing unit 203 uses the device data of the own-device data 171 recorded in the recording unit 17 as the transmission source device data, and uses the device data of the counterpart camera obtained from it by the counterpart device data acquisition processing unit 202 as the transmission target device data. It then obtains the current time from the clock unit 16 as the transmission/reception time, generates communication history data associating these items, and records it in the temporary recording unit 18 as the temporary communication history data 181.
The data transmission/reception processing unit 204 transmits the image file 30 to be sent to the counterpart camera via the wireless communication unit 19, and receives image files transmitted from the counterpart camera. The data transmission/reception processing unit 204 includes the transmission determination unit 241, the communication history list updating unit 242, the image file transmission processing unit 243 and the own-user face determination unit 244.
Before the image file 30 to be sent is transmitted to the counterpart camera, the transmission determination unit 241 determines whether the temporary communication history data 181 recorded in the temporary recording unit 18 by the communication history temporary recording processing unit 203 satisfies the transmission condition described later.
When the transmission determination unit 241 determines that the transmission condition is satisfied, the communication history list updating unit 242 newly appends the temporary communication history data 181 to the communication history list 36 of the image file 30 to be sent and updates the list.
When the transmission determination unit 241 determines that the transmission condition is satisfied, the image file transmission processing unit 243 appends a face location request flag to the image file 30 to be sent and transmits it to the counterpart camera; the face location request flag is set to "ON" or "OFF" depending on whether the image file 30 to be sent is registered as a photographed-person image.
When an image file with the face location request flag set to "ON" is received from the counterpart camera and the person recognition unit 121 identifies the own user's face shown in its image data, the own-user face determination unit 244 transmits face location data, in which the face position is associated with the image ID, to the counterpart camera.
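A receiver-side sketch of the behaviour just described for the own-user face determination unit 244, under the assumption of simple callback-style message passing; the function and field names are illustrative, not the patent's interface.

```python
def handle_received_image(image_id, face_location_request, faces,
                          own_features, is_own_user, send_reply):
    """faces: list of {"position": (x, y), "features": [...]} from the received image data.
    is_own_user: person recognition predicate; send_reply: sends data back to the counterpart."""
    if not face_location_request:
        return                        # flag "OFF": the sender did not ask for face location data
    for face in faces:
        if is_own_user(face["features"], own_features):
            # Face location data: the face position associated with the image ID.
            send_reply({"image_id": image_id, "face_position": face["position"]})
            return
```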
When face location data is received from the counterpart camera after the image file transmission processing unit 243 has transmitted the image file 30, the face/device association processing unit 205 determines the corresponding face number in the relevant image data from the received face location data. The face/device association processing unit 205 then updates the photographed-person image transmission information 175 by setting the image transmission flag corresponding to the determined face number to "ON". It further registers the person corresponding to the determined face number as the owner of the counterpart camera with which this communication was performed and to which the image file 30 was sent. That is, the face/device association processing unit 205 associates the face area (face image data) assigned the determined face number in the corresponding image data 33 with the device data of the counterpart camera obtained by the counterpart device data acquisition processing unit 202, and sets this association in the face/device correspondence table 177.
Fig. 8 and Figs. 9A, 9B and 9C are schematic diagrams explaining the communication between the digital camera 1 of this embodiment and other digital cameras 1 (counterpart cameras). Here, the case of transmitting a photographed-person image showing people to the other digital cameras 1 immediately after shooting in the shooting mode is illustrated, but the same processing is performed when a photographed-person image is transmitted in the reproduction mode.
For example, as shown in Fig. 8, suppose the own user holding the digital camera 1 puts it into the shooting mode and takes a group photograph of persons Pa, Pb and Pc, who are, for example, friends. In this case, the photographed-person image I1 showing persons Pa, Pb and Pc is displayed on the screen of the display unit 15 of the digital camera 1. The own user who has taken the group photograph then brings the digital camera 1 into communication with the external devices held by persons Pa, Pb and Pc (in this embodiment, digital cameras 1) one after another, and transmits the group photograph (photographed-person image) showing persons Pa, Pb and Pc to each of them. Here, the other digital camera 1 held by person Pa in Fig. 8 is assumed to be the digital camera 1-2a shown in Fig. 9A, the one held by person Pb the digital camera 1-2b shown in Fig. 9B, and the one held by person Pc the digital camera 1-2c shown in Fig. 9C.
First, as shown in Fig. 9A, the own user brings the digital camera 1 close to the other digital camera 1-2a held by person Pa to communicate with it. The digital camera 1 thus takes the other digital camera 1-2a as the counterpart camera 1-2a and transmits the image file 30 of the photographed-person image (a1). In this embodiment, a camera that receives an image file identifies the face of its own user shown in the image data and sends the identified face location data back to the transmitting camera. That is, when the counterpart camera 1-2a identifies its own user's face in the image data of the received image file, it transmits the data of that face position to the digital camera 1 (a3). On receiving the face location data, the digital camera 1 registers the person at the received face position in the image data as the owner of the counterpart camera 1-2a with which this communication was performed and to which the image file 30 was sent. At the same time, this face is treated as a sent face, and the photographed-person image I11, with a "sent" mark M11 added to the sent face, is displayed on the display unit 15. Here, a mark that surrounds the face area with a dashed line is illustrated as the "sent" mark, but the mark is not limited to this; any display is acceptable as long as it allows a face for which the image data (image file 30) has been sent to the external device (other digital camera 1) held by the corresponding person to be distinguished from a face for which it has not yet been sent.
Similarly, as shown in Fig. 9B, the own user brings the digital camera 1 close to the other digital camera 1-2b held by person Pb to communicate with it. The digital camera 1 thus takes the other digital camera 1-2b as the counterpart camera 1-2b and transmits the image file 30 of the photographed-person image (a5). Then, on receiving the face position data from the counterpart camera 1-2b (a7), the digital camera 1 registers the person at the received face position in the image data as the owner of the counterpart camera 1-2b. At the same time, this face is treated as a sent face, and the photographed-person image I13, with "sent" marks M11 and M13 added to the sent faces, is displayed on the display unit 15.
Further, as shown in Fig. 9C, the own user brings the digital camera 1 close to the other digital camera 1-2c held by person Pc to communicate with it. The digital camera 1 thus takes the other digital camera 1-2c as the counterpart camera 1-2c and transmits the image file 30 of the photographed-person image (a9). Then, on receiving the face position data from the counterpart camera 1-2c (a11), the digital camera 1 registers the person at the received face position in the image data as the owner of the counterpart camera 1-2c. At the same time, this face is treated as a sent face, and the photographed-person image I15, with "sent" marks M11, M13 and M15 added to the sent faces, is displayed on the display unit 15.
Next, the processing sequence performed by the digital camera 1 is described. Fig. 10 is a flowchart showing the basic processing sequence performed by the digital camera 1. When the power is switched on, the digital camera 1 performs the processing corresponding to the mode selected by user operation. That is, as shown in Fig. 10, when the currently selected mode is the shooting mode (step b1: Yes), the process moves to the shooting mode processing (step b3), and when the shooting mode processing ends, the process moves to step b9. When the current mode is not the shooting mode (step b1: No) but the reproduction mode (step b5: Yes), the process moves to the reproduction mode processing (step b7), and when the reproduction mode processing ends, the process moves to step b9. In step b9 it is determined whether the basic processing should end; for example, when power-off has been instructed by operating a button switch 7, this processing ends (step b9: Yes). Otherwise (step b9: No), the process returns to step b1.
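The mode dispatch of Fig. 10 (steps b1 to b9) can be summarized by the following loop sketch; the callable parameters stand in for the actual mode processing and are assumptions.

```python
def basic_processing_loop(get_selected_mode, power_off_requested,
                          shooting_mode_processing, reproduction_mode_processing):
    while True:
        mode = get_selected_mode()
        if mode == "shooting":              # step b1 -> b3
            shooting_mode_processing()
        elif mode == "reproduction":        # step b5 -> b7
            reproduction_mode_processing()
        if power_off_requested():           # step b9: end when power-off is instructed
            break
```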
The shooting mode processing of step b3 and the reproduction mode processing of step b7 are now described in turn, starting with the shooting mode processing. Fig. 11 is a flowchart showing the detailed processing sequence of the shooting mode processing.
As shown in Fig. 11, in the shooting mode processing the control unit 20 first starts (turns on) the wireless communication unit 19 (step c1). The control unit 20 then starts the imaging unit 11, takes in an image, and displays the taken-in image on the display unit 15 as a live view image (step c2). Through this processing, the live view image is updated every frame with the subject image formed on the image sensor.
Next, when the release button 5 is pressed to instruct shooting (step c3: Yes), shooting processing is performed and the image data of the captured image is generated (step c5). The face detection unit 13 then performs face detection, detecting face areas in the image data of the captured image and detecting the facial parts within each face area (step c6). When face feature information has been extracted because faces were detected in the image data by this processing, the control unit 20 assigns face numbers to the detected faces and generates the detected face information. The control unit 20 then assigns an image ID to the generated captured image and records the image-processed image data, its thumbnail image data, the detected face information, the current time and so on in the recording unit 17 in association with one another as an image file 30 (step c7). No communication history data 360 is set in the communication history list 36 of the image file 30 recorded here (it is empty). In addition, when faces have been detected in the image data of the captured image in step c6 (step c8: Yes), the photographed-person image registration processing unit 201 registers the generated captured image as a photographed-person image (step c9). Specifically, based on the image ID of the generated captured image and the face numbers assigned to the detected faces, the photographed-person image registration processing unit 201 adds records whose image transmission flag is "OFF" to the photographed-person image transmission information.
In the following step c10, a transmission instruction from the user is checked; when a transmission instruction has been input, the image file 30 of the latest captured image, for example the one generated in the shooting processing of step c5, is taken as the transmission target and sent to the counterpart camera with which communication has been established.
That is, when a transmission instruction has been input (step c10: Yes), establishment of communication with the counterpart camera is monitored in the following step c11. At this point the wireless communication unit 19 begins to emit radio waves, and when the wireless communication unit of the counterpart camera receives them, communication with the counterpart camera is established.
If communication has not been established even after a preset time has elapsed (step c11: No), the process moves to step c13, where the control unit 20 performs warning notification processing, and then moves to step c25. In step c13, for example, the control unit 20 displays a message on the display unit 15 indicating that communication was not established. The warning notification may also be output as a warning sound from a loudspeaker (not shown), or these processes may be combined.
On the other hand, when communication with the counterpart camera is established (step c11: Yes), the process moves to the communication history temporary recording processing (step c15). Fig. 12 is a flowchart showing the detailed processing sequence of the communication history temporary recording processing.
As shown in Fig. 12, in the communication history temporary recording processing, the counterpart device data acquisition processing unit 202 first notifies the counterpart camera of a device data transmission request via the wireless communication unit 19 (step d1). At this time, the counterpart device data acquisition processing unit 202 sends to the counterpart camera, together with the device data transmission request, the device data of the own-device data 171 recorded in the recording unit 17.
The counterpart device data acquisition processing unit 202 then waits until device data is received from the counterpart camera via the wireless communication unit 19, and receives the device data from the counterpart camera (step d3). When reception succeeds (step d5: Yes), the process moves to step d7, where the communication history temporary recording processing unit 203 obtains the current time from the clock unit 16 (step d7). The communication history temporary recording processing unit 203 then generates communication history data associating the device data of the own-device data 171 recorded in the recording unit 17, the device data of the counterpart camera obtained in step d3 and the current time obtained in step d7, and records it in the temporary recording unit 18 as the temporary communication history data 181 (step d9). The communication history temporary recording processing unit 203 also sets the communication history generation flag to "ON (success)" (step d11). The process then returns to step c15 of Fig. 11 and moves on to step c17.
On the other hand, when reception fails in step d5 of Fig. 12 (step d5: No), the communication history temporary recording processing unit 203 performs warning notification processing (step d13). For example, it displays a message on the display unit 15 indicating that the device data could not be received from the counterpart camera and that no communication history data was generated. Reception can fail, for example, when the distance between this digital camera 1 and the counterpart camera becomes too great midway and the communication is broken. A warning sound may also be output, or these processes may be combined. The communication history temporary recording processing unit 203 also sets the communication history generation flag to "OFF (failure)" (step d15). The process then returns to step c15 of Fig. 11 and moves on to step c17.
Next, in step c17, the data transmission/reception processing unit 204 determines whether communication history data was generated in the communication history temporary recording processing of step c15. For example, if the communication history generation flag is "OFF", the data transmission/reception processing unit 204 determines that no communication history data was generated (step c17: No) and moves to step c25 without performing the data transmission processing (the image file 30 of the latest captured image generated in the shooting processing of step c5 is not sent).
On the other hand, when the communication history generation flag is "ON", the data transmission/reception processing unit 204 determines that communication history data was generated (step c17: Yes) and proceeds to the data transmission processing (step c19). Fig. 13 is a flowchart showing the detailed processing sequence of the data transmission processing.
As shown in Fig. 13, the data transmission processing first moves to the transmission determination processing (step e1). Fig. 14 is a flowchart showing the detailed processing sequence of the transmission determination processing.
As shown in Fig. 14, in the transmission determination processing, the transmission determination unit 241 first refers to the communication history list 36 of the image file 30 to be sent. The transmission determination unit 241 then searches the communication history data 360 already set there for the transmission target device data of the temporary communication history data 181 (that is, the device data obtained in step d3 of Fig. 12 from the counterpart camera, the communication counterpart) (step f1).
The transmission determination unit 241 then determines whether the transmission condition is satisfied based on this search result. That is, when the transmission target device data of the temporary communication history data 181 matches the transmission source device data 361 or the transmission target device data 363 of some communication history data 360, and a preset specified period (for example, "six months") has not elapsed since the transmission/reception time 365 of that communication history data 360, the transmission determination unit 241 determines that the transmission condition is not satisfied. When the specified period has elapsed since the transmission/reception time 365 of the matching communication history data 360, it determines that the transmission condition is satisfied. The transmission condition is also determined to be satisfied when the transmission target device data of the temporary communication history data 181 matches neither the transmission source device data 361 nor the transmission target device data 363 of any communication history data 360.
The reason is as follows: when the device indicated by the transmission target device data of the temporary communication history data 181 (the counterpart camera that is the transmission target this time) was the transmission source or transmission target of this image file 30 in the past, the image file 30 can be assumed to be recorded in the recording unit of the counterpart camera; on the other hand, the user of the counterpart camera may have deleted it. Therefore, in this embodiment, once a certain time (the specified period) has elapsed since the image file 30 was exchanged, it is transmitted again.
Specifically, as shown in Fig. 14, when the transmission target device data of the temporary communication history data 181 matches the transmission source device data 361 of some already set communication history data 360 (step f3: Yes), the transmission determination unit 241 reads the transmission/reception time 365 of that communication history data 360 (step f5). The transmission determination unit 241 then compares the read transmission/reception time 365 with the transmission/reception time (current time) of the temporary communication history data 181 and determines whether the specified period has elapsed. If it has not elapsed (step f7: No), the process moves to step f15.
On the other hand, when the transmission target device data of the temporary communication history data 181 does not match the transmission source device data 361 of any already set communication history data 360 (step f3: No), or when the specified period has elapsed since the transmission/reception time 365 read in step f5 (step f7: Yes), the transmission determination unit 241 determines whether the transmission target device data of the temporary communication history data 181 matches the transmission target device data 363 of any already set communication history data 360. When they match (step f9: Yes), it similarly reads the transmission/reception time 365 of that communication history data 360 (step f11). The transmission determination unit 241 then compares the read transmission/reception time 365 with the transmission/reception time (current time) of the temporary communication history data 181 and determines whether the specified period has elapsed. If it has not elapsed (step f13: No), the process moves to step f15.
Then, in step f15, the transmission determination unit 241 determines that the transmission condition is not satisfied and performs warning notification processing (step f17). For example, the transmission determination unit 241 displays a message on the display unit 15 indicating that the image file 30 to be sent is already recorded in the counterpart camera. A warning sound may also be output, or these processes may be combined. At this time, the temporary communication history data 181 recorded in the temporary recording unit 18 is deleted. The process then returns to step e1 of Fig. 13 and moves on to step e3.
On the other hand, when the transmission target device data of the temporary communication history data 181 does not match the transmission target device data 363 of any already set communication history data 360 (step f9: No), or when the specified period has elapsed since the transmission/reception time 365 read in step f11 (step f13: Yes), the transmission determination unit 241 determines that the transmission condition is satisfied (step f19). The communication history list updating unit 242 then appends the temporary communication history data 181 recorded in the temporary recording unit 18 to the communication history list 36 of the image file 30 to be sent and updates it (step f21). At this time, the temporary communication history data 181 in the temporary recording unit 18 is deleted. The process then returns to step e1 of Fig. 13 and moves on to step e3.
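The transmission condition of Fig. 14 (steps f1 to f19) amounts to the following check, shown here as an illustrative sketch: send unless the counterpart already appears in the file's communication history as a source or target within the specified period. The record layout follows Fig. 6 and the six-month period is the example given in the text; everything else is assumed.

```python
SPECIFIED_PERIOD_S = 6 * 30 * 24 * 3600      # "six months", expressed roughly in seconds

def transmission_condition_satisfied(history, counterpart_device, now):
    """history: list of {"source_device", "target_device", "exchanged_at"} records."""
    for record in history:
        if counterpart_device in (record["source_device"], record["target_device"]):
            if now - record["exchanged_at"] < SPECIFIED_PERIOD_S:
                return False                 # counterpart presumably still holds the file
    return True                              # never exchanged with this counterpart, or long ago
```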
Next, in step e3, the data transmission/reception processing unit 204 determines whether to send the image file 30 based on the result of the transmission determination processing of step e1. That is, when the transmission determination processing determined that the transmission condition is not satisfied, the data transmission/reception processing unit 204 determines not to send the image file 30 (step e3: No), returns to step c19 of Fig. 11 and moves on to step c25.
On the other hand, when it was determined that the transmission condition is satisfied, the data transmission/reception processing unit 204 determines to send the image file 30 (step e3: Yes), and the image file transmission processing unit 243 then determines whether the file is an image file 30 registered as a photographed-person image. Specifically, when the image ID 31 of the image file 30 to be sent is set in the photographed-person image transmission information 175, the image file transmission processing unit 243 determines that it is registered. When it is registered (step e4: Yes), the image file transmission processing unit 243 appends a face location request flag set to "ON", requesting that face location data be sent back, to the image file 30 to be sent and transmits it to the counterpart camera (step e5). When it is not registered (step e4: No), the image file transmission processing unit 243 appends a face location request flag set to "OFF", not requesting face location data, to the image file 30 to be sent and transmits it to the counterpart camera (step e6). The data transmission/reception processing unit 204 then waits until a reception completion notification is received.
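Steps e4 to e6 reduce to attaching a face location request flag whose value depends on whether the outgoing file is registered as a photographed-person image; the message layout in this sketch is an assumption.

```python
def build_outgoing_message(image_file, transmission_info):
    registered = image_file.image_id in transmission_info   # registered as a photographed-person image?
    return {
        "face_location_request": registered,  # "ON" asks the counterpart to report its user's face position
        "image_file": image_file,
    }
```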
For example, when no reception completion notification has been received from the counterpart camera even after a preset time has elapsed (step e7: No), the data transmission/reception processing unit 204 determines that the transmission of the image file 30 has failed and performs warning notification processing (step e9). As with reception failure, transmission can fail, for example, when the distance between this digital camera 1 and the counterpart camera becomes too great midway and the communication is broken. The data transmission/reception processing unit 204 displays a message on the display unit 15 indicating that the transmission of the image file 30 to the counterpart camera has failed. A warning sound may also be output, or these processes may be combined.
In addition, the communication history list updating unit 242 deletes the communication history data 360 appended this time in step f21 of Fig. 14 from the communication history list 36 of the image file 30 to be sent, and updates the image file 30 (step e11). This is because this image file 30 has not actually been sent to the counterpart camera, and it is to be transmitted the next time a transmission instruction to the same counterpart camera is input with this image file 30 as the transmission target. The process then returns to step c19 of Fig. 11 and moves on to step c25.
On the other hand, when the reception end notification is received from the counterpart camera 1 (step e7: Yes), the camera further waits for face location data from the counterpart camera 1. If, for example, no face location data is received from the counterpart camera 1 even after a predetermined time has elapsed (step e13: No), processing returns to step c19 of Figure 11 and proceeds to step c25.
When face location data is received (step e13: Yes), processing moves to the face/device correspondence processing (step e15). The position information in the received face location data indicates the face that the counterpart camera 1 has recognized as the face of its own user (that is, the face of the user of the counterpart camera 1) in the image data 33 of the image file 30 transmitted in step e5. Figure 15 is a flowchart showing the detailed procedure of the face/device correspondence processing.
As shown in Figure 15, in the face/device correspondence processing, the face/device correspondence processing unit 205 first refers to the image file 30 corresponding to the image ID set in the received face location data, selects from the face data 350 of the detected-face information 35 the face data 350 corresponding to the received face location, and thereby identifies the face number (step g1). Specifically, the face/device correspondence processing unit 205 compares the received face location with the face locations set in the face data 350 of the detected-face information 35, selects the face data 350 whose face location matches, and identifies its face number. The face/device correspondence processing unit 205 then sets to "ON" the image-sent flag corresponding to the image ID of the received face location data and the identified face number, and updates the photographed-person-image transmission information 175 (step g3).
The face/device correspondence processing unit 205 also registers the person corresponding to the identified face number as the owner of the counterpart camera 1 with which this communication was performed and to which the image file 30 was transmitted (step g5). Specifically, under the control of the face/device correspondence processing unit 205, the image processing unit 12 cuts out, from the image data 33 corresponding to the image ID 31, the face area of the face number identified in step g1. The face/device correspondence processing unit 205 associates the image data cut out by the image processing unit 12 (the face image data) with the device data received from the counterpart camera 1 in step d3 of Figure 12, and sets the pair in the face/device correspondence table 177.
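Steps g1 to g5 can be outlined as follows. The container shapes, the equality test on face locations, and the crop() helper are assumptions introduced for illustration; in the embodiment the cropping is performed by the image processing unit 12.

```python
def correspond_face_to_device(face_location_data, image_files, person_image_info,
                              face_device_table, counterpart_device_id, crop):
    """Steps g1-g5 in outline: identify the face number that matches the
    received face location, mark it as sent, and register the cropped face
    image for the counterpart's device data.
    """
    image_id = face_location_data["image_id"]
    image_file = image_files[image_id]

    # g1: find the face data whose location matches the received location
    face_no = None
    for number, face in image_file["detected_faces"].items():
        if face["position"] == face_location_data["position"]:
            face_no = number
            break
    if face_no is None:
        return None                       # no matching face; nothing to register

    # g3: set the image-sent flag for that face number
    person_image_info.setdefault(image_id, {}).setdefault(face_no, {})["sent"] = True

    # g5: bind the cropped face image to the counterpart's device data
    face_image = crop(image_file["image_data"],
                      image_file["detected_faces"][face_no]["position"])
    face_device_table[counterpart_device_id] = face_image
    return face_no
```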
The control unit 20 then performs processing for adding a "sent" mark to each sent face in the image data 33 and displaying the result for confirmation (step g7). Through this processing, the control unit 20 functions as a display processing unit and performs display in such a way that sent faces and unsent faces in the image data can be distinguished. Specifically, the control unit 20 first reads, from the photographed-person-image transmission information 175, the face numbers whose image-sent flag corresponding to the image ID of the received face location data is set to "ON", namely the face numbers whose image-sent flag was set to "ON" in step g3, and reads the face locations from the corresponding face data 350. The face image processing unit 123 then composites a "sent" mark at a position near the face area according to each face location that has been read. The control unit 20 then displays the composited image data on the display unit 15. Although compositing is performed here, the "sent" mark may instead be superimposed near the corresponding face area by on-screen display (OSD).
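As one possible rendering of the mark compositing in step g7, the sketch below draws a "sent" mark near each given face rectangle using Pillow. This illustrates only the compositing variant; as noted above, the embodiment may equally superimpose the mark by OSD without modifying the image data.

```python
from PIL import ImageDraw   # Pillow, used here only to illustrate the compositing

def overlay_sent_marks(image, sent_face_rects, mark_text="SENT"):
    """Composite a 'sent' mark near each face rectangle (left, top, right, bottom).

    image is assumed to be a Pillow RGB image; rectangle coordinates and the
    mark text are illustrative.
    """
    out = image.copy()
    draw = ImageDraw.Draw(out)
    for left, top, right, bottom in sent_face_rects:
        draw.rectangle((left, top, right, bottom), outline=(255, 0, 0), width=3)
        draw.text((left, max(0, top - 16)), mark_text, fill=(255, 0, 0))
    return out
```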
Referring back to Figure 11, when no shooting instruction is input in step c3 (step c3: No), the control unit 20 determines whether data can be received from a counterpart camera 1. For example, when communication with a counterpart camera 1 has been established, the control unit 20 determines that reception is possible. If reception is not possible (step c21: No), processing moves to step c25. When reception is determined to be possible (step c21: Yes), the data transmission/reception processing unit 204 performs data reception processing (step c23). Figure 16 is a flowchart showing the detailed procedure of the data reception processing.
As shown in Figure 16, in the data reception processing, the camera first waits until it receives a device data send request notified by the counterpart camera 1 with which communication has been established. If, for example, a device data send request is received from the counterpart camera 1 before a predetermined time has elapsed (step h1: Yes), the data transmission/reception processing unit 204 transmits the device data of this camera (its device ID), recorded as the own-device data 171 in the recording unit 17, to the counterpart camera 1 via the wireless communication unit 19 (step h3). When the transmission succeeds (step h5: Yes), the data transmission/reception processing unit 204 goes on to receive the image file from the counterpart camera 1 (step h7), and if reception succeeds (step h9: Yes), it transmits a reception end notification to the counterpart camera 1 (step h11).
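The receive-side handshake of steps h1 to h11 can be sketched as follows over an assumed message-based connection object; the recv/send interface and the message shapes are illustrative assumptions, and the real device communicates through the wireless communication unit 19.

```python
def handle_incoming_transfer(conn, own_device_id, timeout_sec=10.0):
    """Receive-side handshake of Figure 16 (steps h1-h11) in outline."""
    request = conn.recv(timeout=timeout_sec)              # h1: device data send request
    if not request or request.get("type") != "device_data_request":
        return None
    conn.send({"type": "device_data", "device_id": own_device_id})   # h3
    message = conn.recv(timeout=timeout_sec)              # h7: image file(s)
    if not message or message.get("type") != "image_files":
        return None                                       # h9: reception failed
    conn.send({"type": "reception_end"})                  # h11: reception end notification
    return message["files"]
```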
The own-face determination unit 244 then examines the face location request flag of each received image file. If there is no image file to which a face location request flag "ON" is appended (step h13: No), processing moves to step h27.
On the other hand, when there is an image file to which a face location request flag "ON" is appended (step h13: Yes), the own-face determination unit 244 sets each image file with the "ON" face location request flag as a processing target (step h15). The processing of loop A is then performed for each image file set as a processing target (steps h17 to h25). Through this processing, when a plurality of image files with the face location request flag "ON" have been received in step h7, the processing of loop A is performed for each of those image files.
That is, in loop A, the person recognition unit 121 first reads the detected-face information from the image file being processed, and performs person recognition processing by checking the face data set in the read detected-face information against the own face data of the own-device data 171 (step h19). When a plurality of face data are set in the detected-face information, each face data item is checked against the own face data. If face data with a high similarity to the own face data exists, the person recognition unit 121 recognizes that face area as the face of this camera's user and outputs its face number as the recognition result.
When the person recognition unit 121 has recognized the face of this camera's user (step h21: Yes), the own-face determination unit 244 reads the image ID from the image file being processed and transmits to the counterpart camera 1 the face location data corresponding to the recognized face number (step h23). When the processing of loop A has been performed for all image files to be processed, processing moves to step h27.
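Loop A (steps h15 to h25) can be outlined as below. The similarity() callable and its threshold stand in for the matching performed by the person recognition unit 121 and are purely illustrative assumptions.

```python
def reply_face_locations(received_files, own_face_data, similarity, conn, threshold=0.8):
    """Loop A of Figure 16 (steps h15-h25) in outline."""
    for image_file in received_files:
        if image_file.get("face_location_request") != "ON":
            continue                                    # h13/h15: only flagged files
        for face_no, face in image_file["detected_faces"].items():
            if similarity(face["features"], own_face_data) >= threshold:   # h19/h21
                conn.send({                             # h23: face location data
                    "type": "face_location",
                    "image_id": image_file["image_id"],
                    "position": face["position"],
                })
                break                                   # one own-face reply per file
```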
In step h27, the data transmission/reception processing unit 204 records the image files received in step h7 in the recording unit 17 as image files 30 and appends them to the image information 173 (step h27). Processing then returns to step c23 of Figure 11 and proceeds to step c25.
On the other hand, when the device data send request is not received in step h1 of Figure 16 (step h1: No), when transmission fails in step h5 (step h5: No), or when reception fails in step h9 (step h9: No), processing moves to step h29. In step h29, the data transmission/reception processing unit 204 performs warning notification processing. For example, the data transmission/reception processing unit 204 performs processing for displaying on the display unit 15 a message indicating that reception of the image file 30 from the counterpart camera 1 has failed. Alternatively, a warning tone may be output, or these notifications may be combined. Processing then returns to step c23 of Figure 11 and proceeds to step c25.
Then, in step c25 of Figure 11, the control unit 20 determines whether to end the shooting mode; when the mode is to be ended (step c25: Yes), processing moves to step c27. When the shooting mode is not to be ended (step c25: No), processing returns to step c1.
In step c27, the control unit 20 places the wireless communication unit 19 in a stopped state (off). If the wireless communication unit 19 is not communicating at this time, no particular processing is performed. The shooting mode processing then ends, and processing returns to step b3 of Figure 10 and proceeds to step b9.
The reproduction mode processing will now be described. Figure 17 is a flowchart showing the detailed procedure of the reproduction mode processing. As shown in Figure 17, in the reproduction mode processing, the control unit 20 first starts (turns on) the wireless communication unit 19 (step i1). The control unit 20 then selects the image to be reproduced (the reproduced image) from among the images recorded in the recording unit 17 as image files 30 (step i2). The image selection here may be configured, for example, so that the images recorded in the recording unit 17 are selected one by one in the order in which they were recorded, or so that thumbnail image data of a plurality of images are read and displayed as a list and one image is selected from the list according to a user operation. Processing then moves to the image reproduction processing (step i3). Figure 18 is a flowchart showing the detailed procedure of the image reproduction processing.
As shown in Figure 18, in the image reproduction processing, the control unit 20 first determines whether the reproduced image selected in step i2 of Figure 17 is a photographed image registered as a photographed-person image. Specifically, when the image ID 31 of the selected image file 30 is set in the photographed-person-image transmission information 175, the control unit 20 determines that the image is registered. If it is not registered (step j1: No), processing moves to step j9. On the other hand, when it is registered (step j1: Yes), the control unit 20 determines whether there is a face number whose corresponding image-sent flag is set to "ON". If there is no face number with the image-sent flag "ON" (step j3: No), processing moves to step j9.
In step j9, the control unit 20 reads the image data 33 of the reproduced image and performs processing for reproducing (displaying) it on the display unit 15. Processing then returns to step i3 of Figure 17 and proceeds to step i5. The reproduced images here include, in addition to the data captured by this camera and recorded in the recording unit 17, the image data of the image files 30 received from counterpart cameras 1 in the data reception processing of Figure 16.
On the other hand, when there is a face number whose image-sent flag is "ON" (step j3: Yes), the control unit 20 refers to the detected-face information 35 according to the face numbers whose image-sent flag is "ON" and reads the face locations from the corresponding face data 350 (step j5). The control unit 20 then performs processing for adding a "sent" mark to each sent face in the image data 33 according to the read face locations and reproducing the result (step j7). Specifically, the face image processing unit 123 first composites a "sent" mark at a position near each face area according to the face locations read in step j5. The control unit 20 then displays the composited image data on the display unit 15. Processing then returns to step i3 of Figure 17 and proceeds to step i5.
In step i5, an instruction to switch the reproduced image is accepted; when a switching instruction has been input (step i5: Yes), the control unit 20 returns to step i2, selects a reproduced image again, and performs the image reproduction processing (step i3).
In addition, in this reproduction mode processing, a transmission instruction from the user is determined in step i7; when a transmission instruction has been input, the image files 30 of images recorded in the recording unit 17, for example, are transmitted to a counterpart camera 1 with which communication has been established. When no transmission instruction is input (step i7: No), processing moves to step i25.
When a transmission instruction has been input (step i7: Yes), the control unit 20 first determines the kind of transmission instruction that was input. The digital camera 1 of this embodiment is configured so that, as transmission instructions in the reproduction mode, it can accept, for example, an instruction to collectively transmit all the images recorded in the recording unit 17, an instruction to transmit the images recorded in the recording unit 17 one at a time, and an instruction to transmit a plurality of images recorded in the recording unit 17 that have been selected by user operation. The control unit 20 selects the image files 30 to be transmitted according to the input transmission instruction (step i9).
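The selection in step i9 depends only on the kind of transmission instruction. A brief sketch, with assumed names for the three instruction types (the instruction values are not identifiers from the embodiment):

```python
def select_send_objects(instruction, recorded_files, selected_ids=None):
    """Pick the image files 30 to transmit for the three instruction types."""
    if instruction == "send_all":
        return list(recorded_files.values())
    if instruction == "send_selected":
        return [recorded_files[i] for i in (selected_ids or [])]
    if instruction == "send_one":
        return [recorded_files[selected_ids[0]]] if selected_ids else []
    return []
```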
Then, in the following step i11, the camera monitors the establishment of communication with a counterpart camera 1. At this time, the wireless communication unit 19 starts transmitting radio waves, and when the wireless communication unit of a counterpart camera 1 receives the radio waves, communication with that counterpart camera 1 is established.
If, for example, communication with a counterpart camera 1 has not been established even after a predetermined time has elapsed (step i11: No), the control unit 20 performs warning notification processing (step i13) and then moves to step i25. For example, as the warning notification processing, the control unit 20 displays on the display unit 15 a message indicating that communication has not been established. Alternatively, a warning tone may be output, or these notifications may be combined.
On the other hand, when communication with the counterpart camera 1 has been established (step i11: Yes), processing moves to the communication history temporary recording processing (step i15). This communication history temporary recording processing is performed in the same way as the processing procedure shown in Figure 12.
Then, in the following step i17, the data transmission/reception processing unit 204 determines whether communication history data was generated in the communication history temporary recording processing of step i15. Specifically, as in step c17 of Figure 11, if the communication history flag set as a result of the communication history temporary recording processing of step i15 is "OFF", the data transmission/reception processing unit 204 determines that no communication history data has been generated (step i17: No). In that case, processing moves to step i25, and data transmission processing to the counterpart camera 1 is not performed (the image files 30 selected in step i9 as transmission objects according to the transmission instruction of step i7 are not transmitted).
When the communication history flag is "ON", the data transmission/reception processing unit 204 determines that communication history data has been generated (step i17: Yes) and goes on to determine the number of image files 30 selected as transmission objects in step i9. If a single image file 30 was selected as the transmission object in step i9 (that is, transmission of one image was instructed in step i7) (step i19: No), the data transmission/reception processing unit 204 performs data transmission processing (step i21). This data transmission processing is performed in the same way as the processing procedure shown in Figure 13.
On the other hand, when there are a plurality of image files 30 to be transmitted (when collective transmission of images was instructed in step i7, or when a plurality of images were selected and transmission was instructed) (step i19: Yes), the data transmission/reception processing unit 204 performs unified data transmission processing (step i23). Figure 19 is a flowchart showing the detailed procedure of the unified data transmission processing.
As shown in Figure 19, in the unified data transmission processing, the processing of loop B is first performed with each of the plurality of image files 30 to be transmitted taken in turn as the processing target (steps k1 to k5). That is, transmission determination processing is performed for each image file 30 to be transmitted (step k3). This transmission determination processing is performed in the same way as the processing procedure shown in Figure 14. When the processing of loop B (that is, the transmission determination processing) has been performed for all the image files 30 to be transmitted, processing moves to step k7.
In step k7, the data transmission/reception processing unit 204 determines whether there are image files 30 to be transmitted, based on the results of the transmission determination processing performed in steps k1 to k5 for all the transmission-object image files 30. That is, when it has been determined in steps k1 to k5 that none of the plurality of transmission-object image files 30 satisfies the transmission condition, the data transmission/reception processing unit 204 determines that there is no image file 30 to transmit (step k7: No). Processing then returns to step i23 of Figure 17 and proceeds to step i25.
On the other hand, if there are image files 30 that satisfy the transmission condition, the data transmission/reception processing unit 204 determines that there are image files 30 to transmit (step k7: Yes). In that case, the image file transmission processing unit 243 appends a face location request flag "ON" to those of the image files 30 to be transmitted that are registered as photographed-person images, that is, the image files 30 whose image ID 31 is set in the photographed-person-image transmission information 175 (step k8). It also appends a face location request flag "OFF" to those of the image files 30 to be transmitted that are not registered as photographed-person images, that is, the image files 30 whose image ID 31 is not set in the photographed-person-image transmission information 175 (step k9). The image file transmission processing unit 243 then collectively transmits the image files 30 to be transmitted to the counterpart camera 1 via the wireless communication unit 19 (step k10). The data transmission/reception processing unit 204 then waits until a reception end notification is received.
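Steps k1 to k10 can be summarized as a filter-then-flag-then-send sequence. In the sketch below, send_check() stands in for the per-file transmission determination of Figure 14 (for example, the satisfies_send_condition sketch given earlier), and the message shape passed to conn.send() is an assumption:

```python
def batch_send(send_candidates, send_check, person_image_info, conn):
    """Unified transmission of Figure 19 (steps k1-k10) in outline."""
    to_send = [f for f in send_candidates if send_check(f)]   # loop B (k1-k5)
    if not to_send:
        return False                                          # k7: nothing to transmit
    for image_file in to_send:                                # k8/k9: attach request flags
        registered = image_file["image_id"] in person_image_info
        image_file["face_location_request"] = "ON" if registered else "OFF"
    conn.send({"type": "image_files", "files": to_send})      # k10: unified transmission
    return True
```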
If, for example, no reception end notification is received from the counterpart camera 1 even after a predetermined time has elapsed (step k11: No), the data transmission/reception processing unit 204 determines that transmission of the image files 30 has failed and performs warning notification processing (step k13). For example, as in step e9 of Figure 13, the data transmission/reception processing unit 204 performs processing for displaying on the display unit 15 a message indicating that transmission of the image files 30 to the counterpart camera 1 has failed. Alternatively, a warning tone may be output, or these notifications may be combined.
In addition, the communication history list updating unit 242 deletes, from the communication history list 36 of each image file 30 to be transmitted, the communication history data 360 appended in the transmission determination processing of step k3, and updates the image files 30 (step k15). When communication history data 360 have been appended to a plurality of image files 30, the appended communication history data 360 are deleted from each of those image files 30. Processing then returns to step i23 of Figure 17 and proceeds to step i25.
On the other hand, when the reception end notification is received from the counterpart camera 1 (step k11: Yes), the camera further waits for face location data from the counterpart camera 1. If, for example, no face location data is received from the counterpart camera 1 even after a predetermined time has elapsed (step k17: No), processing returns to step i23 of Figure 17 and proceeds to step i25.
When face location data are received (step k17: Yes), the processing of loop C is performed for each item of received face location data (steps k19 to k23). That is, face/device correspondence processing is performed for each item of face location data (step k21). This face/device correspondence processing is performed in the same way as the processing procedure shown in Figure 15. When the processing of loop C (that is, the face/device correspondence processing) has been performed for all the received face location data, processing returns to step i23 of Figure 17 and proceeds to step i25.
In step i25, the control unit 20 determines whether to end the reproduction mode; when the mode is to be ended (step i25: Yes), processing moves to step i27. When the reproduction mode is not to be ended (step i25: No), processing returns to step i5.
In step i27, the control unit 20 places the wireless communication unit 19 in a stopped state (off). If the wireless communication unit 19 is not communicating at this time, no particular processing is performed. The reproduction mode processing then ends, and processing returns to step b7 of Figure 10 and proceeds to step b9.
As described above, according to this embodiment, when an image file 30 of a photographed-person image showing a person's face has been transmitted to a counterpart camera 1 and face location data are received from that counterpart camera 1, the face in the transmitted photographed-person image can be identified according to the received face location data. The person corresponding to the identified face can then be registered as the owner of the counterpart camera 1. Here, the face location data indicate the face that the counterpart camera 1 which received the image file 30 has recognized in the image data as the face of its own user (that is, the face of the user of the counterpart camera 1).
In addition, when the image data of a photographed-person image is displayed for confirmation or reproduced, processing can be performed to composite, at a position near each sent face (among the faces in the image data 33, a face for which the image data — in practice, the image file 30 — has been transmitted to the external device (another digital camera 1) held by the person corresponding to that face), a mark indicating that it has been sent, and to display the result on the display unit 15.
On the other hand, when an image file with the face location request flag "ON" appended is received from a counterpart camera 1 and the person recognition unit 121 recognizes the face of this camera's user shown in the image data, face location data in which the face location and the image ID are associated with each other can be transmitted to the counterpart camera 1.
Therefore, the user can confirm whether the image data has been transmitted to the other camera 1 held by the person corresponding to a face in the image data. This has the effect of reducing the waste of repeatedly transmitting the same content data to the same external device.
In the embodiment described above, among the faces shown in a photographed-person image, a "sent" mark is added near each face whose image data has been transmitted to the external device held by that person, so that sent faces can be distinguished in the display; however, the present invention is not limited to this.
Figure 20 shows an example of the display screen of a photographed-person image in variation 1. In variation 1, as shown in Figure 20, when a photographed-person image is displayed for confirmation or reproduced, the display screen is divided into two parts. The photographed-person image is displayed in the upper image display frame W21. In the lower thumbnail display frame W23, thumbnail images S21 and S23 of faces are displayed; the faces shown there are, among the faces shown in the photographed-person image displayed in the image display frame W21, the faces whose image data has been transmitted. As the internal processing in this case, for example, the control unit 20 first reads, from the photographed-person-image transmission information 175, the face numbers that correspond to the image ID of the photographed-person image and whose image-sent flag is "ON", and reads the face locations from the corresponding face data 350. The image processing unit 12 then cuts out the face areas from the image data 33 according to the read face locations. The control unit 20 then performs processing for displaying the face areas as thumbnails in the thumbnail display frame W23.
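The internal processing of variation 1 can be sketched as follows, assuming a Pillow image and rectangular face positions (the thumbnail size is illustrative); variation 2, described below, differs only in inverting the flag test, as noted in the comment.

```python
def sent_face_thumbnails(image, detected_faces, per_face_flags, size=(64, 64)):
    """Collect thumbnails of the faces whose image-sent flag is ON,
    for the lower thumbnail display frame W23."""
    thumbs = []
    for face_no, face in detected_faces.items():
        if not per_face_flags.get(face_no, {}).get("sent", False):
            continue                        # variation 2 would invert this test
        left, top, right, bottom = face["position"]
        thumbs.append(image.crop((left, top, right, bottom)).resize(size))
    return thumbs
```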
A configuration may also be adopted in which, when the counterpart device data acquisition unit 202 acquires device data from a counterpart camera 1, it also acquires owner data, such as the name of the owner of that counterpart camera 1, together with the device data. Alternatively, a configuration may be adopted in which owner data of this camera is also recorded as part of the own-device data 171 and, in response to a device data send request notified by a counterpart camera 1, the owner data is transmitted to the counterpart camera 1 together with the device data. In this case, when the face areas of sent faces are displayed as thumbnails in the thumbnail display frame W23, the owner information is additionally displayed as text.
In variation 1, sent faces are displayed as thumbnails in the thumbnail display frame W23. In contrast, unsent faces may be displayed as thumbnails instead. Figure 21 shows an example of the display screen of a photographed-person image in variation 2. In variation 2, as shown in Figure 21, when a photographed-person image is displayed for confirmation or reproduced, the display screen is divided into two parts, as in the display shown in Figure 20. The photographed-person image is displayed in the upper image display frame W31. In the lower thumbnail display frame W33, a thumbnail image S31 of a face is displayed; the face shown there is, among the faces shown in the photographed-person image displayed in the image display frame W31, a face whose image data has not been transmitted. As the internal processing in this case, for example, the control unit 20 first reads, from the photographed-person-image transmission information 175, the face numbers that correspond to the image ID of the photographed-person image and whose image-sent flag is "OFF", and reads the face locations from the corresponding face data 350. The image processing unit 12 then cuts out the face areas from the image data 33 according to the read face locations. The control unit 20 then performs processing for displaying the face areas as thumbnails in the thumbnail display frame W33.
In addition, when no face location data is received from the counterpart camera 1, or when an image file 30 of an image other than a photographed-person image, containing no face, is transmitted to the counterpart camera 1, if device data acquired from that counterpart camera 1 is set in the face/device correspondence table 177, the corresponding face image data may be presented as the face of the owner of the external device to which the image file 30 was transmitted.
Figure 22 is a flowchart showing the detailed procedure of the data transmission processing in variation 3. In Figure 22, the same processing steps as in the embodiment described above are given the same labels. In variation 3, as shown in Figure 22, when no face location data is received from the counterpart camera 1 (step e13: No), the control unit 20 goes on to determine whether the device data of the counterpart camera 1 is set in the face/device correspondence table 177. Here, the presence of the device data of the counterpart camera 1 in the face/device correspondence table 177 indicates that a photographed-person image was transmitted to that counterpart camera 1 in the past, that face location data were received from the counterpart camera 1 in response, and that the face of the owner of the counterpart camera 1 has therefore been registered in this digital camera 1. Accordingly, when the device data of the counterpart camera 1 is set in the face/device correspondence table 177 (step l17: Yes), the control unit 20 performs processing for displaying on the display unit 15, for confirmation, the face image data set in the face/device correspondence table 177 in association with the device data of the counterpart camera 1 together with the transmitted image data 33 (step l19). On the other hand, when the device data of the counterpart camera 1 is not set in the face/device correspondence table 177 (step l17: No), the control unit 20 performs processing for displaying the transmitted image data 33 on the display unit 15 for confirmation (step l21).
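The branch added in variation 3 is, in effect, a table lookup keyed by the counterpart's device data. A sketch under assumed data shapes; the returned dictionary merely stands for the two display frames shown in Figure 23.

```python
def confirmation_display(sent_image, counterpart_device_id, face_device_table):
    """Variation 3 branch (steps l17-l21 in outline): show the registered
    owner's face next to the transmitted image when the counterpart is known."""
    owner_face = face_device_table.get(counterpart_device_id)                # l17
    if owner_face is not None:
        return {"upper_frame": sent_image, "lower_thumbnail": owner_face}    # l19
    return {"upper_frame": sent_image, "lower_thumbnail": None}              # l21
```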
Figure 23 shows an example of the display screen when the image data 33 is displayed for confirmation in step l19. In variation 3, as shown in Figure 23, when the image transmitted to the counterpart camera 1 is displayed for confirmation, the display screen is divided into two parts in the same way as the display screens illustrated in Figures 20 and 21. The transmitted image is displayed in the upper image display frame W41; in Figure 23, an image of scenery containing no person is shown. In the lower thumbnail display frame W43, a thumbnail S41 of the face of the owner of the counterpart camera 1 to which the image was transmitted (that is, the face image data set in the face/device correspondence table 177 in association with the device data of that counterpart camera 1) is displayed.
Thus, according to variation 3, as long as face image data has been registered in the face/device correspondence table 177 for the owner of an external device (another digital camera 1), the face of the user of that counterpart camera 1 can be displayed on the screen when this digital camera subsequently communicates with it and transmits an image file 30.
In the above embodiment, the case where image files 30 are exchanged with another digital camera 1 as the communication counterpart has been described, but the external device serving as the communication counterpart is not particularly limited. Figure 24 shows a situation in which the digital camera 1 communicates with a personal computer (notebook computer) 90 as the communication counterpart. The personal computer 90 incorporates a wireless communication unit 901 capable of communicating with the wireless communication unit 19 of the digital camera 1. The wireless communication unit 901 is arranged so that its antenna is located on the keyboard surface side. When the digital camera 1 and the personal computer 90 come close enough to each other that the wireless communication unit 19 and the wireless communication unit 901 can communicate, communication is established between them. When an external device such as a personal computer is the communication counterpart (the transmission target) in this way, it is sufficient for that external device to perform the data reception processing of Figure 16 in response to the device data send request notified from the digital camera 1.
In the above embodiment, a digital camera has been described as an example of the portable device of the present invention, but the invention is also applicable to other portable devices that can exchange data with external devices, such as mobile phones with a camera function, game machines, music players, recording devices, and notebook computers.
According to the present invention, information relating to the face of the user of an external device that is contained in transmitted image data can be acquired from the external device to which the image data containing a subject's face was transmitted. The display of the display unit can then be updated according to the acquired information relating to the face. Therefore, the user can identify, among the faces in the image data, the face of the user of the external device to which the image data was transmitted. This has the effect of reducing the waste of repeatedly transmitting the same content data to the same external device.

Claims (5)

1. A portable device, characterized in that the portable device comprises:
an imaging unit that photographs a subject containing a face and generates image data;
a display unit that displays the image data generated by the imaging unit;
a communication unit that communicates with an external device;
a transmission processing unit that performs processing for transmitting the image data displayed on the display unit to the external device via the communication unit; and
a display processing unit that performs the following processing: after the transmission processing unit has transmitted the image data, acquiring from the external device, via the communication unit, information relating to a face, contained in the image data, of a user of the external device, and updating the display of the display unit according to the acquired information relating to the face.
2. The portable device according to claim 1, characterized in that
the display processing unit performs processing for updating the display of the display unit by performing, in the image data displayed on the display unit, display in which the face indicated by the information relating to the face can be identified.
3. The portable device according to claim 2, characterized in that
the display in which the face can be identified is performed by displaying the face indicated by the information relating to the face outside the display frame of the image data, or by displaying, within the image data, a frame surrounding the face indicated by the information relating to the face.
4. The portable device according to claim 1, characterized in that
the information relating to the face is positional information of the face contained in the image data that has undergone transmission processing by the transmission processing unit.
5. The portable device according to claim 1, characterized in that
the portable device has a storage unit that stores a device/face correspondence table in which face image data is set in association with device data for identifying the external device,
the information relating to the face is device data identifying the external device that communicates via the communication unit, and
the display processing unit performs the following processing: reading from the device/face correspondence table the face image data corresponding to the device data, the device data having been acquired as the information relating to the face, and updating the display of the display unit according to the read face image data.
CN2010101216024A 2009-03-24 2010-03-11 Portable device Expired - Fee Related CN101848324B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210244282.0A CN102769733B (en) 2009-03-24 2010-03-11 Portable equipment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009072415A JP5331532B2 (en) 2009-03-24 2009-03-24 Portable device
JP2009-072415 2009-03-24

Related Child Applications (2)

Application Number Title Priority Date Filing Date
CN201210244285.4A Division CN102769734B (en) 2009-03-24 2010-03-11 Portable equipment
CN201210244282.0A Division CN102769733B (en) 2009-03-24 2010-03-11 Portable equipment

Publications (2)

Publication Number Publication Date
CN101848324A CN101848324A (en) 2010-09-29
CN101848324B true CN101848324B (en) 2012-09-05

Family

ID=42772767

Family Applications (3)

Application Number Title Priority Date Filing Date
CN2010101216024A Expired - Fee Related CN101848324B (en) 2009-03-24 2010-03-11 Portable device
CN201210244282.0A Expired - Fee Related CN102769733B (en) 2009-03-24 2010-03-11 Portable equipment
CN201210244285.4A Expired - Fee Related CN102769734B (en) 2009-03-24 2010-03-11 Portable equipment

Family Applications After (2)

Application Number Title Priority Date Filing Date
CN201210244282.0A Expired - Fee Related CN102769733B (en) 2009-03-24 2010-03-11 Portable equipment
CN201210244285.4A Expired - Fee Related CN102769734B (en) 2009-03-24 2010-03-11 Portable equipment

Country Status (2)

Country Link
JP (1) JP5331532B2 (en)
CN (3) CN101848324B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5769434B2 (en) * 2011-02-03 2015-08-26 キヤノン株式会社 Movie recording device, information processing device
JP5717459B2 (en) * 2011-02-04 2015-05-13 キヤノン株式会社 Image recording apparatus, information processing apparatus, control method thereof, and program thereof
TWI575978B (en) * 2011-07-05 2017-03-21 宏達國際電子股份有限公司 Wireless service providing method
CN103591894B (en) * 2013-11-05 2017-07-11 广东欧珀移动通信有限公司 The method and apparatus of object length is measured by camera

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004201191A (en) * 2002-12-20 2004-07-15 Nec Corp Image processing and transmitting system, cellular phone, and method and program for image processing and transmission
JP4522344B2 (en) * 2004-11-09 2010-08-11 キヤノン株式会社 Imaging apparatus, control method thereof, and program thereof
JP2006293912A (en) * 2005-04-14 2006-10-26 Toshiba Corp Information display system, information display method and portable terminal device
JP4315148B2 (en) * 2005-11-25 2009-08-19 株式会社ニコン Electronic camera

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1574957A (en) * 2003-05-29 2005-02-02 卡西欧计算机株式会社 Photographed image transmitting apparatus
CN1836439A (en) * 2003-09-01 2006-09-20 松下电器产业株式会社 Camera having transmission function, mobile telephone device, and image data acquiring/transmitting program
CN101010941A (en) * 2004-09-01 2007-08-01 Electronic camera system, photographing ordering device and photographing system
CN101453605A (en) * 2007-12-07 2009-06-10 佳能株式会社 Imaging device and control method thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JP 2006-166408 A (Japanese Unexamined Patent Application Publication) 2006.06.22

Also Published As

Publication number Publication date
CN102769733B (en) 2015-07-08
CN102769734B (en) 2015-04-15
CN102769733A (en) 2012-11-07
CN101848324A (en) 2010-09-29
JP5331532B2 (en) 2013-10-30
CN102769734A (en) 2012-11-07
JP2010226498A (en) 2010-10-07

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C41 Transfer of patent application or patent right or utility model
TR01 Transfer of patent right

Effective date of registration: 20151125

Address after: Tokyo, Japan

Patentee after: Olympus Corporation

Address before: Tokyo, Japan

Patentee before: Olympus Imaging Corp.

CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20120905

Termination date: 20190311

CF01 Termination of patent right due to non-payment of annual fee