
JP2012053366A - Imaging apparatus and program - Google Patents

Imaging apparatus and program

Info

Publication number
JP2012053366A
Authority
JP
Japan
Prior art keywords
user
imaging
situation
housing
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2010197240A
Other languages
Japanese (ja)
Other versions
JP5633679B2 (en)
JP2012053366A5 (en)
Inventor
Tetsuya Hayashi
林  哲也
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Casio Computer Co Ltd filed Critical Casio Computer Co Ltd
Priority to JP2010197240A priority Critical patent/JP5633679B2/en
Publication of JP2012053366A publication Critical patent/JP2012053366A/en
Publication of JP2012053366A5 publication Critical patent/JP2012053366A5/ja
Application granted granted Critical
Publication of JP5633679B2 publication Critical patent/JP5633679B2/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Exposure Control For Cameras (AREA)
  • Details Of Cameras Including Film Mechanisms (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)
  • Indication In Cameras, And Counting Of Exposures (AREA)
  • Studio Devices (AREA)

Abstract

PROBLEM TO BE SOLVED: To realize an imaging apparatus capable of automatic start-up that matches the user's intention.
SOLUTION: A first touch sensor 19, a second touch sensor 20, and a first imaging section 13 that images the surroundings on the rear side of a housing are provided on the rear side of the housing, while a third touch sensor and a second imaging section that images the surroundings on the front side of the housing are provided on the front side of the housing. When the imaging apparatus 100 enters an idle state (power off), a control section 10 automatically starts the apparatus in the operation mode (a shooting mode for photographing a subject or a playback mode for displaying captured images) that matches the user's behavior, inferred from the touch detection results of the first touch sensor 19 to the third touch sensor and from the surrounding conditions obtained from the imaging outputs of the first imaging section 13 and the second imaging section. Automatic start-up that matches the user's intention can thereby be performed.

Description

The present invention relates to an imaging apparatus, and to a program, capable of automatic start-up that matches the user's intention.

Techniques are known for quickly setting an imaging apparatus that has been powered off and is in a shooting-stopped state into a state in which shooting is possible. As an example of this type of technique, Patent Document 1 discloses providing a touch sensor on the grip of a camera body; when the touch sensor detects that the user has gripped the grip, power is supplied to each section and the camera immediately shifts to a ready-to-shoot state.

Patent Document 1: JP 2003-295282 A

However, with the technique disclosed in Patent Document 1, the apparatus uniformly shifts to the ready-to-shoot state whenever the user merely grips the camera body, even when the user has no intention of shooting. The problem is therefore that automatic start-up matching the user's intention cannot be performed.

The present invention has been made in view of such circumstances, and an object thereof is to provide an imaging apparatus and a program capable of automatic start-up that matches the user's intention.

To achieve the above object, the invention according to claim 1 comprises: first situation acquisition means for acquiring the situation around the rear side of the apparatus housing; second situation acquisition means for acquiring the situation around the front side of the apparatus housing; gripping state detection means for detecting a state in which the user grips the apparatus housing; and start-up control means for automatically starting the apparatus in an operation mode that matches the user's behavior inferred from the rear-side surrounding situation acquired by the first situation acquisition means, the front-side surrounding situation acquired by the second situation acquisition means, and the state, detected by the gripping state detection means, in which the user grips the apparatus housing.

In the invention according to claim 2, which depends on claim 1, the start-up control means automatically starts the apparatus, as the operation mode matching the user's behavior, in either a shooting mode for acquiring captured images or a playback mode for displaying the acquired captured images.

In the invention according to claim 3, which depends on claim 1, the first situation acquisition means has face detection means for determining whether a user's face has been detected in the acquired rear-side surrounding situation.

In the invention according to claim 4, which depends on claim 3, the face detection means further comprises line-of-sight detection means for detecting, when a user's face is detected in the rear-side surrounding situation, the direction of the line of sight of the detected face.

In the invention according to claim 5, which depends on claim 1, the second situation acquisition means has subject detection means for determining whether a subject has been detected in the acquired front-side surrounding situation.

In the invention according to claim 6, which depends on claim 1, the gripping state detection means detects the user's contact operations at a plurality of locations on the apparatus housing and, from the detected contact operations, detects the state in which the user grips the apparatus housing.

The invention according to claim 7 causes a computer to execute: a first situation acquisition step of acquiring the situation around the rear side of the apparatus housing; a second situation acquisition step of acquiring the situation around the front side of the apparatus housing; a gripping state detection step of detecting a state in which the user grips the apparatus housing; and a start-up control step of automatically starting the apparatus in an operation mode that matches the user's behavior inferred from the rear-side surrounding situation acquired in the first situation acquisition step, the front-side surrounding situation acquired in the second situation acquisition step, and the state, detected in the gripping state detection step, in which the user grips the apparatus housing.

According to the present invention, automatic start-up that matches the user's intention can be performed.

FIG. 1 is a perspective view showing the appearance of an imaging apparatus 100 according to an embodiment.
FIG. 2 is a front view showing the appearance of the imaging apparatus 100.
FIG. 3 is a block diagram showing the overall configuration of the imaging apparatus 100.
FIG. 4 is a flowchart showing the operation of the main-power-off processing executed by the control unit 10 of the imaging apparatus 100.

Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
A. Configuration
(1) Appearance Configuration
FIG. 1 and FIG. 2 are, respectively, a perspective view showing the rear-side appearance and a front view of an imaging apparatus 100 according to an embodiment of the present invention. As shown in FIG. 1, on the rear side of the imaging apparatus 100, a first touch sensor 19 and a second touch sensor 20 are arranged at the locations gripped by the fingers of the left and right hands when the user holds the apparatus housing B, and a first imaging unit 13 that images the surroundings on the rear side of the apparatus housing B is also provided. On the front side of the apparatus housing B, as shown in FIG. 2, a third touch sensor 21 is provided at the location gripped by the fingers of the right hand when the user holds the apparatus housing B, and a second imaging unit 14 that images the surroundings on the front side of the apparatus housing B is provided near the objective lens of a third imaging unit 15 used exclusively for imaging the subject. The purpose of providing the first imaging unit 13, the second imaging unit 14, and the first to third touch sensors 19 to 21 will be described later.

(2) Electrical Configuration
FIG. 3 is a block diagram showing the electrical configuration of the imaging apparatus 100. In FIG. 3, a control unit 10 comprises a CPU and the like and controls each unit of the apparatus. The characteristic processing operations of the control unit 10 relating to the gist of the present invention will be described in detail later. An operation unit 11 includes operators such as a power switch that instructs a power shutdown with a long press and instructs power on/off in response to an on/off operation, a mode dial that selectively switches between a shooting mode and a playback mode, cursor keys operated when pointing at menu items and icons displayed on the screen of a display unit 17, a zoom lever operated for zooming, and a shutter button that generates a focusing instruction when pressed halfway and a shooting instruction when pressed fully; the operation unit 11 generates events corresponding to operations of these operators. The events generated by the operation unit 11 are captured by the control unit 10.

A power supply unit 12 shuts down, or turns on and off, the power supplied to each unit of the apparatus in accordance with control signals generated by the control unit 10 based on operation of the power switch. Specifically, in the shutdown state power supply to all units of the apparatus is cut off; in the power-on (active) state power is supplied to all units; and in the power-off idle state power is supplied to every unit except the third imaging unit 15 and the display unit 17.
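As a rough illustration of the three power states just described, the following Python sketch maps each state to the units that remain powered; the enum, the unit names, and the helper function are invented for illustration and are not part of the patent.

    # Minimal sketch of the power states handled by the power supply unit 12.
    # All names are illustrative; the patent defines no concrete API.
    from enum import Enum, auto

    UNITS = {"control", "operation", "imaging1", "imaging2", "imaging3",
             "image_processing", "display", "storage",
             "touch1", "touch2", "touch3"}

    class PowerState(Enum):
        SHUTDOWN = auto()   # power cut to every unit
        ACTIVE = auto()     # power on: every unit supplied
        IDLE = auto()       # "power off": all units except imaging3 and display

    def powered_units(state: PowerState) -> set:
        """Return the set of units that receive power in the given state."""
        if state is PowerState.SHUTDOWN:
            return set()
        if state is PowerState.ACTIVE:
            return set(UNITS)
        # IDLE: the third imaging unit and the display stay unpowered, while
        # the touch sensors and the first/second imaging units keep running
        # so the apparatus can watch for signs of user intent.
        return UNITS - {"imaging3", "display"}

    if __name__ == "__main__":
        print(sorted(powered_units(PowerState.IDLE)))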

The first imaging unit 13 images the surroundings on the rear side of the apparatus housing B and generates an imaging signal. The second imaging unit 14 images the surroundings on the front side of the apparatus housing B and generates an imaging signal. The third imaging unit 15 implements a zoom function and an autofocus function, images a subject focused in response to user operation, and generates an imaging signal.

An image processing unit 16 selectively performs image processing on the imaging signals output from the first to third imaging units 13 to 15 in accordance with instructions from the control unit 10. Specifically, when, in the idle state (power off), the control unit 10 detects a shooting action by the user (described later) based on the outputs of the first to third touch sensors 19 to 21 described later, the image processing unit 16, in accordance with instructions from the control unit 10, generates a rear image and a front image based on the imaging signals output from the first imaging unit 13 and the second imaging unit 14, respectively.

Also, when, in the idle state (power off), the control unit 10 detects a holding action by the user (described later) based on the outputs of the second touch sensor 20 and the third touch sensor 21 described later, the image processing unit 16, in accordance with instructions from the control unit 10, performs face detection on the rear image generated from the imaging signal output from the first imaging unit 13. When a user's face is detected in the rear image by the face detection, line-of-sight detection is performed to detect the direction of the user's gaze according to the orientation of the detected face and the positions of the pupils.

Further, when, in the idle state (power off), the control unit 10 detects a gripping action (described later) based on the output of the third touch sensor 21 described later, the image processing unit 16, in accordance with instructions from the control unit 10, performs subject detection on the front image generated from the imaging signal output from the second imaging unit 14.

In addition, in the shooting mode in the active state (power on), the image processing unit 16 converts the captured image generated from the imaging signal output from the third imaging unit 15 into a video signal and supplies it to the display unit 17 so that the display unit 17 functions as a viewfinder; when the control unit 10 issues a shooting instruction in response to the shutter button being pressed, the one frame of captured image taken in at that moment is compression-coded and recorded in a storage unit 18. In the playback mode in the active state (power on), on the other hand, a compressed captured image selectively read out from the storage unit 18 in response to user operation is expanded, converted into a video signal, and reproduced and displayed on the display unit 17.
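The two active-state modes described above can be pictured with the following hedged sketch; the class, the zlib codec, and the method names are stand-ins invented for illustration, since the patent describes the behaviour of the image processing unit 16, the display unit 17, and the storage unit 18 only in prose.

    # Illustrative sketch of the shooting and playback modes in the active
    # state; every interface here is an assumption, and zlib merely stands in
    # for whatever compression coding the apparatus actually uses.
    import zlib

    class ImagingApparatusModes:
        def __init__(self):
            self.storage = []  # stands in for the storage unit 18

        def shooting_mode_frame(self, frame: bytes, display, shutter_pressed: bool):
            # The live frame from the third imaging unit is routed to the
            # display, which acts as a viewfinder.
            display(frame)
            if shutter_pressed:
                # On a full shutter press the current frame is
                # compression-coded and recorded to storage.
                self.storage.append(zlib.compress(frame))

        def playback_mode_show(self, index: int, display):
            # In playback mode a selected compressed image is expanded and
            # shown again on the display unit.
            display(zlib.decompress(self.storage[index]))

    if __name__ == "__main__":
        cam = ImagingApparatusModes()
        cam.shooting_mode_frame(b"raw-frame-bytes", display=print, shutter_pressed=True)
        cam.playback_mode_show(0, display=print)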

The display unit 17 comprises a color liquid crystal panel or the like and, under the control of the control unit 10, displays on its screen the video signal generated by the image processing unit 16 described above. The storage unit 18 comprises a ROM, a RAM, a flash memory, and the like; it is used as an area for storing the various control programs and control data executed by the control unit 10, as a work area for the control unit 10, and as a DMA transfer area and VRAM area for the image processing unit 16, and is also used as an internal memory (or external memory) for recording and storing compressed captured images.

The first to third touch sensors 19 to 21 are well-known touch switches, each of the capacitive type. The first touch sensor 19 and the second touch sensor 20 detect a single-contact touch operation. The third touch sensor 21 detects a single-contact touch operation or a multi-contact (multi-touch) operation. The detection outputs of the first to third touch sensors 19 to 21 are taken into the control unit 10, which estimates the user's action from them. For example, when the first touch sensor 19 and the second touch sensor 20 both detect a touch operation and the third touch sensor 21 detects a multi-touch operation, this is regarded as a shooting action in which the user holds the apparatus housing B with the fingers of both hands.
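The way the three touch outputs might be combined into the postures used later in the flowchart of FIG. 4 can be sketched as follows; the function name, the returned labels, and the priority order are assumptions read off steps S1, S3, and S6 of the embodiment.

    # Sketch of posture estimation from the three touch sensors.
    # touch1/touch2 are the rear single-touch sensors, touch3 the front sensor
    # that can also report multi-touch; names and ordering are illustrative.
    def estimate_posture(touch1: bool, touch2: bool,
                         touch3_touch: bool, touch3_multi: bool) -> str:
        if touch1 and touch2 and touch3_multi:
            # Both rear sensors touched and a multi-touch on the front:
            # the user is framing a shot with both hands (step S1).
            return "shooting"
        if touch2 and touch3_touch:
            # Rear-right and front sensors touched: the user is holding the
            # housing with both hands, e.g. to look at the screen (step S3).
            return "holding"
        if touch3_touch:
            # Only the front sensor touched: a one-handed right-hand grip (step S6).
            return "gripping"
        return "none"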

B. Operation
Next, the operation of the imaging apparatus 100 configured as described above will be explained with reference to FIG. 4. The flowchart shown in FIG. 4 shows the main-power-off processing executed by the control unit 10 of the imaging apparatus 100 in the idle state (power off). When, in response to the user's operation of the power switch, the power supply unit 12 enters the idle state in which power is supplied to every unit except the third imaging unit 15 and the display unit 17, this processing is executed at regular intervals.

When the execution timing of this processing arrives, the control unit 10 advances to step S1 and determines whether the first touch sensor 19 and the second touch sensor 20 both detect a touch operation and the third touch sensor 21 detects a multi-touch operation, that is, whether the user is performing a shooting action in which the apparatus housing B is held with the fingers of both hands. If the user is performing a shooting action, the determination result here is "YES" and the process proceeds to step S2.

In step S2 it is determined whether the first imaging unit 13 and the second imaging unit 14 are both imaging darkness. For example, when the imaging apparatus 100 is stored in a cover case or the like, both the first imaging unit 13, which images the rear-side surroundings of the apparatus housing B, and the second imaging unit 14, which images the front-side surroundings, image darkness; the determination result is then "YES" and the idle state (power off) is maintained. In other words, when the shooting action is judged to have been made in a situation unsuitable for shooting, such as storage in a case, the idle state (power off) is maintained.
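The patent does not specify how "imaging darkness" is judged in step S2; one simple hypothetical test is to threshold the mean luminance of a frame from each of the two surrounding cameras, as in this sketch (the threshold value and the sample format are assumptions made for illustration).

    # Hypothetical darkness test for step S2; the mean-luminance threshold is
    # an arbitrary choice, as the patent only says both cameras image darkness.
    def frame_is_dark(luma_samples, threshold: float = 16.0) -> bool:
        """True when the average luminance of the sampled pixels is very low."""
        if not luma_samples:
            return True
        return sum(luma_samples) / len(luma_samples) < threshold

    def both_cameras_dark(rear_luma, front_luma) -> bool:
        # Step S2 keeps the apparatus idle only when BOTH the rear-side and the
        # front-side surrounding images are dark (e.g. the camera is in a case).
        return frame_is_dark(rear_luma) and frame_is_dark(front_luma)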

If, on the other hand, the first imaging unit 13 and the second imaging unit 14 are not both imaging darkness, the determination result in step S2 is "NO". In this case the shooting action is judged to have been made in a situation where shooting is possible, so the control unit 10 instructs the power supply unit 12 to turn the power on (active state) and instructs the image processing unit 16 to start the shooting mode in which the subject is photographed by the third imaging unit 15. As a result, the apparatus is automatically started in the shooting mode simply by the user holding up the apparatus housing B with the fingers of both hands.

On the other hand, if the user is not performing a shooting action, the determination result in step S1 is "NO" and the process proceeds to step S3. In step S3 it is determined whether the second touch sensor 20 and the third touch sensor 21 both detect a touch operation, that is, whether the user is performing a holding action in which the housing B is held with the fingers of both hands.

If the user is performing the holding action, the determination result is "YES" and the process proceeds to step S4, where the image processing unit 16 is instructed to perform face detection on the rear image generated from the imaging signal output from the first imaging unit 13, which images the rear-side surroundings of the apparatus housing B. Next, in step S5, it is determined whether the image processing unit 16 has detected the user's face in the rear image.

If the user's face cannot be detected in the rear image, the determination result in step S5 is "NO" and the process proceeds to step S7, described later. If, on the other hand, the user's face is detected in the rear image, the determination result in step S5 is "YES". In this case the holding action is judged to have been made in order to look at the display screen of the display unit 17, so the control unit 10 instructs the power supply unit 12 to turn the power on (active state) and instructs the image processing unit 16 to start the playback mode. As a result, the apparatus is automatically started in the playback mode simply by the user holding the apparatus housing B with the fingers of both hands.

Note that, when the image processing unit 16 detects the user's face in the rear image, a variant is also possible in which line-of-sight detection is performed to detect the direction of the gaze according to the orientation of the detected face and the positions of the pupils, and the power supply unit 12 is instructed to turn the power on (active state) and the image processing unit 16 is instructed to start the playback mode only when the detected gaze squarely faces the display screen of the display unit 17. In this way, the apparatus can be automatically started in the playback mode only when the user holds the apparatus housing B with the fingers of both hands and is actually looking at the display screen.
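A minimal sketch of the gaze-gated variant described above, assuming the line-of-sight detection yields a gaze direction vector and that the display screen faces the same way as the rear-side camera; the vector convention and the tolerance angle are arbitrary choices for illustration.

    # Illustrative gaze check for the variant above.  The gaze is assumed to
    # arrive as a 3-D vector in the rear camera's frame, which looks along +Z,
    # so a gaze directed back at the display screen points roughly along -Z.
    import math

    def gaze_faces_display(gaze_vector, tolerance_deg: float = 15.0) -> bool:
        gx, gy, gz = gaze_vector
        norm = math.sqrt(gx * gx + gy * gy + gz * gz)
        if norm == 0.0:
            return False
        # Angle between the gaze and the screen normal (0, 0, -1).
        cos_angle = -gz / norm
        return cos_angle >= math.cos(math.radians(tolerance_deg))

    def maybe_start_playback(face_detected: bool, gaze_vector) -> bool:
        # Start the playback mode only when a face is found in the rear image
        # AND its line of sight squarely faces the display screen.
        return face_detected and gaze_faces_display(gaze_vector)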

If, on the other hand, the user is not performing the holding action of holding the apparatus housing B with the fingers of both hands, the determination result in step S3 described above is "NO" and the process proceeds to step S6. In step S6 it is determined whether the third touch sensor 21 has detected a touch operation, that is, whether the user is performing a gripping action of gripping the apparatus housing B with the right hand only. If the gripping action is not being performed, the determination result is "NO" and the idle state (power off) is maintained.

If, on the other hand, the user is performing the gripping action of gripping the apparatus housing B with the right hand only, the determination result in step S6 is "YES" and the process proceeds to step S7. In step S7 the image processing unit 16 is instructed to perform subject detection on the front image generated from the imaging signal output from the second imaging unit 14, which images the front-side surroundings of the apparatus housing B, and it is then determined whether the image processing unit 16 has detected a subject in the front image.

If no subject is detected, the determination result is "NO" and the idle state (power off) is maintained. If a subject is detected, the determination result is "YES"; in this case the state is regarded as one in which the user is about to shoot while gripping the apparatus housing B with one hand (the right hand), so the control unit 10 instructs the power supply unit 12 to turn the power on (active state) and instructs the image processing unit 16 to start the shooting mode using the third imaging unit 15. As a result, the apparatus is automatically started in the shooting mode simply by the user gripping the apparatus housing B with the right hand.
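Putting steps S1 through S7 together, the idle-state routine of FIG. 4 can be summarised by the following hedged sketch; the sensor tuple, the detector callables (for which the darkness test sketched earlier could be substituted), and the start callback are invented interfaces, and only the branching mirrors the flowchart.

    # Hedged end-to-end sketch of the main-power-off processing of FIG. 4.
    # The callables stand for the darkness test on the first/second imaging
    # units and for the face/subject detectors of the image processing unit;
    # none of them are real APIs from the patent.
    def idle_state_cycle(sensors, both_cameras_dark, rear_face_detected,
                         front_subject_detected, start):
        t1, t2, t3_touch, t3_multi = sensors
        # S1: both rear sensors touched and front multi-touch -> shooting posture.
        if t1 and t2 and t3_multi:
            # S2: if both surrounding cameras see only darkness (e.g. the camera
            # is in its case), stay idle; otherwise start in the shooting mode.
            if not both_cameras_dark():
                start("shooting")
            return
        # S3: rear-right and front sensors touched -> holding posture.
        if t2 and t3_touch:
            # S4/S5: face detection on the rear image; a face means the user is
            # looking at the display, so start in the playback mode.
            if rear_face_detected():
                start("playback")
            # S5 "NO" falls through to the subject check of step S7.
            elif front_subject_detected():
                start("shooting")
            return
        # S6: only the front sensor touched -> one-handed grip.
        if t3_touch:
            # S7: a subject in the front image means the user intends to shoot.
            if front_subject_detected():
                start("shooting")
        # Otherwise the idle state (power off) is maintained.

The ordering of the checks follows the flowchart: the two-handed shooting posture takes precedence over the holding posture, which in turn takes precedence over the one-handed grip.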

As described above, in the present embodiment the first touch sensor 19, the second touch sensor 20, and the first imaging unit 13, which images the rear-side surroundings of the apparatus housing B, are provided on the rear side of the apparatus housing B, and the third touch sensor 21 and the second imaging unit 14, which images the front-side surroundings of the housing B, are provided on the front side of the apparatus housing B. When the imaging apparatus 100 is in the idle state (power off), it is automatically started in the operation mode that matches the user's behavior (action) inferred from the touch detection results of the first to third touch sensors 19 to 21 and from the surrounding conditions obtained from the imaging outputs of the first imaging unit 13 and the second imaging unit 14, so automatic start-up that matches the user's intention can be performed.

Further, in the present embodiment, the apparatus is automatically started, as the operation mode matching the user's behavior, in either the shooting mode for acquiring captured images or the playback mode for displaying the acquired captured images, so the shooting mode and playback mode that an imaging apparatus normally provides can be started automatically with high accuracy. In addition, since it is determined whether the user's face has been detected in the acquired rear-side surroundings, it can be determined whether the user is behaving so as to look at the display screen.

Moreover, in the present embodiment, when the user's face is detected in the rear-side surroundings, the direction of the gaze of the detected face is detected, so the behavior of the user trying to look at the display unit can be determined with still higher accuracy. Since it is determined whether a subject has been detected in the acquired front-side surroundings, it can be determined that the user may have the intention of shooting. Furthermore, since the user's contact operations (touch operations/multi-touch operations) at a plurality of locations on the housing are detected and the state in which the user grips the housing is detected from those contact operations, the manner in which the user grips the housing can be determined with high accuracy.

DESCRIPTION OF SYMBOLS
10 Control unit
11 Operation unit
12 Power supply unit
13 First imaging unit
14 Second imaging unit
15 Third imaging unit
16 Image processing unit
17 Display unit
18 Storage unit
19 First touch sensor
20 Second touch sensor
21 Third touch sensor
100 Imaging apparatus

Claims (7)

1. An imaging apparatus comprising:
first situation acquisition means for acquiring the situation around the rear side of the apparatus housing;
second situation acquisition means for acquiring the situation around the front side of the apparatus housing;
gripping state detection means for detecting a state in which a user grips the apparatus housing; and
start-up control means for automatically starting the apparatus in an operation mode that matches the user's behavior inferred from the rear-side surrounding situation acquired by the first situation acquisition means, the front-side surrounding situation acquired by the second situation acquisition means, and the state, detected by the gripping state detection means, in which the user grips the apparatus housing.

2. The imaging apparatus according to claim 1, wherein the start-up control means automatically starts the apparatus, as the operation mode matching the user's behavior, in either a shooting mode for acquiring captured images or a playback mode for displaying the acquired captured images.

3. The imaging apparatus according to claim 1, wherein the first situation acquisition means has face detection means for determining whether a user's face has been detected in the acquired rear-side surrounding situation.

4. The imaging apparatus according to claim 3, wherein the face detection means further comprises line-of-sight detection means for detecting, when a user's face is detected in the rear-side surrounding situation, the direction of the line of sight of the detected face.

5. The imaging apparatus according to claim 1, wherein the second situation acquisition means has subject detection means for determining whether a subject has been detected in the acquired front-side surrounding situation.

6. The imaging apparatus according to claim 1, wherein the gripping state detection means detects the user's contact operations at a plurality of locations on the apparatus housing and detects, from the detected contact operations, the state in which the user grips the apparatus housing.

7. A program causing a computer to execute:
a first situation acquisition step of acquiring the situation around the rear side of the apparatus housing;
a second situation acquisition step of acquiring the situation around the front side of the apparatus housing;
a gripping state detection step of detecting a state in which a user grips the apparatus housing; and
a start-up control step of automatically starting the apparatus in an operation mode that matches the user's behavior inferred from the rear-side surrounding situation acquired in the first situation acquisition step, the front-side surrounding situation acquired in the second situation acquisition step, and the state, detected in the gripping state detection step, in which the user grips the apparatus housing.
JP2010197240A 2010-09-03 2010-09-03 Imaging apparatus and program Expired - Fee Related JP5633679B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010197240A JP5633679B2 (en) 2010-09-03 2010-09-03 Imaging apparatus and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2010197240A JP5633679B2 (en) 2010-09-03 2010-09-03 Imaging apparatus and program

Publications (3)

Publication Number Publication Date
JP2012053366A true JP2012053366A (en) 2012-03-15
JP2012053366A5 JP2012053366A5 (en) 2013-10-03
JP5633679B2 JP5633679B2 (en) 2014-12-03

Family

ID=45906716

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010197240A Expired - Fee Related JP5633679B2 (en) 2010-09-03 2010-09-03 Imaging apparatus and program

Country Status (1)

Country Link
JP (1) JP5633679B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015077004A1 (en) * 2013-11-20 2015-05-28 Google Inc. Systems and methods for performing multi-touch operations on a head-mountable device
JP2016181773A (en) * 2015-03-23 2016-10-13 カシオ計算機株式会社 Display control device, display control method, and program

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07181574A (en) * 1993-12-24 1995-07-21 Nikon Corp Pressure sensitive switch
JPH11160776A (en) * 1997-12-01 1999-06-18 Fuji Photo Film Co Ltd Camera
JP2003319221A (en) * 2002-04-26 2003-11-07 Fuji Photo Film Co Ltd Image pickup device
JP2004341112A (en) * 2003-05-14 2004-12-02 Sony Corp Imaging unit
JP2008017130A (en) * 2006-07-05 2008-01-24 Matsushita Electric Ind Co Ltd Digital camera, program, and recording medium
US20080072172A1 (en) * 2004-03-19 2008-03-20 Michinari Shinohara Electronic apparatus with display unit, information-processing method, and computer product
US20080205866A1 (en) * 2007-01-24 2008-08-28 Shoichiro Sakamoto Image taking apparatus and image taking method
JP2008245105A (en) * 2007-03-28 2008-10-09 Nikon Corp Camera
WO2008126336A1 (en) * 2007-03-30 2008-10-23 Pioneer Corporation Image processing apparatus and method
JP2009111843A (en) * 2007-10-31 2009-05-21 Sony Corp Imaging apparatus and imaging method
JP2009180881A (en) * 2008-01-30 2009-08-13 Nikon Corp Electronic equipment and digital camera
JP2010004217A (en) * 2008-06-19 2010-01-07 Casio Comput Co Ltd Electronic device, operation assistance method, and program
US20100110265A1 (en) * 2008-11-05 2010-05-06 Sony Corporation Imaging apparatus and display control method thereof
US20100194931A1 (en) * 2007-07-23 2010-08-05 Panasonic Corporation Imaging device
US20110206361A1 (en) * 2010-02-22 2011-08-25 Empire Technology Development Llc Image taking device and image taking method

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07181574A (en) * 1993-12-24 1995-07-21 Nikon Corp Pressure sensitive switch
JPH11160776A (en) * 1997-12-01 1999-06-18 Fuji Photo Film Co Ltd Camera
JP2003319221A (en) * 2002-04-26 2003-11-07 Fuji Photo Film Co Ltd Image pickup device
JP2004341112A (en) * 2003-05-14 2004-12-02 Sony Corp Imaging unit
US20080072172A1 (en) * 2004-03-19 2008-03-20 Michinari Shinohara Electronic apparatus with display unit, information-processing method, and computer product
JP2008017130A (en) * 2006-07-05 2008-01-24 Matsushita Electric Ind Co Ltd Digital camera, program, and recording medium
US20080205866A1 (en) * 2007-01-24 2008-08-28 Shoichiro Sakamoto Image taking apparatus and image taking method
JP2008245105A (en) * 2007-03-28 2008-10-09 Nikon Corp Camera
WO2008126336A1 (en) * 2007-03-30 2008-10-23 Pioneer Corporation Image processing apparatus and method
US20100194931A1 (en) * 2007-07-23 2010-08-05 Panasonic Corporation Imaging device
JP2009111843A (en) * 2007-10-31 2009-05-21 Sony Corp Imaging apparatus and imaging method
JP2009180881A (en) * 2008-01-30 2009-08-13 Nikon Corp Electronic equipment and digital camera
JP2010004217A (en) * 2008-06-19 2010-01-07 Casio Comput Co Ltd Electronic device, operation assistance method, and program
US20100110265A1 (en) * 2008-11-05 2010-05-06 Sony Corporation Imaging apparatus and display control method thereof
JP2010114569A (en) * 2008-11-05 2010-05-20 Sony Corp Image capturing apparatus, and method of controlling display of the same
US20110206361A1 (en) * 2010-02-22 2011-08-25 Empire Technology Development Llc Image taking device and image taking method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015077004A1 (en) * 2013-11-20 2015-05-28 Google Inc. Systems and methods for performing multi-touch operations on a head-mountable device
US9261700B2 (en) 2013-11-20 2016-02-16 Google Inc. Systems and methods for performing multi-touch operations on a head-mountable device
CN105745568A (en) * 2013-11-20 2016-07-06 谷歌公司 Systems and methods for performing multi-touch operations on a head-mountable device
US9804682B2 (en) 2013-11-20 2017-10-31 Google Inc. Systems and methods for performing multi-touch operations on a head-mountable device
JP2016181773A (en) * 2015-03-23 2016-10-13 カシオ計算機株式会社 Display control device, display control method, and program

Also Published As

Publication number Publication date
JP5633679B2 (en) 2014-12-03

Similar Documents

Publication Publication Date Title
JP4510713B2 (en) Digital camera
JP5652652B2 (en) Display control apparatus and method
KR101624218B1 (en) Digital photographing apparatus and controlling method thereof
JP6331244B2 (en) Image pickup apparatus, image pickup apparatus control method, and computer program.
JP6700708B2 (en) Electronic device, control method thereof, program, and storage medium
JP5956836B2 (en) Imaging apparatus, display control method, and program
KR20170063388A (en) Electronic device and method for controlling the same
JP4628641B2 (en) Digital still camera
JP5940394B2 (en) Imaging device
JP2013179536A (en) Electronic apparatus and control method therefor
JP2005143108A (en) Digital camera performing automatic mode detection
JP2008017130A (en) Digital camera, program, and recording medium
CN113364945A (en) Electronic apparatus, control method, and computer-readable medium
KR101493591B1 (en) Imaging device, imaging processing method and storage medium
JP5633679B2 (en) Imaging apparatus and program
JP2012191570A (en) Photographing device, control method of the same and program
JP2008199107A (en) Digital camera, and method for controlling digital camera
JP5471133B2 (en) Imaging device
JP6123562B2 (en) Imaging device
JP2012124627A (en) Imaging apparatus
JP2008311821A (en) Image input processor, operation processing method, and program
JP2007004660A (en) Equipment
KR101662560B1 (en) Apparatus and Method of Controlling Camera Shutter Executing Function-Configuration and Image-Shooting Simultaneously
JP5344604B2 (en) Imaging apparatus and control method thereof
KR20080036399A (en) Method and device for photographing camera

Legal Events

Code Title Description
A521 Request for written amendment filed (Free format text: JAPANESE INTERMEDIATE CODE: A523; Effective date: 20130816)
A621 Written request for application examination (Free format text: JAPANESE INTERMEDIATE CODE: A621; Effective date: 20130816)
A977 Report on retrieval (Free format text: JAPANESE INTERMEDIATE CODE: A971007; Effective date: 20140530)
A131 Notification of reasons for refusal (Free format text: JAPANESE INTERMEDIATE CODE: A131; Effective date: 20140611)
A521 Request for written amendment filed (Free format text: JAPANESE INTERMEDIATE CODE: A523; Effective date: 20140729)
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model) (Free format text: JAPANESE INTERMEDIATE CODE: A01; Effective date: 20140917)
A61 First payment of annual fees (during grant procedure) (Free format text: JAPANESE INTERMEDIATE CODE: A61; Effective date: 20140930)
R150 Certificate of patent or registration of utility model (Ref document number: 5633679; Country of ref document: JP; Free format text: JAPANESE INTERMEDIATE CODE: R150)
LAPS Cancellation because of no payment of annual fees