Moving object shooting method and device
Technical Field
The invention relates to the technical field of camera shooting, in particular to a moving object shooting method and a moving object shooting device.
Background
Conventionally, a moving object is photographed by capturing the live view with an extremely short shutter time. However, when the shutter time is too short, the amount of light entering is often insufficient, and the resulting picture is dark or unclear. To compensate for the insufficient light intake, the picture brightness is generally raised by increasing the sensitivity (ISO), but increasing the ISO introduces new problems such as increased noise.
In the prior art, the common approach is to balance the trade-off between the shutter time and the ISO, which meets the shooting requirements for medium- and low-speed objects: the moving object is not rendered too dark, is only slightly blurred, and the noise remains within an acceptable range.
Therefore, it is necessary to provide a photographing method and apparatus for a moving object.
Disclosure of Invention
The invention mainly aims to provide a moving object shooting method and a moving object shooting device, so as to solve the prior-art problem of defocus (virtual focus) caused by the inability to focus when shooting a moving object.
In order to achieve the above object, the present invention provides a method for shooting a moving object, which is applied to a mobile terminal having a camera, and comprises the steps of: calculating the movement speed of the moving object; calculating the position coordinates of the moving object according to the moving speed; judging whether the position coordinate exceeds the visible range of the camera or not; if not, the moving object is subjected to motion tracking and shot.
Optionally, the motion tracking and shooting the moving object includes: calculating the shutter time; acquiring light sensitivity; and tracking the motion track of the moving object and shooting the moving object.
Optionally, the calculating the shutter time includes: acquiring an afterimage length threshold; calculating a time threshold according to the movement speed and the afterimage length threshold; and taking the shutter time as the maximum value smaller than the time threshold.
Optionally, the acquiring sensitivity comprises: calculating a sensitivity corresponding to the shutter time; or looking up the corresponding sensitivity according to the current ambient brightness requirement and a preset data table.
Optionally, the calculating the motion speed of the moving object includes: respectively acquiring a first position coordinate of the moving object at a first time and a second position coordinate of the moving object at a second time; and calculating the movement speed of the moving object according to the absolute value of the difference value between the first position coordinate and the second position coordinate and the time difference value between the second time and the first time.
In addition, to achieve the above object, the present invention further provides a moving object photographing device applied to a mobile terminal having a camera, the device including: a speed calculation module, used for calculating the movement speed of the moving object; an estimation module, used for calculating the position coordinates of the moving object according to the movement speed; a judging module, used for judging whether the position coordinates exceed the visible range of the camera; and a shooting module, used for performing motion tracking on the moving object and shooting when the judging module judges that the position coordinates do not exceed the visible range of the camera.
Optionally, the photographing module includes: a shutter time calculation unit for calculating a shutter time; a sensitivity acquisition unit configured to acquire sensitivity; and the shooting unit is used for tracking the motion track of the moving object and shooting the moving object.
Optionally, the shutter time calculation unit includes: an afterimage length threshold acquisition unit, used for acquiring an afterimage length threshold; a time threshold calculation unit, used for calculating a time threshold according to the movement speed and the afterimage length threshold; and a value taking unit, used for taking the shutter time as the maximum value smaller than the time threshold.
Optionally, the sensitivity acquiring unit is specifically configured to calculate the sensitivity corresponding to the shutter time, or to look up the corresponding sensitivity according to the current ambient brightness requirement and a preset data table.
Optionally, the speed calculation module includes: the position acquisition unit is used for respectively acquiring a first position coordinate of the moving object at a first time and a second position coordinate of the moving object at a second time; and the calculating unit is used for calculating the movement speed of the moving object according to the absolute value of the difference value between the first position coordinate and the second position coordinate and the time difference value between the second time and the first time.
The moving object shooting method and the moving object shooting device of the invention calculate the movement speed of the moving object, calculate the position coordinates of the moving object from that speed, and, when the position coordinates are judged not to exceed the visible range of the camera, perform motion tracking on the moving object and shoot it. By establishing a motion model of the object, the motion trajectory is predicted in advance and the model is used for tracking, so the problems of dark pictures, heavy noise, and defocus caused by insufficient light intake are avoided, and a superior tracking-shooting effect is achieved.
Drawings
Fig. 1 is a schematic diagram of the hardware structure of an optional mobile terminal for implementing various embodiments of the present invention;
Fig. 2 is a diagram of a wireless communication system for the mobile terminal shown in Fig. 1;
Fig. 3 is a flowchart of a moving object photographing method according to a first embodiment of the present invention;
Fig. 4 is a sub-flowchart of the moving object photographing method according to the first embodiment of the present invention;
Fig. 5 is a schematic diagram of the coordinates of a moving object at different positions;
Fig. 6 is a flowchart of a moving object photographing method according to a second embodiment of the present invention;
Fig. 7 is a flowchart of a moving object photographing method according to a third embodiment of the present invention;
Fig. 8 is a block diagram of a moving object photographing device according to a fourth embodiment of the present invention;
Fig. 9 is a block diagram of the speed calculation module of Fig. 8;
Fig. 10 is a block diagram of a moving object photographing device according to a fifth embodiment of the present invention;
Fig. 11 is a block diagram of a moving object photographing device according to a sixth embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are adopted only to facilitate the description of the present invention and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as stationary terminals such as digital TVs and desktop computers. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals, except for elements specifically intended for mobile purposes.
Fig. 1 is a schematic diagram of a hardware structure of an optional mobile terminal for implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an a/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc. Fig. 1 illustrates a mobile terminal having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. Elements of the mobile terminal will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information, or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and the like. The broadcast receiving module 111 may receive signals broadcast by various types of broadcasting systems; in particular, it may receive digital broadcasts from digital broadcasting systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), Media Forward Link Only (MediaFLO®), Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like. The broadcast receiving module 111 may also be constructed to be suitable for other broadcasting systems that provide broadcast signals, in addition to the above-mentioned digital broadcasting systems. The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access of the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technology to which the module relates may include WLAN (wireless LAN) (Wi-Fi), Wibro (wireless broadband), Wimax (worldwide interoperability for microwave access), HSDPA (high speed downlink packet access), and the like.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and so on.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of the location information module is a GPS (global positioning system). According to the current technology, the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current position information in real time.
The A/V input unit 120 is used to receive an audio or video signal. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the case of a phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device.
The interface unit 170 serves as an interface through which at least one external device can be connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user of the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter referred to as an "identification device") may take the form of a smart card; thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100, or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 may provide an output in the form of vibration; when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to inform the user thereof. By providing such a tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide an output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs and the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, and the like) that has been or will be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or XD memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, and the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to this point, the mobile terminal has been described in terms of its functionality. Hereinafter, for the sake of brevity, a slide-type mobile terminal will be described as an example among various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type mobile terminals. However, the present invention can be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to Fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in Fig. 2 may include multiple BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector being covered by an omni-directional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or by other equivalent terms. In such a case, the term "base station" may be used to refer collectively to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell". Alternatively, each sector of a particular BS 270 may be referred to as a plurality of cell sites.
As shown in Fig. 2, a Broadcast Transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 shown in Fig. 1 is provided at the mobile terminal 100 to receive the broadcast signal transmitted by the BT 295. In Fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellites 300 assist in locating at least one of the plurality of mobile terminals 100.
In Fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 shown in Fig. 1 is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Other techniques that can track the location of the mobile terminal may be used instead of, or in addition to, GPS tracking techniques. In addition, at least one GPS satellite 300 may alternatively or additionally handle satellite DMB transmissions.
In a typical operation of the wireless communication system, the BS 270 receives reverse link signals from various mobile terminals 100. The mobile terminals 100 are typically engaged in calls, messaging, and other types of communications. Each reverse link signal received by a particular BS 270 is processed within that BS 270. The resulting data is forwarded to the associated BSC 275. The BSC 275 provides call resource allocation and mobility management functions, including the coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC 280 interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward link signals to the mobile terminals 100.
Based on the above mobile terminal hardware structure and communication system, the present invention provides various embodiments of the method.
Example One
As shown in Fig. 3, a first embodiment of the present invention proposes a moving object photographing method applied to a mobile terminal having a camera, the method including:
In step 310, the movement speed of the moving object is calculated.
Specifically, referring to Fig. 4, step 310 further includes:
step 410, a first position coordinate of the moving object at a first time and a second position coordinate of the moving object at a second time are respectively obtained.
Specifically, while the object is in motion, it is sampled twice, at a first time and a second time; at the first time the object is at the first position, and at the second time it is at the second position.
Step 420, calculating the moving speed of the moving object according to the absolute value of the difference between the first position coordinate and the second position coordinate and the time difference between the second time and the first time.
Specifically, the coordinates of the object at the first position and the second position are calculated respectively, and then the absolute value of the difference between the coordinates of the first position and the coordinates of the second position is calculated, and the time difference between the second time and the first time is calculated.
The movement speed of the object is then determined using the formula: velocity (v) = displacement (s) ÷ time (t).
And step 320, calculating the position coordinates of the moving object according to the movement speed.
Specifically, according to the movement speed of the object, the formula displacement (s) = velocity (v) × time (t) is used to estimate the position coordinates of the object at the next movement time (the next time after the second time, for example, a third time).
Further, if the predicted position coordinates of the object at the next movement time are not accurate enough, they are updated and corrected in real time once the actual position of the object changes: the new data overwrite the previously predicted position, and self-calibration is performed to ensure the correctness of the calculated data. In this way, information such as the distance and speed of subsequent positions can be estimated conveniently and in a timely manner from the position information already available.
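As a rough illustration of steps 410-420 and the self-calibration described above, the following Python sketch shows one way the predict-and-correct loop could look. The function names and the tuple-based positions are illustrative assumptions, not part of the disclosure:

```python
def estimate_velocity(p1, p2, t1, t2):
    """Per-axis velocity from two samples: v = s / t."""
    dt = t2 - t1
    return tuple((b - a) / dt for a, b in zip(p1, p2))

def predict_position(p, v, dt):
    """Extrapolated coordinates after dt seconds: s = v * t."""
    return tuple(pi + vi * dt for pi, vi in zip(p, v))

def correct(predicted, measured):
    """Self-calibration: once the actual position is measured, it
    overwrites the earlier prediction so later estimates start from
    real data."""
    return measured if measured is not None else predicted

# Example: two samples 40 ms apart, then a prediction for the next frame.
v = estimate_velocity((0.0, 0.0), (0.02, 0.01), 0.00, 0.04)
c3 = predict_position((0.02, 0.01), v, 0.04)   # -> (0.04, 0.02)
```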
Step 330, determining whether the position coordinate exceeds the visible range of the camera. If yes, go to step 340; if not, go to step 350.
Specifically, in a preferred embodiment, the mobile terminal has one camera, and it is determined whether the estimated position coordinates exceed the visible range of the camera; if yes, the process goes to step 340; if not, the process goes to step 350.
In another preferred embodiment, the mobile terminal has two cameras, including a first camera and a second camera adjacent to the first camera, and it is determined whether the estimated position coordinates exceed the visible range of the first camera and the second camera; if yes, the process goes to step 340; if not, the process goes to step 350.
And step 340, canceling the motion tracking of the moving object.
And step 350, performing motion tracking on the moving object and shooting.
In order to make the moving object photographing method of the present embodiment more clearly understood, the following example is used:
The first step: calculate the movement speed of the moving object.
In this example, the mobile terminal has a camera A and a camera B adjacent to camera A.
Fig. 5 is a schematic diagram showing coordinates of a moving object at different positions.
In Fig. 5, points A and B represent the positions of camera A and camera B, respectively; O represents the midpoint of line segment AB; point C represents the moving object; angle α is the included angle between line segments AC and AB; angle β is the included angle between line segments BC and BA; and angle γ is the included angle between line segments OC and OB. The length of line segment AB is known. From the length of AB and dual-camera ranging, α and β can be determined.
With AB, α, and β known, the lengths of line segments AC, BC, and OC, as well as the angle γ, can be calculated using trigonometric formulas.
Specifically, OD = OC × sin γ; CD = OC × cos γ.
Taking the two-dimensional coordinates as an example:
In two-dimensional coordinates (as shown in Fig. 5), point O is the origin, line AB is the X axis, and line OD is the Y axis.
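The triangulation above can be condensed into a short sketch in this coordinate frame. The function name `locate` and the use of the law of sines are assumptions made for illustration; only the geometry (baseline AB, angles α and β, outputs CD and OD) comes from the text:

```python
import math

def locate(ab, alpha, beta):
    """Triangulate object C in the frame of Fig. 5: O at the origin,
    AB along the X axis. ab is the known baseline length; alpha and
    beta are the angles at A and B, in radians. Returns (x, y), i.e.
    (CD, OD) in the notation of the text."""
    angle_c = math.pi - alpha - beta              # triangle angle sum
    ac = ab * math.sin(beta) / math.sin(angle_c)  # law of sines
    x = -ab / 2.0 + ac * math.cos(alpha)          # A is at (-AB/2, 0)
    y = ac * math.sin(alpha)
    return x, y
```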
When the object is in motion, the object C is sampled twice, at times t1 and t2, with a sampling interval Δt1 = t2 − t1 (typically the time difference between two frames). At the two sampling times, the positions of the object C are C1 and C2, respectively.
When the object is located at the position C1, the length of the line segment OD1 and the length of the line segment C1D1 are calculated.
When the object is located at the position C2, the length of the line segment OD2 and the length of the line segment C2D2 are calculated.
Thus, the Y-axis difference between C1 and C2 is calculated as ΔOD = OD1 − OD2, and the X-axis difference as ΔCD = C1D1 − C2D2.
Using the equation of motion, the velocity of the object in the Y-axis direction is obtained as Vy = ΔOD ÷ Δt1, and the velocity in the X-axis direction as Vx = ΔCD ÷ Δt1.
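Building on the hypothetical `locate()` sketch above, the velocity computation for the two samples might look as follows. Signed differences are kept here so the same values can drive the extrapolation in the next step, whereas the claims take the absolute value to obtain the speed magnitude; all numeric inputs are illustrative:

```python
import math

# Hypothetical sample data: baseline ab and measured angles at two frames.
ab = 0.02
t1, t2 = 0.00, 0.04
x1, y1 = locate(ab, math.radians(80), math.radians(85))  # object at C1 -> (C1D1, OD1)
x2, y2 = locate(ab, math.radians(78), math.radians(86))  # object at C2 -> (C2D2, OD2)

dt1 = t2 - t1            # Δt1, typically the time between two frames
vx = (x2 - x1) / dt1     # from ΔCD / Δt1
vy = (y2 - y1) / dt1     # from ΔOD / Δt1
```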
Taking three-dimensional coordinates as an example:
if the three-dimensional coordinate is adopted, the three-dimensional Z axis can be directly inferred on the basis of the two-dimensional coordinate according to the same principle, and Vz can be calculated.
The second step: calculate the position coordinates of the moving object according to the movement speed.
The spatial position coordinates of the object at the different times are C1 = (C1D1, OD1, Z1) and C2 = (C2D2, OD2, Z2). The position coordinates of the object at time t3 are then estimated as C3 = (C2D2 + Vx × Δt2, OD2 + Vy × Δt2, Z2 + Vz × Δt2), where Δt2 = t3 − t2 and C2D2 + Vx × Δt2 is used as the focus distance at the new location.
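In code, the constant-velocity extrapolation to time t3 is one line per axis; the t3, z2, and vz values below are placeholders continuing the earlier sketch:

```python
t3 = 0.08
z2, vz = 0.0, 0.0        # Z-axis values from the 3D case (placeholders)
dt2 = t3 - t2            # Δt2 = t3 - t2
c3 = (x2 + vx * dt2,     # C2D2 + Vx·Δt2, also used as the new focus distance
      y2 + vy * dt2,     # OD2 + Vy·Δt2
      z2 + vz * dt2)     # Z2 + Vz·Δt2
```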
The third step: judge whether the position coordinates of C3 exceed the visible range of the cameras.
According to the position coordinates of C3, the angle of C3 relative to point O is calculated to judge whether C3 exceeds the visible range of camera A and camera B. If it does, motion tracking is stopped; if it does not, the motion trajectory of the object is tracked and the object is shot.
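The disclosure only states that the angle of C3 relative to point O is compared with the cameras' visible range; a minimal sketch of that judgment, under the added assumption that the visible range is an angular field of view of ±half_fov around the Y axis (the assumed optical axis through O), is:

```python
import math

def within_view(c3, half_fov):
    """Return True if predicted position c3 = (x, y, z) lies inside the
    assumed angular field of view; False corresponds to step 340 (cancel
    tracking), True to step 350 (track and shoot)."""
    x, y, z = c3
    angle = math.atan2(math.hypot(x, z), y)  # angle of OC3 from the Y axis
    return angle <= half_fov
```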
The moving object shooting method of this embodiment calculates the movement speed of the moving object, calculates the position coordinates of the moving object from that speed, and, when the position coordinates are judged not to exceed the visible range of the camera, performs motion tracking on the moving object and shoots it. By establishing a motion model of the object, the motion trajectory is predicted in advance and the model is used for tracking, so the problems of dark pictures, heavy noise, and defocus caused by insufficient light intake are avoided, and a superior tracking-shooting effect is achieved.
Example Two
Referring to Fig. 6, a second embodiment of the invention further provides a method for shooting a moving object. The second embodiment is a further refinement of the first embodiment, the difference being that the motion tracking and shooting of the moving object in the first embodiment specifically includes:
In step 610, the shutter time is calculated.
In step 620, sensitivity is obtained.
Specifically, once the shutter time is determined, the sensitivity (ISO) corresponding to that shutter time is calculated with an algorithm so that a sufficient light intake is obtained.
Further, the corresponding sensitivity can also be looked up according to the current ambient brightness requirement and a preset data table.
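A minimal sketch of the two sensitivity strategies is given below. The constant-exposure assumption (brightness proportional to shutter_time × ISO), the reference values, and the table entries are all illustrative placeholders, not from the disclosure:

```python
REFERENCE_EXPOSURE = (1 / 60) * 100  # assumed reference: 1/60 s at ISO 100

def iso_from_shutter(shutter_time):
    """Strategy 1: compute the ISO that keeps the light intake
    sufficient for the chosen shutter time."""
    return REFERENCE_EXPOSURE / shutter_time

# Strategy 2: look up the ISO in a preset data table keyed by the
# current ambient brightness.
ISO_TABLE = {"bright": 100, "indoor": 400, "dim": 1600}

def iso_from_table(ambient):
    return ISO_TABLE[ambient]
```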
And step 630, tracking the motion track of the moving object and shooting the moving object.
In the moving object shooting method of this embodiment, the shutter time is calculated, the sensitivity is acquired, the motion trajectory of the moving object is tracked, and the moving object is shot, thereby effectively avoiding the problems of dark pictures, heavy noise, and defocus caused by insufficient light intake and improving the shooting effect for moving objects.
Example Three
Referring to Fig. 7, a third embodiment of the present invention further provides a method for shooting a moving object. The third embodiment is a further refinement of the second embodiment, the difference being that the calculating of the shutter time in the second embodiment specifically includes:
step 710, a ghost length threshold is obtained.
Specifically, the ghost length threshold is related to the focal length, i.e.: the larger the focal length, the larger the ghost length threshold.
And step 720, calculating a time threshold according to the motion speed and the afterimage length threshold.
Specifically, the time threshold is calculated using the formula: time threshold (T) = afterimage length threshold (W) ÷ movement speed (V).
Step 730, the shutter time is taken as the maximum value smaller than the time threshold.
In this embodiment, an afterimage length threshold W is set; when the distance the object moves within the shutter time exceeds W, it is determined that an afterimage will be left. Accordingly, based on the speed of the object along the X and Z axes, the shutter time t is chosen to be less than the time threshold T.
Preferably, the shutter time is taken to be the maximum value that satisfies this condition.
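A sketch of the selection in steps 710-730, assuming the hardware exposes a discrete list of supported shutter times (the candidate values are hypothetical):

```python
def pick_shutter_time(w, v, candidates=(1/2000, 1/1000, 1/500, 1/250, 1/125)):
    """T = W / V; return the largest supported shutter time below the
    time threshold T, falling back to the shortest candidate if none
    qualifies."""
    t_threshold = w / v
    valid = [t for t in candidates if t < t_threshold]
    return max(valid) if valid else min(candidates)
```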
In the moving object shooting method of this embodiment, the afterimage length threshold is acquired, the time threshold is calculated from the movement speed and the afterimage length threshold, and the shutter time is taken as the maximum value smaller than the time threshold. A reasonable shutter time, and hence a reasonable sensitivity, is thereby obtained, which in particular guarantees the shooting effect for moving objects.
Example Four
The invention further provides a moving object photographing device.
Referring to Fig. 8, Fig. 8 is a block diagram of a moving object photographing device according to a fourth embodiment of the present invention.
This embodiment provides a moving object shooting device applied to a mobile terminal having a camera. The device includes:
and the speed calculation module 810 is used for calculating the movement speed of the moving object.
Specifically, referring to Fig. 9, the speed calculation module 810 further includes:
a position obtaining unit 910, configured to obtain a first position coordinate at a first time and a second position coordinate at a second time of the moving object, respectively.
Specifically, while the object is in motion, the position obtaining unit 910 samples the object twice, at a first time and a second time; at the first time the object is at the first position, and at the second time it is at the second position.
A calculating unit 920, configured to calculate a moving speed of the moving object according to an absolute value of a difference between the first position coordinate and the second position coordinate and a time difference between the second time and the first time.
Specifically, the calculation unit 920 calculates the coordinates of the object at the first position and the second position, respectively, and further calculates the absolute value of the difference between the first position coordinate and the second position coordinate, and calculates the time difference between the second time and the first time.
The movement speed of the object is then determined using the formula: velocity (v) = displacement (s) ÷ time (t).
And an estimation module 820, configured to estimate the position coordinate of the moving object according to the motion speed.
Specifically, based on the movement speed of the object, the estimation module 820 uses the formula displacement (s) = velocity (v) × time (t) to estimate the position coordinates of the object at the next movement time (the next time after the second time, for example, a third time).
Further, if the predicted position coordinates of the object at the next movement time are not accurate enough, they are updated and corrected in real time once the actual position of the object changes: the new data overwrite the previously predicted position, and self-calibration is performed to ensure the correctness of the calculated data. In this way, information such as the distance and speed of subsequent positions can be estimated conveniently and in a timely manner from the position information already available.
The determination module 830 is configured to determine whether the position coordinates exceed the visible range of the camera; if so, motion tracking of the moving object is cancelled; if not, the shooting module 840 is triggered.
Specifically, in a preferred embodiment, the mobile terminal has one camera, and the determination module 830 determines whether the estimated position coordinates exceed the visible range of the camera; if so, motion tracking of the moving object is cancelled; if not, the shooting module 840 is triggered.
In another preferred embodiment, the mobile terminal has two cameras, including a first camera and a second camera adjacent to the first camera, and the determination module 830 determines whether the estimated position coordinates exceed the visible range of the first camera and the second camera; if so, motion tracking of the moving object is cancelled; if not, the shooting module 840 is triggered.
And the shooting module 840 is used for performing motion tracking on the moving object and shooting the moving object.
In order to make the moving object photographing device of this embodiment more clearly understood, the following example is used:
The first step: the speed calculation module 810 calculates the movement speed of the moving object.
In this example, the mobile terminal has a camera A and a camera B adjacent to camera A.
Fig. 5 is a schematic diagram showing coordinates of a moving object at different positions.
In Fig. 5, points A and B represent the positions of camera A and camera B, respectively; O represents the midpoint of line segment AB; point C represents the moving object; angle α is the included angle between line segments AC and AB; angle β is the included angle between line segments BC and BA; and angle γ is the included angle between line segments OC and OB. The length of line segment AB is known. From the length of AB and dual-camera ranging, α and β can be determined.
With AB, α, and β known, the lengths of line segments AC, BC, and OC, as well as the angle γ, can be calculated using trigonometric formulas.
Specifically, OD = OC × sin γ; CD = OC × cos γ.
Taking the two-dimensional coordinates as an example:
In two-dimensional coordinates (as shown in Fig. 5), point O is the origin, line AB is the X axis, and line OD is the Y axis.
When the object is in motion, the object C is sampled twice, at times t1 and t2, with a sampling interval Δt1 = t2 − t1 (typically the time difference between two frames). At the two sampling times, the positions of the object C are C1 and C2, respectively.
When the object is located at the position C1, the length of the line segment OD1 and the length of the line segment C1D1 are calculated.
When the object is located at the position C2, the length of the line segment OD2 and the length of the line segment C2D2 are calculated.
Thus, the Y-axis difference between C1 and C2 is calculated as ΔOD = OD1 − OD2, and the X-axis difference as ΔCD = C1D1 − C2D2.
Using the equation of motion, the velocity of the object in the Y-axis direction is obtained as Vy = ΔOD ÷ Δt1, and the velocity in the X-axis direction as Vx = ΔCD ÷ Δt1.
Taking three-dimensional coordinates as an example:
if the three-dimensional coordinate is adopted, the three-dimensional Z axis can be directly inferred on the basis of the two-dimensional coordinate according to the same principle, and Vz can be calculated.
The second step: the estimation module 820 estimates the position coordinates of the moving object according to the movement speed.
Spatial position coordinates of the object at different times: c1(C1D1, OD1, Z1), C2(C2D2, OD2, Z2), and estimates the position coordinates of the object at time t 3: c3(C2D2+ Vx ×. Δ t2, OD2+ Vy ×. Δ t2, Z2+ Vz ×. Δ t2), where Δ t2 ═ t3-t2, C2D2+ Vx ×. Δ t2 is the focal length of the new location.
The third step: the determination module 830 determines whether the position coordinates of C3 exceed the visible range of the cameras.
According to the position coordinates of C3, the angle of C3 relative to point O is calculated to judge whether C3 exceeds the visible range of camera A and camera B. If it does, motion tracking is stopped; if it does not, the shooting module 840 is triggered to track the motion trajectory of the object and shoot the object.
In the moving object photographing device of this embodiment, the speed calculation module 810 calculates the movement speed of the moving object, the estimation module 820 estimates the position coordinates of the moving object from that speed, and, when the determination module 830 judges that the position coordinates do not exceed the visible range of the camera, the shooting module 840 performs motion tracking on the moving object and shoots it. By establishing a motion model of the object, the motion trajectory is predicted in advance and the model is used for tracking, so the problems of dark pictures, heavy noise, and defocus caused by insufficient light intake are avoided, and a superior tracking-shooting effect is achieved.
Example Five
Referring to Fig. 10, a fifth embodiment of the present invention further provides a moving object photographing device. The fifth embodiment is a further refinement of the fourth embodiment, the difference being that the shooting module specifically comprises:
a shutter time calculation unit 1010 for calculating a shutter time.
A sensitivity acquisition unit 1020, configured to acquire the sensitivity.
Specifically, once the shutter time is determined, the sensitivity acquisition unit 1020 calculates the sensitivity (ISO) corresponding to that shutter time with an algorithm so that a sufficient light intake is obtained.
Further, the sensitivity acquisition unit 1020 may also look up the corresponding sensitivity according to the current ambient brightness requirement and a preset data table.
And a shooting unit 1030, configured to track a motion trajectory of a moving object and shoot the moving object.
The moving object shooting device of this embodiment calculates the shutter time through the shutter time calculation unit 1010, acquires the sensitivity through the sensitivity acquisition unit 1020, and tracks the motion trajectory of the moving object and shoots it through the shooting unit 1030, thereby effectively avoiding the problems of dark pictures, heavy noise, and defocus caused by insufficient light intake and improving the shooting effect for moving objects.
Example Six
Referring to Fig. 11, a sixth embodiment of the present invention further provides a moving object photographing device. The sixth embodiment is a further refinement of the fifth embodiment, the difference being that the shutter time calculation unit specifically includes:
an afterimage length threshold obtaining unit 1110 is configured to obtain an afterimage length threshold.
Specifically, the afterimage length threshold is related to the focal length: the larger the focal length, the larger the afterimage length threshold.
A time threshold calculation unit 1120, configured to calculate a time threshold according to the motion speed and the afterimage length threshold.
Specifically, the time threshold calculation unit 1120 uses the formula: time threshold (T) = afterimage length threshold (W) ÷ movement speed (V).
A value taking unit 1130, configured to take the shutter time as the maximum value smaller than the time threshold.
In this embodiment, an afterimage length threshold W is set; when the distance the object moves within the shutter time exceeds W, it is determined that an afterimage will be left. Accordingly, based on the speed of the object along the X and Z axes, the shutter time t is chosen to be less than the time threshold T.
Preferably, the value taking unit 1130 takes the shutter time as the maximum value that satisfies this condition.
In the moving object photographing device of this embodiment, the afterimage length threshold obtaining unit 1110 acquires the afterimage length threshold, the time threshold calculation unit 1120 calculates the time threshold from the movement speed and the afterimage length threshold, and the value taking unit 1130 takes the shutter time as the maximum value smaller than the time threshold. A reasonable shutter time, and hence a reasonable sensitivity, is thereby obtained, which in particular guarantees the shooting effect for moving objects.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a(n) ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.