CN107426506B - Method and terminal for celestial body shooting - Google Patents
Method and terminal for celestial body shooting
- Publication number
- CN107426506B (application CN201710210115.7A)
- Authority
- CN
- China
- Prior art keywords
- celestial
- celestial body
- terminal
- brightness value
- bodies
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/76—Circuitry for compensating brightness variation in the scene by influencing the image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/815—Camera processing pipelines; Components thereof for controlling the resolution by using a single image
Abstract
Embodiments of the present invention provide a method and a terminal for celestial body shooting. The method comprises: the terminal shoots a frame of celestial body image; the magnitudes of all celestial bodies that can be captured are determined according to the current shooting parameters; the brightness value corresponding to each celestial body after a preset exposure time is calculated according to the determined magnitudes; and the brightness value at the corresponding position in the celestial body image is replaced with the brightness value corresponding to each celestial body, yielding the final celestial body image. By replacing the brightness value at the corresponding position in the celestial body image with each celestial body's brightness value after the preset exposure time, the shooting time is reduced, so that celestial bodies can be shot quickly while the shooting quality is preserved.
Description
Technical Field
The invention relates to the technical field of shooting, in particular to a method and a terminal for shooting celestial bodies.
Background
In the current field of photography, when a terminal with a shooting function is used to photograph a specific celestial body (for example, the Milky Way), the low brightness of the celestial body requires either a long exposure or image stacking. During such a long exposure the shot is affected by the rotation of the earth, so the angle of the Milky Way seen by the camera keeps rotating. An equatorial mount, also called a star tracker, therefore has to be introduced to assist the shot: it keeps the camera aimed at the target celestial body while the earth rotates, so that the angle remains unchanged and a photo with good light sensitivity and a consistent angle is finally obtained. Because building an equatorial mount into a terminal is costly, the prior art also provides methods and terminals for shooting celestial bodies without one. However, whether or not an equatorial mount is provided, shooting celestial bodies still requires a long wait: the exposure time needed to shoot a starry sky generally ranges from several seconds to several hours, and the longer the shot, the more interfering factors there are and the more likely the shot is to fail, making it difficult for ordinary photography enthusiasts to capture a star-filled scene.
Disclosure of Invention
In order to solve the technical problem, embodiments of the present invention provide a method and a terminal for shooting a celestial body, which achieve quick shooting of a celestial body while ensuring the shooting quality of the celestial body.
In order to achieve the above purpose, the technical solution of the embodiment of the present invention is realized as follows:
the embodiment of the invention provides a celestial body shooting method, which comprises the following steps:
shooting a frame of celestial body image by the terminal;
determining the magnitudes of all celestial bodies that can be shot according to the current shooting parameters;
calculating the brightness value corresponding to each celestial body after a preset exposure time according to the determined magnitudes of the celestial bodies;
and replacing the brightness value of the corresponding position in the celestial body image by the brightness value corresponding to each celestial body to obtain a final celestial body image.
Preferably, the current photographing parameters include: the local sidereal time of the current geographic position of the terminal and the current posture information of the terminal;
the posture information includes: the current elevation angle and azimuth angle of the terminal.
Preferably, determining the magnitudes of all celestial bodies that can be photographed according to the current photographing parameters includes:
determining, in a celestial body database, all celestial bodies that can be shot by the terminal according to the local sidereal time of the current geographic position of the terminal and the current posture information, and determining the magnitudes of all of these celestial bodies.
Preferably, at least one visible celestial body is contained in the celestial body image; the brightness value of the visible celestial body in the celestial body image is greater than a preset brightness threshold value;
calculating the brightness value corresponding to each celestial body after the preset exposure time according to the determined magnitudes of the celestial bodies comprises the following steps:
acquiring the brightness value of at least one visible celestial body in the celestial body image;
and calculating the brightness value corresponding to each celestial body after the preset exposure time according to the obtained brightness value of the at least one visible celestial body, the current exposure parameters of the terminal, and the magnitudes of all celestial bodies.
Preferably, calculating the brightness value corresponding to each celestial body after the preset exposure time according to the obtained brightness value of the at least one visible celestial body, the current exposure parameters of the terminal, and the magnitudes of all celestial bodies includes:
calculating the brightness value of the visible celestial body after the preset exposure time according to the obtained brightness value of the at least one visible celestial body and the current exposure parameters of the terminal;
calculating the brightness values corresponding to other celestial bodies after the preset exposure time according to the brightness values of the visible celestial bodies after the preset exposure time and the magnitudes of all celestial bodies; the other celestial bodies are the celestial bodies remaining after the obtained at least one visible celestial body is removed from all determined celestial bodies.
The embodiment of the invention also provides a terminal, which comprises: a shooting module, a determining module, a calculating module and a processing module, wherein,
the shooting module is used for shooting a frame of celestial body image;
the determining module is used for determining the magnitudes of all celestial bodies that can be shot according to the current shooting parameters;
the calculation module is used for calculating the brightness value corresponding to each celestial body after the preset exposure time according to the determined magnitudes of the celestial bodies;
and the processing module is used for replacing the brightness value of the corresponding position in the celestial body image by utilizing the brightness value corresponding to each celestial body to obtain the final celestial body image.
Preferably, the current photographing parameters include: the local sidereal time of the current geographic position of the terminal and the current posture information of the terminal; the posture information includes: the current elevation angle and azimuth angle of the terminal.
Preferably, the determining module is specifically configured to determine, in a celestial body database, all celestial bodies that can be shot by the terminal according to the local sidereal time of the current geographic position of the terminal and the current posture information, and to determine the magnitudes of all of these celestial bodies.
Preferably, at least one visible celestial body is contained in the celestial body image; the brightness value of the visible celestial body in the celestial body image is greater than a preset brightness threshold value;
the calculation module is further used for acquiring the brightness value of the at least one visible celestial body in the celestial body image;
correspondingly, the calculation module is specifically configured to calculate the brightness value corresponding to each celestial body after the preset exposure time according to the obtained brightness value of the at least one visible celestial body, the current exposure parameters of the terminal, and the magnitudes of all celestial bodies.
Preferably, the calculation module is specifically configured to calculate the brightness value of the visible celestial body after the preset exposure time according to the obtained brightness value of the at least one visible celestial body and the current exposure parameters of the terminal, and to calculate the brightness values corresponding to the other celestial bodies after the preset exposure time according to the brightness values of the visible celestial bodies after the preset exposure time and the magnitudes of all celestial bodies; the other celestial bodies are the celestial bodies remaining after the obtained at least one visible celestial body is removed from all determined celestial bodies.
In the method and terminal for celestial body shooting provided by the embodiments of the present invention, the terminal shoots a frame of celestial body image; the magnitudes of all celestial bodies that can be shot are determined according to the current shooting parameters; the brightness value corresponding to each celestial body after a preset exposure time is calculated according to the determined magnitudes of the celestial bodies; and the brightness value at the corresponding position in the celestial body image is replaced with the brightness value corresponding to each celestial body to obtain the final celestial body image. In this way, by replacing the brightness value at the corresponding position in the celestial body image with each celestial body's brightness value after the preset exposure time, the shooting time is reduced, and the celestial bodies are shot quickly while the shooting quality is ensured.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an alternative mobile terminal for implementing various embodiments of the present invention;
FIG. 2 is a diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
FIG. 3 is a flow chart of a first embodiment of a method of celestial body capture of the present invention;
FIG. 4 is a first schematic diagram of a celestial body image captured by a terminal according to an embodiment of the present invention;
FIG. 5 is a schematic illustration of a final celestial image obtained in an embodiment of the present invention;
FIG. 6 is a flow chart of a second embodiment of a method of celestial body capture of the present invention;
FIG. 7 is a schematic diagram of a terminal shooting celestial bodies according to an embodiment of the present invention;
FIG. 8 is a second schematic diagram of a celestial body image captured by a terminal in an embodiment of the present invention;
fig. 9 is a terminal interaction diagram according to a second embodiment of the present invention;
FIG. 10 is a schematic diagram of a first component structure of a terminal for celestial body photographing according to an embodiment of the present invention;
fig. 11 is a schematic diagram of a second composition structure of the terminal for celestial body photographing according to the embodiment of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only for facilitating the explanation of the present invention, and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the embodiments of the present invention may include a mobile terminal such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Multimedia Player (PMP), a navigation device, and the like, and a fixed terminal such as a digital TV, a desktop computer, and the like. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiment of the present invention can also be applied to a fixed-type terminal, apart from any elements specifically intended for mobile use.
Fig. 1 is a schematic diagram of a hardware structure of an optional mobile terminal for implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an audio/video (a/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc. Fig. 1 illustrates a mobile terminal having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. Elements of the mobile terminal will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to a terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like. Also, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112. The broadcast signal may exist in various forms, for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of digital video broadcasting-handheld (DVB-H), and the like. The broadcast receiving module 111 may receive a signal broadcast by using various types of broadcasting systems. In particular, the broadcast receiving module 111 may receive digital broadcasting by using a digital broadcasting system such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), media forward link only (MediaFLO), integrated services digital broadcasting-terrestrial (ISDB-T), and the like. The broadcast receiving module 111 may be constructed to be suitable for various broadcasting systems that provide broadcast signals as well as the above-mentioned digital broadcasting systems. The broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
The mobile communication module 112 transmits and/or receives radio signals to and/or from at least one of a base station (e.g., access point, node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access of the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technology to which the module relates may include Wireless Local Area Network (WLAN) (Wi-Fi), wireless broadband (Wibro), worldwide interoperability for microwave access (Wimax), High Speed Downlink Packet Access (HSDPA), and the like.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of the location information module is a Global Positioning System (GPS). According to the current technology, the location information module 115, which is a GPS, calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current location information according to longitude, latitude, and altitude. Currently, a method for calculating position and time information uses three satellites and corrects an error of the calculated position and time information by using another satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current position information in real time.
The a/V input unit 120 is used to receive an audio or video signal. The a/V input unit 120 may include a camera 121, and the camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the cameras 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the construction of the mobile terminal.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome switch, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, etc. due to being touched), scroll wheel, joystick, etc. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is opened or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user using the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter, referred to as an "identification device") may take the form of a smart card, and thus, the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or an image and related functions, and the like.
Meanwhile, when the display unit 151 and the touch pad are overlapped with each other in the form of a layer to form a touch screen, the display unit 151 may serve as an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to see from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a Transparent Organic Light Emitting Diode (TOLED) display or the like. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify the occurrence of an event. For example, the alarm unit 153 may provide an output in the form of vibration, and when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to inform the user thereof. By providing such a tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide an output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs or the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, etc.) that has been output or is to be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, and the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, an electronic unit designed to perform the functions described herein, and in some cases, such embodiments may be implemented in the controller 180. For a software implementation, the implementation such as a process or a function may be implemented with a separate software module that allows performing at least one function or operation. The software codes may be implemented by software applications (or programs) written in any suitable programming language, which may be stored in the memory 160 and executed by the controller 180.
Up to now, the mobile terminal has been described in terms of its functions. Hereinafter, a slide-type mobile terminal among various types of mobile terminals, such as a folder-type, bar-type, swing-type, slide-type mobile terminal, and the like, will be described as an example for the sake of brevity. Accordingly, the present invention can be applied to any type of mobile terminal, and is not limited to a slide type mobile terminal.
The mobile terminal 100 as shown in fig. 1 may be configured to operate with communication systems such as wired and wireless communication systems and satellite-based communication systems that transmit data via frames or packets.
A communication system in which a mobile terminal according to the present invention is operable will now be described with reference to fig. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), and Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), global system for mobile communications (GSM), and the like. By way of non-limiting example, the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
Referring to fig. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of Base Stations (BSs) 270, Base Station Controllers (BSCs) 275, and a Mobile Switching Center (MSC) 280. The MSC280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC280 is also configured to interface with a BSC275, which may be coupled to the base station 270 via a backhaul. The backhaul may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in fig. 2 may include multiple BSCs 275.
Each BS270 may serve one or more sectors (or regions), each sector being covered by an omnidirectional antenna or an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS270 may be configured to support multiple frequency allocations, with each frequency allocation having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency allocation may be referred to as a CDMA channel. The BS270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology. In such a case, the term "base station" may be used to generically refer to a single BSC275 and at least one BS 270. The base stations may also be referred to as "cells". Alternatively, each sector of a particular BS270 may be referred to as a plurality of cell sites.
As shown in fig. 2, a Broadcast Transmitter (BT)295 transmits a broadcast signal to the mobile terminal 100 operating within the system. A broadcast receiving module 111 as shown in fig. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295. In fig. 2, several Global Positioning System (GPS) satellites 300 are shown. The satellite 300 assists in locating at least one of the plurality of mobile terminals 100.
In fig. 2, a plurality of satellites 300 are depicted, but it is understood that useful positioning information may be obtained with any number of satellites. The GPS module 115 as shown in fig. 1 is generally configured to cooperate with satellites 300 to obtain desired positioning information. Other techniques that can track the location of the mobile terminal may be used instead of or in addition to GPS tracking techniques. In addition, at least one GPS satellite 300 may selectively or additionally process satellite DMB transmission.
As a typical operation of the wireless communication system, the BS270 receives reverse link signals from various mobile terminals 100. The mobile terminal 100 is generally engaged in conversations, messaging, and other types of communications. Each reverse link signal received by a particular base station 270 is processed within the particular BS 270. The obtained data is forwarded to the associated BSC 275. The BSC provides call resource allocation and mobility management functions including coordination of soft handoff procedures between BSs 270. The BSCs 275 also route the received data to the MSC280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN290 interfaces with the MSC280, the MSC interfaces with the BSCs 275, and the BSCs 275 accordingly control the BS270 to transmit forward link signals to the mobile terminal 100.
Based on the above mobile terminal hardware structure and communication system, various embodiments of the present invention are proposed.
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention.
First embodiment
The first embodiment of the invention provides a method for shooting celestial bodies, which can be applied to a terminal with a shooting function.
Here, the terminal described above may be a fixed terminal having a display screen, or may be a mobile terminal having a display screen.
The above-mentioned fixed terminal may be a computer, and the above-mentioned mobile terminal includes but is not limited to a mobile phone, a notebook computer, a camera, a PDA, a PAD, a PMP, a navigation device, and the like. The terminal can be connected to the internet, wherein the connection mode can be through a mobile internet network provided by an operator, and can also be through accessing a wireless access point to perform network connection.
Here, if the mobile terminal has an operating system, the operating system may be UNIX, Linux, Windows, Android (Android), Windows Phone, or the like.
The type, shape, size, and the like of the display screen on the terminal are not limited, and the display screen on the terminal may be a liquid crystal display screen, for example.
Fig. 3 is a flowchart of a first embodiment of the method for celestial body photographing according to the present invention, as shown in fig. 3, the method including:
step 301: the terminal shoots a frame of celestial body image.
Step 302: determining the stars and the like of all celestial bodies which can be shot according to the current shooting parameters.
Optionally, the current shooting parameters may include: the local sidereal time of the current geographic position of the terminal and the current posture information of the terminal; the posture information includes: the current elevation angle and azimuth angle of the terminal.
Here, the local sidereal time of the current geographic position of the terminal is obtained from the Greenwich sidereal time at 0h Universal Time of the current day, the current standard time, the longitude of the geographic position of the terminal, and the time difference of the time zone to which the geographic position of the terminal belongs relative to the zero time zone.
Specifically, the local sidereal time of the current geographic position of the terminal is calculated by the formula s = So + (M − Nh + λ) + (M − Nh) × μ, where
s is the local sidereal time; So is the Greenwich sidereal time at 0h Universal Time of the current day; M is the current standard time, i.e. the time of the time zone to which the geographic position of the terminal belongs; λ is the longitude of the geographic position of the terminal, expressed in time; Nh is the time difference of that time zone relative to the zero time zone; μ is a conversion coefficient.
It should be noted that the Greenwich sidereal time So at 0h Universal Time of the current day can be obtained by looking up an astronomical almanac; M and Nh are both obtained directly from the time zone to which the terminal belongs; the conversion coefficient μ has the value 1/365.2422.
For example, the current standard (Beijing) time is M = 19h35m; looking up the astronomical almanac, the Greenwich sidereal time So at 0h Universal Time on March 31 is 12h33m52s; the longitude of Nanjing expressed in time is λ = 7h55m04s; the time difference of the time zone relative to the zero time zone is Nh = 8h; and the conversion coefficient is μ = 1/365.2422. Substituting into s = So + (M − Nh + λ) + (M − Nh) × μ gives the local sidereal time of Nanjing at this moment as s = 8h05m50s.
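As a minimal illustration of this formula (a sketch for this description only, not the patent's implementation), the following code evaluates s with all times expressed in decimal hours; the values are taken from the Nanjing example above:

```python
def local_sidereal_time(so, m, nh, lam, mu=1 / 365.2422):
    """s = So + (M - Nh + lambda) + (M - Nh) * mu, wrapped into [0, 24) hours.

    so: Greenwich sidereal time at 0h Universal Time of the day (hours)
    m: current standard time (hours)
    nh: offset of the local time zone from the zero time zone (hours)
    lam: longitude of the terminal's position expressed in hours (east positive)
    """
    ut = m - nh                                  # elapsed Universal Time of the day
    return (so + (ut + lam) + ut * mu) % 24.0

# Nanjing example: So = 12h33m52s, M = 19h35m (Beijing time), Nh = 8h, lambda = 7h55m04s.
# The result is about 8.097 h, i.e. 8h05m50s, matching the text.
s = local_sidereal_time(12 + 33/60 + 52/3600, 19 + 35/60, 8.0, 7 + 55/60 + 4/3600)
print(s)
```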
In practical implementation, the posture information of the terminal (i.e. the current elevation and azimuth of the terminal) can be obtained through at least one of a compass, a gyroscope and a gravity sensor.
An optional implementation of this step is to determine, in the celestial body database, all celestial bodies that can be shot by the terminal according to the local sidereal time of the current geographic position of the terminal and the current posture information, and to determine the magnitudes of all of these celestial bodies.
The celestial body database contains information on all celestial bodies observable from each region on the earth, including at least the position information and the magnitude of each celestial body.
Astronomically, the brightness of stars is expressed uniformly in stellar magnitudes: the smaller the magnitude, the brighter the star, and a difference of 1 magnitude corresponds to a brightness ratio of about 2.512. A magnitude 1 star is exactly 100 times as bright as a magnitude 6 star, and a difference of 0.1 magnitude corresponds to a brightness ratio of about 1.0965. Over the whole sky there are 21 stars of magnitude 1, 46 of magnitude 2, 134 of magnitude 3, 458 of magnitude 4, 1476 of magnitude 5 and 4840 of magnitude 6, 6974 stars in total. Brighter objects have magnitude 0 or even negative magnitudes; for example, the sun is about −26.7, the full moon about −12.6, and Venus at its brightest can reach −4.4. The faintest stars visible to the naked eye are set at magnitude 6 (6m). Stars brighter than magnitude 6 (i.e. with magnitude numbers smaller than 6) number more than 6000, although only the stars on half of the celestial sphere, i.e. somewhat more than 3000, can be seen at any one time. The largest astronomical telescopes in the world can see celestial bodies as faint as 24m, and the Hubble telescope can photograph stars as faint as 30m.
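As a small illustration of the magnitude scale described above (not part of the patent), the following sketch converts a magnitude difference into the implied brightness ratio:

```python
def brightness_ratio(m_faint, m_bright):
    """Brightness ratio implied by two stellar magnitudes.

    Each magnitude step corresponds to a factor of 100**(1/5) ≈ 2.512,
    so a magnitude 1 star is exactly 100x as bright as a magnitude 6 star.
    """
    return 100 ** ((m_faint - m_bright) / 5.0)

print(brightness_ratio(2.0, 1.0))   # ≈ 2.512 for a 1-magnitude difference
print(brightness_ratio(6.0, 1.0))   # exactly 100.0
print(brightness_ratio(1.1, 1.0))   # ≈ 1.0965 for a 0.1-magnitude difference
```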
It can be understood that, because of the constraints of the shooting parameters, the brightness of the celestial bodies and other factors, only a limited number of celestial bodies can be recognized by the naked eye in the single frame of celestial body image currently shot by the terminal, while many more celestial bodies are not displayed in the image. In the embodiments of the present invention, celestial bodies that can be recognized by the naked eye in the celestial body image (i.e. celestial bodies whose brightness value in the image is greater than or equal to a preset brightness threshold) are called visible celestial bodies, and celestial bodies that cannot be recognized by the naked eye (i.e. celestial bodies whose brightness value in the image is smaller than the preset brightness threshold) are called hidden celestial bodies. The celestial body image contains N visible celestial bodies and M hidden celestial bodies, where N and M are both positive integers.
Fig. 4 is a first schematic diagram of a celestial body image shot by a terminal in an embodiment of the present invention. As shown in fig. 4, the frame of celestial body image shot by the terminal includes visible celestial bodies that can be recognized by the naked eye and hidden celestial bodies that cannot. The visible celestial bodies include visible celestial body 1, visible celestial body 2 and visible celestial body 3; the hidden celestial bodies include hidden celestial body 1 through hidden celestial body M. When the brightness values of the celestial bodies are calculated, one of the visible celestial bodies is selected as a reference visible celestial body, or, after a reference visible celestial body is selected, at least one other visible celestial body is additionally selected to correct the calculation result.
Step 303: and calculating the brightness value corresponding to each celestial body after the preset exposure time according to the determined stars and the like of the celestial bodies.
Here, the luminance value corresponding to each celestial body includes: the brightness values of the N visible celestial bodies and the brightness values of the M hidden celestial bodies.
In practical implementation, at least one visible celestial body is contained in the celestial body image; before the brightness value of each celestial body after the preset exposure time is calculated, the brightness value of at least one visible celestial body in the celestial body image needs to be acquired.
Optionally, the brightness value corresponding to each celestial body after the preset exposure time is calculated according to the obtained brightness value of the at least one visible celestial body, the current exposure parameters of the terminal, and the magnitudes of all celestial bodies. The exposure time may be set automatically by the terminal or manually by the user, and may be 1 minute, 2 minutes, 10 minutes, 30 minutes, 1 hour, or the like.
Specific embodiments may include: calculating the brightness value of the visible celestial body after the preset exposure time according to the obtained brightness value of at least one visible celestial body and the current exposure parameter of the terminal;
calculating the brightness values corresponding to other celestial bodies after the preset exposure time according to the brightness values of the visible celestial bodies after the preset exposure time and the magnitudes of all celestial bodies; the other celestial bodies are the celestial bodies remaining after the obtained at least one visible celestial body is removed from all determined celestial bodies.
Optionally, the brightness value corresponding to each celestial body is obtained from the brightness value of the visible celestial body after the preset exposure time and the magnitudes of all celestial bodies according to the formula m2 − m1 = −2.5 lg(E2/E1), where m2 is the magnitude of the visible celestial body, E2 is the brightness value of the visible celestial body after the preset exposure time, m1 is the magnitude of the celestial body to be calculated, and E1 is the brightness value of the celestial body to be calculated.
The above calculation formula can also be rewritten as E1 = E2 × 10^((m2 − m1)/2.5).
When visible celestial body 1 is selected as the reference visible celestial body, the brightness values after the preset exposure time of the remaining visible celestial bodies and of hidden celestial bodies 1 to M are calculated in this way.
Step 304: and replacing the brightness value of the corresponding position in the celestial body image by the brightness value corresponding to each celestial body to obtain a final celestial body image.
It should be noted that, after the preset exposure time, the brightness values of Q hidden celestial bodies are greater than or equal to the brightness threshold, where Q is less than or equal to M. The brightness values at the corresponding positions in the celestial body image are replaced with the calculated brightness values of these Q hidden celestial bodies to obtain the final celestial body image.
It can be understood that not all hidden celestial bodies become recognizable to the naked eye after the preset exposure time; that is, among the calculated brightness values there may still be hidden celestial bodies whose brightness values are smaller than the brightness threshold, and those hidden celestial bodies can be ignored and left unprocessed during the replacement.
Fig. 5 is a schematic diagram of the final celestial body image obtained in an embodiment of the present invention. As shown in fig. 5, assuming that after the preset exposure time the calculated brightness values of hidden celestial bodies 1 to 6 are greater than or equal to the brightness threshold, the brightness values at the corresponding positions are replaced with the calculated brightness values of hidden celestial bodies 1 to 6, so that the originally invisible hidden celestial bodies 1 to 6 are displayed in the final celestial body image. This achieves the goal of obtaining, with a short exposure time, a celestial body image with the same effect as a long exposure, and reduces the shooting time.
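A minimal sketch of this replacement step (assuming each determined celestial body has already been mapped to pixel coordinates in the frame; the mapping itself and the maximum displayable value are assumptions here):

```python
import numpy as np

def replace_brightness(image, bodies, threshold):
    """Write the computed post-exposure brightness of each celestial body
    into the single captured frame.

    image: 2-D array of brightness values (the frame shot in step 301).
    bodies: list of (row, col, computed_brightness) for every determined body.
    threshold: preset brightness threshold; bodies still below it are skipped.
    """
    out = image.copy()
    max_value = 255  # assumed maximum brightness the terminal can display
    for row, col, brightness in bodies:
        if brightness < threshold:
            continue                      # still a hidden celestial body: ignore
        out[row, col] = min(brightness, max_value)
    return out
```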
In order to further embody the object of the present invention, the above-mentioned scheme is further exemplified on the basis of the first embodiment of the present invention.
Second embodiment
Fig. 6 is a flowchart of a second embodiment of the method for celestial body photographing according to the present invention, as shown in fig. 6, the flowchart includes:
step 601: an exposure parameter of the terminal and a posture of the terminal are set for a current shooting scene.
The exposure parameters may include shutter speed, aperture size and sensitivity. For example, when photographing the Milky Way in Nanjing, the exposure parameters may be set to aperture F4.0, shutter speed 2 seconds and sensitivity ISO 400.
The posture of the terminal includes: the current elevation angle and azimuth angle.
Step 602: the terminal shoots a frame of celestial body image.
Here, at least one visible celestial body is included in one captured frame of celestial body image.
Furthermore, the terminal may perform noise reduction on the non-zero output pixels in the captured frame of celestial body image in order to identify the visible celestial bodies in the image.
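A very small sketch of this identification step (the median denoising and the threshold are assumptions; the text does not fix either):

```python
import numpy as np
from scipy import ndimage  # assumed available for a simple median filter

def find_visible_bodies(frame, threshold):
    """Denoise the captured frame and return (row, col, brightness) for
    pixels bright enough to count as visible celestial bodies."""
    denoised = ndimage.median_filter(frame, size=3)   # suppress isolated noise
    rows, cols = np.where(denoised >= threshold)
    return [(r, c, float(denoised[r, c])) for r, c in zip(rows, cols)]
```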
Step 603: and obtaining the local star time of the current geographic position of the terminal, and the current elevation angle and azimuth angle of the terminal.
The local sidereal time of the current geographic position of the terminal is obtained according to the Greenwich sidereal time at 0h Universal Time of the current day, the current standard time, the longitude of the geographic position of the terminal, and the time difference of the time zone to which the geographic position of the terminal belongs relative to the zero time zone. For example, the local sidereal time of Nanjing is 8h05m50s.
The terminal obtains the current elevation angle and the current azimuth angle through at least one of a compass, a gyroscope and a gravity sensor.
Step 604: and determining all celestial bodies which can be shot by the terminal in the celestial body database according to the local sidereal time, the current elevation angle and the current azimuth angle of the current geographic position of the terminal.
Projection lines are compared against the celestial body database according to the local sidereal time of the current geographic position of the terminal and the current posture information, so as to determine all celestial bodies that can be shot by the terminal, namely the N visible celestial bodies and the M hidden celestial bodies contained in the celestial body image.
Fig. 7 is a schematic diagram of a terminal shooting celestial bodies in an embodiment of the present invention. As shown in fig. 7(a), according to the local sidereal time of the current geographic position of the terminal, all celestial bodies that can be shot from the terminal's current location are first determined in the celestial body database; because the angular range that can be shot from the current location spans 0 to 360 degrees, the shooting range of the terminal also needs to be narrowed down according to the posture information of the terminal. The celestial bodies that can actually be shot by the terminal are further determined by its elevation angle and azimuth angle. As shown in fig. 7(b), the elevation angle θa is the angle between the line of sight and the horizontal line, measured in the vertical plane of the line of sight when the line of sight is above the horizontal line; the azimuth angle θe, also called the horizontal longitude, is the horizontal angle measured clockwise from the north direction line of a point to the target direction line; for example, the elevation angle is 45° and the azimuth angle is 30°. In fig. 7(c), projection lines are compared against the celestial body database according to the local sidereal time, the current elevation angle and the current azimuth angle of the terminal's current geographic position, and all celestial bodies that can currently be shot by the terminal are determined, including N visible celestial bodies and M hidden celestial bodies.
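The text does not fix a concrete projection model, so the following sketch only illustrates one plausible reading: each catalogued body's direction is compared with the camera's pointing (elevation and azimuth) and kept if it falls inside an assumed rectangular field of view. All names and the field-of-view handling are assumptions for illustration:

```python
def bodies_in_view(catalog, elevation_deg, azimuth_deg, fov_deg=60.0):
    """Return catalogued bodies whose direction lies within a simple
    rectangular field of view centred on the camera's pointing.

    catalog: list of dicts with 'name', 'alt' (degrees above the horizon),
             'az' (degrees clockwise from north) and 'magnitude', already
             converted to the local horizontal frame for the local sidereal
             time of the terminal's position.
    """
    half = fov_deg / 2.0
    visible = []
    for body in catalog:
        d_alt = body["alt"] - elevation_deg
        # wrap the azimuth difference into [-180, 180)
        d_az = (body["az"] - azimuth_deg + 180.0) % 360.0 - 180.0
        if abs(d_alt) <= half and abs(d_az) <= half:
            visible.append(body)
    return visible

# Example with the angles from the text: elevation 45 deg, azimuth 30 deg.
catalog = [{"name": "body-1", "alt": 50.0, "az": 20.0, "magnitude": 1.2},
           {"name": "body-2", "alt": 10.0, "az": 200.0, "magnitude": 4.8}]
print([b["name"] for b in bodies_in_view(catalog, 45.0, 30.0)])  # ['body-1']
```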
Step 605: obtaining stars and the like corresponding to all celestial bodies.
Optionally, the celestial body database is queried for stars and the like corresponding to the N visible celestial bodies and the M hidden celestial bodies.
Step 606: and selecting a visual celestial body from the celestial body image as a reference visual celestial body.
In this step, the basis for selecting the reference visible celestial body may be to select the visible celestial body with the largest brightness value from the image of the celestial body as the reference visible celestial body, or select 1 equistar as the reference visible celestial body.
Step 607: and calculating the brightness value of the reference visible celestial body after the preset exposure time according to the exposure parameters of the terminal.
Optionally, the brightness value Etn of the reference visible celestial body after the preset exposure time can be expressed by the formula Etn = E × (tn/t), where E is the brightness value of that visible celestial body in the currently shot celestial body image, tn is the preset exposure time, and t is the shutter speed set by the terminal.
Illustratively, after visible celestial body 1, which has the largest brightness value, is selected as the reference visible celestial body, its brightness value in the celestial body image is read; if its brightness value in the current image is 20 and the current shutter speed is 2 seconds, then its brightness value after an exposure time of 10 seconds is 100.
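A one-function sketch of this linear exposure scaling, using the numbers from the example above (an illustration only; a real sensor's response may deviate from strict linearity):

```python
def brightness_after_exposure(e_now, t_exposure, t_shutter):
    """Etn = E * (tn / t): scale the measured brightness linearly with the
    ratio of the desired exposure time to the actual shutter speed."""
    return e_now * (t_exposure / t_shutter)

# Reference visible celestial body: brightness 20 in the 2-second frame,
# predicted brightness after a 10-second exposure.
print(brightness_after_exposure(20, 10, 2))  # 100.0
```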
Step 608: and calculating the brightness values of other celestial bodies after the preset exposure time.
Optionally, the brightness values of the remaining celestial bodies in the celestial body image, other than the reference visible celestial body, are all obtained by the formula m2 − m1 = −2.5 lg(E2/E1).
Alternatively, the brightness values after the preset exposure time of the remaining visible celestial bodies, other than the reference visible celestial body, are obtained by the formula Etn = E × (tn/t), while the brightness values of the hidden celestial bodies are all obtained by the formula m2 − m1 = −2.5 lg(E2/E1).
In practical implementation, if, after the preset exposure time, the calculated brightness value of a celestial body is greater than the maximum brightness value that the terminal can display, the brightness value of that celestial body may be set to an optimal brightness value. Optionally, the optimal brightness value may be the maximum brightness value the terminal can display, or a brightness value suitable for displaying different celestial bodies, etc.
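Putting step 607 and step 608 together, a hedged sketch might look like the following; it reuses the helper functions sketched earlier in this description, and the clamping value is likewise an assumption:

```python
def predicted_brightness(bodies, ref, t_exposure, t_shutter, max_display=255):
    """Predict every body's brightness after the chosen exposure time.

    bodies: list of dicts with 'magnitude' and, for visible bodies, the
            measured 'brightness' from the captured frame.
    ref: the reference visible celestial body (dict with 'magnitude'
         and 'brightness').
    """
    e_ref = brightness_after_exposure(ref["brightness"], t_exposure, t_shutter)
    results = []
    for body in bodies:
        if "brightness" in body:   # visible body: scale its own measurement
            e = brightness_after_exposure(body["brightness"], t_exposure, t_shutter)
        else:                      # hidden body: derive from the reference magnitude
            e = brightness_from_reference(ref["magnitude"], e_ref, body["magnitude"])
        results.append(min(e, max_display))   # clamp to the displayable maximum
    return results
```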
Step 609: acquiring celestial bodies with brightness values larger than or equal to a brightness threshold value after a preset exposure time.
Step 610: and replacing the brightness value of the corresponding position in the celestial body image by using the acquired brightness value of the celestial body to obtain a final celestial body image.
And when hidden celestial bodies with the brightness values larger than or equal to the brightness threshold value exist after the preset exposure time, replacing the brightness values of the corresponding positions in the celestial body image by the brightness values of the hidden celestial bodies, and displaying the hidden celestial bodies in the celestial body image.
Fig. 8 is a second schematic view of a terminal capturing a celestial object image according to an embodiment of the present invention, and fig. 8(a) is a view of a frame of a celestial object image captured initially by the terminal when the shutter speed of the terminal is set to 1 second, in which only three celestial objects can be recognized by naked eyes; fig. 8(b) is a celestial body image obtained with an exposure time set to 1 minute; fig. 8(c) is a celestial body image obtained with an exposure time set to 10 minutes; fig. 8(d) is an image of a celestial body obtained when the exposure time was set to 1 hour. It can be seen from the figure that the longer the exposure time is, the more hidden celestial bodies are displayed in the celestial body image and are recognized by naked eyes, the better the acquired celestial body image effect is, and the method for shooting the celestial bodies disclosed by the embodiment of the invention can realize the effect of '1 second out of the river', namely, the celestial body image with the same effect as the longer exposure time is acquired by adopting the shorter exposure time.
Optionally, when setting the exposure time manually, the user can slide an exposure-time progress bar to conveniently preview the celestial body images that would be output at different exposure times, which facilitates the selection of an appropriate exposure time.
Fig. 9 is a terminal interaction diagram according to a second embodiment of the present invention, and as shown in fig. 9, when a user uses a mobile terminal to shoot a celestial body, the user changes the position of a progress bar by sliding up and down to observe celestial body images output after different exposure times, and selects a celestial body image that meets the needs. Fig. 9(a) is a celestial body image output at a shorter exposure time, and fig. 9(b) is a celestial body image output at a longer exposure time.
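Because every candidate image is derived from the single captured frame rather than from a new exposure, the slider preview can be regenerated on the fly. A hedged sketch of this idea follows; the mapping from slider position to seconds and the per-body brightness/position fields are assumptions, not details given in the patent:

```python
import numpy as np

def preview(slider_pos: float, max_exposure_s: float, image: np.ndarray,
            bodies: list, shutter_speed_s: float, threshold: float) -> np.ndarray:
    """Map a slider position in [0, 1] to an exposure time tn and rebuild the preview
    from the already-captured frame, without taking a new exposure."""
    tn = min(max(slider_pos, 0.0), 1.0) * max_exposure_s
    result = image.copy()
    for body in bodies:
        etn = body["brightness"] * (tn / shutter_speed_s)   # Etn = E * (tn / t)
        if etn >= threshold:                                # show only bodies above the threshold
            row, col = body["position"]
            result[row, col] = etn
    return result
```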
In the method for shooting celestial bodies provided by the embodiment of the present invention, the terminal shoots a frame of celestial body image; determines the magnitudes of all celestial bodies that the terminal can shoot according to the current shooting parameters; calculates, according to the determined magnitudes of the celestial bodies, the brightness value corresponding to each celestial body after a preset exposure time; and replaces the brightness values at the corresponding positions in the celestial body image with the brightness values corresponding to the celestial bodies to obtain a final celestial body image. In this way, the brightness value corresponding to each celestial body after the preset exposure time replaces the brightness value at the corresponding position in the celestial body image, which reduces the shooting time and enables quick celestial body shooting while ensuring shooting quality.
Third embodiment
For the method of the embodiments of the present invention, an embodiment of the present invention further provides a terminal for celestial body shooting.
Fig. 10 is a schematic diagram of a first composition structure of a terminal for celestial body shooting according to an embodiment of the present invention. As shown in fig. 10, the terminal includes: a shooting module 1001, a determining module 1002, a calculating module 1003 and a processing module 1004; wherein,
a shooting module 1001 for shooting a frame of celestial body image.
A determining module 1002, configured to determine the magnitudes of all celestial bodies that can be shot by the terminal according to the current shooting parameters.
The calculating module 1003 is configured to calculate, according to the determined magnitudes of the celestial bodies, the brightness value corresponding to each celestial body after a preset exposure time.
And the processing module 1004 is configured to replace the brightness value at the corresponding position in the celestial body image with the brightness value corresponding to each celestial body to obtain a final celestial body image.
For example, the current shooting parameters may include: the local sidereal time of the current geographic position of the terminal and the current posture information of the terminal; the posture information includes: the current elevation and azimuth of the terminal.
For example, the determining module 1002 is specifically configured to determine, in a celestial body database, all celestial bodies that can be shot by the terminal according to the local sidereal time of the current geographic position of the terminal and the current posture information, and to determine the magnitudes of all the celestial bodies.
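The patent does not spell out how this database query works, but one standard way to decide whether a catalogued body falls within the camera's view, given the local sidereal time, the observer's latitude (from the geographic position) and the terminal's elevation and azimuth, is the usual equatorial-to-horizontal conversion. The sketch below is only illustrative: the ra/dec field names, the default field of view and the box-shaped visibility test are all assumptions.

```python
from math import acos, asin, cos, degrees, pi, radians, sin

def horizontal_coords(ra_deg, dec_deg, lst_deg, lat_deg):
    """Standard equatorial-to-horizontal conversion; azimuth measured from north, eastward."""
    h = radians(lst_deg - ra_deg)                       # hour angle of the body
    dec, lat = radians(dec_deg), radians(lat_deg)
    alt = asin(sin(dec) * sin(lat) + cos(dec) * cos(lat) * cos(h))
    x = (sin(dec) - sin(alt) * sin(lat)) / (cos(alt) * cos(lat))
    az = acos(max(-1.0, min(1.0, x)))                   # clamp against rounding error
    if sin(h) > 0:                                      # body is west of the meridian
        az = 2 * pi - az
    return degrees(alt), degrees(az)

def is_shootable(body, lst_deg, lat_deg, cam_elev_deg, cam_az_deg, half_fov_deg=30.0):
    """Rough box-shaped field-of-view test against the terminal's elevation and azimuth."""
    alt, az = horizontal_coords(body["ra"], body["dec"], lst_deg, lat_deg)
    d_az = abs((az - cam_az_deg + 180.0) % 360.0 - 180.0)  # wrapped azimuth difference
    return abs(alt - cam_elev_deg) <= half_fov_deg and d_az <= half_fov_deg
```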
Illustratively, the celestial body image includes at least one visible celestial body; the brightness value of the visible celestial body in the celestial body image is greater than a preset brightness threshold;
the calculating module 1003 is further configured to obtain a brightness value of at least one visible celestial body in the celestial body image.
Correspondingly, the calculating module 1003 is specifically configured to calculate the brightness value corresponding to each celestial body after a preset exposure time according to the obtained brightness value of the at least one visible celestial body, the current exposure parameter of the terminal, and the magnitudes of all the celestial bodies.
Illustratively, the calculating module 1003 is specifically configured to calculate the brightness value of the visible celestial body after the preset exposure time according to the obtained brightness value of the at least one visible celestial body and the current exposure parameter of the terminal; and to calculate the brightness values corresponding to other celestial bodies after the preset exposure time according to the brightness values of the visible celestial bodies after the preset exposure time and the magnitudes of all the celestial bodies; the other celestial bodies include: the celestial bodies remaining after the obtained at least one visible celestial body is removed from all the determined celestial bodies.
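A minimal structural sketch of how these four modules could be wired together (class and method names are illustrative only and do not appear in the original):

```python
class CelestialTerminal:
    """Composes the shooting, determining, calculating and processing modules."""

    def __init__(self, shooting, determining, calculating, processing):
        self.shooting = shooting        # module 1001: captures one frame
        self.determining = determining  # module 1002: looks up shootable bodies and their magnitudes
        self.calculating = calculating  # module 1003: predicts brightness after the preset exposure time
        self.processing = processing    # module 1004: writes the predicted brightness into the frame

    def capture(self, shooting_params, exposure_time_s):
        image = self.shooting.capture_frame()
        bodies = self.determining.shootable_bodies(shooting_params)
        brightness = self.calculating.brightness_after(bodies, image, exposure_time_s)
        return self.processing.compose(image, brightness)
```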
In practical applications, the photographing module 1001, the determining module 1002, the calculating module 1003 and the Processing module 1004 can be implemented by a Central Processing Unit (CPU), a MicroProcessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like in the terminal.
Fourth embodiment
For the method of the embodiments of the present invention, an embodiment of the present invention further provides another terminal for celestial body shooting.
Fig. 11 is a schematic diagram of a second composition structure of a terminal for celestial body shooting according to an embodiment of the present invention. As shown in fig. 11, the terminal 11 may include: a communication interface 1101, a memory 1102, a processor 1103, a bus 1104, a camera 1105 and a display 1106;
the bus 1104 is used for connecting the communication interface 1101, the processor 1103 and the memory 1102, and for intercommunication among these devices;
the camera 1105 is used for shooting celestial bodies to obtain celestial body image data.
The communication interface 1101 is configured to perform data transmission with an external network element;
the memory 1102 is used for storing instructions and data;
the processor 1103 is configured to process the captured celestial body image. The method is specifically used for: obtaining the local fixed star time of the current geographic position; determining all celestial bodies which can be shot by the terminal according to the information; further obtaining stars and the like corresponding to all celestial bodies; and also for calculating the brightness values of all celestial bodies after a preset exposure time. And replacing the brightness value of the corresponding position in the celestial body image by using the acquired brightness value of the celestial body to obtain a final celestial body image.
The display 1106 is used for displaying the final celestial body image.
In practical applications, the Memory 1102 may be a volatile Memory (volatile Memory), such as a Random-Access Memory (RAM); or a non-volatile Memory (non-volatile Memory) such as a Read-Only Memory (ROM), a flash Memory (flash Memory), a Hard Disk (HDD), or a Solid-State Drive (SSD); or a combination of the above types of memories and provides instructions and data to the processor 1103.
The processor 1103 may be at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), an FPGA, a DSP, a CPU, a controller, a microcontroller, and a microprocessor. It is to be understood that the electronic device implementing the above processor functions may be another electronic device, and the embodiment of the present invention is not specifically limited in this regard.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.
Claims (10)
1. A method of celestial body capture, the method comprising:
shooting a frame of celestial body image by the terminal;
determining magnitudes of all celestial bodies that can be shot according to the current shooting parameters;
calculating the brightness value corresponding to each celestial body after the preset exposure time according to the determined magnitudes of the celestial bodies;
and replacing the brightness value of the corresponding position in the celestial body image by the brightness value corresponding to each celestial body to obtain a final celestial body image.
2. The method of claim 1, wherein the current shooting parameters comprise: the local sidereal time of the current geographic position of the terminal and the current posture information of the terminal;
the posture information includes: the current elevation and azimuth of the terminal.
3. The method of claim 2, wherein the determining magnitudes of all celestial bodies that can be shot according to the current shooting parameters comprises:
determining, in a celestial body database, all celestial bodies that can be shot by the terminal according to the local sidereal time of the current geographic position of the terminal and the current posture information, and determining the magnitudes of all the celestial bodies.
4. The method of claim 1, wherein the celestial body image comprises at least one visible celestial body; the brightness value of the visible celestial body in the celestial body image is greater than a preset brightness threshold;
after the calculating the brightness value corresponding to each celestial body after the preset exposure time according to the determined magnitudes of the celestial bodies, the method comprises:
acquiring the brightness value of at least one visible celestial body in the celestial body image;
and calculating the brightness value corresponding to each celestial body after the preset exposure time according to the obtained brightness value of the at least one visible celestial body, the current exposure parameter of the terminal, and the magnitudes of all the celestial bodies.
5. The method according to claim 4, wherein the calculating the brightness value corresponding to each celestial body after the preset exposure time according to the obtained brightness value of the at least one visible celestial body, the current exposure parameter of the terminal, and the magnitudes of all the celestial bodies comprises:
calculating the brightness value of the visible celestial body after the preset exposure time according to the obtained brightness value of at least one visible celestial body and the current exposure parameter of the terminal;
calculating the brightness values corresponding to other celestial bodies after the preset exposure time according to the brightness values of the visible celestial bodies after the preset exposure time and the magnitudes of all the celestial bodies; wherein the other celestial bodies include: the celestial bodies remaining after the obtained at least one visible celestial body is removed from all the determined celestial bodies.
6. A terminal, characterized in that the terminal comprises: a shooting module, a determining module, a calculating module and a processing module, wherein,
the shooting module is used for shooting a frame of celestial body image;
the determining module is used for determining magnitudes of all celestial bodies that can be shot according to the current shooting parameters;
the calculation module is used for calculating the brightness value corresponding to each celestial body after the preset exposure time according to the determined magnitudes of the celestial bodies;
and the processing module is used for replacing the brightness value of the corresponding position in the celestial body image by utilizing the brightness value corresponding to each celestial body to obtain the final celestial body image.
7. The terminal of claim 6, wherein the current shooting parameters comprise: the local sidereal time of the current geographic position of the terminal and the current posture information of the terminal; the posture information includes: the current elevation and azimuth of the terminal.
8. The terminal according to claim 7, wherein the determining module is specifically configured to determine, in a celestial body database, all celestial bodies that can be shot by the terminal according to the local sidereal time of the current geographic position of the terminal and the current posture information, and to determine the magnitudes of all the celestial bodies.
9. The terminal of claim 6, wherein the celestial body image includes at least one visible celestial body; the brightness value of the visible celestial body in the celestial body image is greater than a preset brightness threshold;
the calculating module is further used for acquiring the brightness value of at least one visible celestial body in the celestial body image;
correspondingly, the calculating module is specifically configured to calculate the brightness value corresponding to each celestial body after the preset exposure time according to the obtained brightness value of the at least one visible celestial body, the current exposure parameter of the terminal, and the magnitudes of all the celestial bodies.
10. The terminal according to claim 9, wherein the calculating module is specifically configured to calculate, according to the obtained brightness value of the at least one visible celestial body and the current exposure parameter of the terminal, the brightness value of the visible celestial body after the preset exposure time; and to calculate the brightness values corresponding to other celestial bodies after the preset exposure time according to the brightness values of the visible celestial bodies after the preset exposure time and the magnitudes of all the celestial bodies; the other celestial bodies include: the celestial bodies remaining after the obtained at least one visible celestial body is removed from all the determined celestial bodies.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710210115.7A CN107426506B (en) | 2017-03-31 | 2017-03-31 | A kind of method and terminal of celestial body shooting |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107426506A CN107426506A (en) | 2017-12-01 |
CN107426506B true CN107426506B (en) | 2019-10-29 |
Family
ID=60423150
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710210115.7A Active CN107426506B (en) | 2017-03-31 | 2017-03-31 | A kind of method and terminal of celestial body shooting |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107426506B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3783863B2 (en) * | 2002-10-22 | 2006-06-07 | 三菱電機株式会社 | Astronomical detector |
CN102538757A (en) * | 2011-11-08 | 2012-07-04 | 昆明理工大学 | Method for observing bright source and surrounding dark source targets |
CN104092948A (en) * | 2014-07-29 | 2014-10-08 | 小米科技有限责任公司 | Method and device for processing image |
CN104134225A (en) * | 2014-08-06 | 2014-11-05 | 深圳市中兴移动通信有限公司 | Picture synthetic method and device |
CN105141825A (en) * | 2015-06-23 | 2015-12-09 | 努比亚技术有限公司 | Heavenly body shooting method and device |
CN105516611A (en) * | 2014-10-08 | 2016-04-20 | 奥林巴斯株式会社 | An imaging device and a shooting method |
CN105554370A (en) * | 2014-10-22 | 2016-05-04 | 佳能株式会社 | Image processing apparatus and image processing method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5751014B2 (en) * | 2010-05-28 | 2015-07-22 | リコーイメージング株式会社 | Astronomical auto tracking imaging method and astronomical auto tracking imaging device |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |