Disclosure of Invention
The following presents a simplified summary of the invention in order to provide a basic understanding of some of its aspects. This summary is not an exhaustive overview of the invention; it is not intended to identify key or critical elements of the invention, nor to delimit its scope. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that follows.
In view of this, the present invention provides a shared viewing apparatus, so as to at least solve the problem that the prior art cannot realize low-cost, real-time viewing of a scenic spot from arbitrary viewing angles, shared among multiple users.
According to an aspect of the present invention, there is provided a shared viewing apparatus, disposed on a mobile device capable of viewing a scene or at a fixed position in a viewing area, the shared viewing apparatus including an image/video acquisition unit, a storage unit, a processing unit, and a wireless transmission unit. Upon recognizing an acquisition instruction, the processing unit causes the image/video acquisition unit to begin acquiring image/video data within its field of view and stores the acquired image/video data in the storage unit, the image/video data comprising a wide-angle image sequence of one or more wide-angle images. The processing unit is further configured to receive, through the wireless transmission unit, at least one transmission instruction from one or more user devices and, for each transmission instruction, to retrieve from the storage unit the image frame sequence matching the sending pose contained in that instruction and send it to the user device that issued the instruction.
Further, the shared viewing apparatus further comprises an attitude sensor and a direction correction unit. Upon recognition of the acquisition instruction, the attitude sensor acquires attitude data of the shared viewing apparatus, the attitude data comprising a framing direction information sequence corresponding to the wide-angle image sequence. The direction correction unit performs direction correction on the image/video data using the attitude data, so that the direction information of each pixel position in each wide-angle image is corrected from a relative direction to an absolute direction.
Further, the processing unit retrieves the image frame sequence matching the sending pose contained in a transmission instruction from the storage unit as follows. It first determines whether the received transmission instruction contains a send time. If it does, the processing unit selects, from the image/video data stored in the storage unit, the wide-angle frames whose acquisition time is at or after the send time as candidate images, until a new transmission instruction is received from the user device that issued the instruction; if it does not, the processing unit selects the last wide-angle frame currently stored, together with the frames stored thereafter, as candidate images, again until a new transmission instruction is received from the same user device. For each frame among the selected candidate images, a screenshot corresponding to the sending pose is cropped from the wide-angle image using the direction information of each of its pixel positions, the size of the screenshot being a preset value. All cropped screenshots are then assembled in chronological order into the image frame sequence matching the sending pose contained in the transmission instruction.
Further, the shared viewing apparatus further comprises a user management unit, which receives an identity authentication request from each user device, authenticates the corresponding user device based on the request, and, upon successful authentication, sends authentication success information to that user device so as to establish a data connection with it.
Further, the image/video acquisition unit comprises a wide-angle camera module.
Further, the processing unit is further configured to: upon receiving at least one transmission instruction from one or more user devices, determine for each transmission instruction whether it contains paid information, and establish a data connection with the user device that issued the instruction only in the case that it does.
Further, the paid information is obtained by means of online payment.
Further, the sending pose is: the screen normal direction of the display module in the corresponding user device, or the normal direction of the back of the screen.
Further, the mobile device capable of viewing the scene is: an unmanned aerial vehicle, a mobile robotic arm, a cable car, a sightseeing vehicle, a train, an airplane, or a ship.
Using a mobile device capable of viewing the scene, or a wide viewing angle at a specific location, the shared viewing apparatus of the invention realizes low-cost, low-risk wide-angle viewing and multi-user sharing that cannot be achieved with traditional equipment such as a telescope, and provides ordinary on-site users with a wider and freer viewing angle. It has the following beneficial effects:
1) the shared viewing apparatus is arranged on a mobile device capable of viewing the scene, or at a fixed position at a sightseeing site, and through it a user can view, on the user's own device, unique scenery from positions and viewing angles the user cannot reach in person;
2) a user can use the shared viewing apparatus to shoot photos and videos of the scenery, or self-portraits, from a unique position and viewing angle;
3) multiple users can share the apparatus simultaneously to watch scenes or shoot videos and images;
4) the apparatus supports independent selection of the viewing angle by multiple users, with the viewing direction of the image displayed on each user device's screen matched to that device's screen orientation;
when the apparatus is in use, a user obtains image frame sequences with different viewing angles by changing the sending pose (for example, by changing the pose of the user device, or by setting different sending poses through touch-screen operations); with multiple users, the image frame sequence the shared viewing apparatus returns to each user differs because each user device's sending pose differs, so the images displayed on the devices differ and each user independently selects a viewing angle;
further, after direction correction by the direction correction unit, the viewing direction of each frame of the image frame sequence displayed on the screen of the user device coincides with the screen-back normal direction (or screen normal direction) of the display module of the user device that issued the transmission instruction. Moreover, the user can switch between the screen-back normal direction and the screen normal direction, enabling quick switching between shooting the outside scene and self-shooting.
5) the user can pay the rental fee and the framing fee electronically through a mobile terminal such as a mobile phone;
6) a scenic spot or its operators can share the best viewing experiences of the spot with more users by arranging the shared viewing apparatus on mobile devices capable of viewing, such as an unmanned aerial vehicle, a mobile robotic arm, a cable car, a sightseeing vehicle, a train, an airplane, or a ship.
These and other advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments of the present invention, taken in conjunction with the accompanying drawings.
Detailed Description
Exemplary embodiments of the present invention will be described hereinafter with reference to the accompanying drawings. In the interest of clarity and conciseness, not all features of an actual implementation are described in the specification. It will of course be appreciated that in the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the device structures and/or processing steps closely related to the solution according to the present invention are shown in the drawings, and other details not so relevant to the present invention are omitted.
The embodiment of the invention provides a shared viewing apparatus, which is disposed on a mobile device capable of viewing a scene, or at a fixed position in a viewing area, and comprises an image/video acquisition unit, a storage unit, a processing unit, and a wireless transmission unit. Upon recognizing an acquisition instruction, the processing unit causes the image/video acquisition unit to begin acquiring image/video data within its field of view and stores the acquired data in the storage unit; the image/video data comprises a wide-angle image sequence of one or more wide-angle images. The processing unit further receives, through the wireless transmission unit, at least one transmission instruction from one or more user devices and, for each transmission instruction, retrieves from the storage unit the image frame sequence matching the sending pose contained in that instruction and sends it to the user device that issued the instruction.
An example of the shared viewing apparatus of the present invention is described below with reference to fig. 1. The shared viewing apparatus can be arranged on a mobile device capable of viewing the scene. Such a mobile device may be a sightseeing vehicle (such as a cable car, a sightseeing car, a train, an airplane, or a ship), or a viewing-capable mobile object such as an unmanned aerial vehicle or a mobile robotic arm. In addition, the shared viewing apparatus of the present invention may also be located at a fixed location in the viewing area, such as somewhere on a cliff. It should be noted that the vehicles referred to in the present invention are not limited to manned vehicles; they may also be unmanned vehicles (e.g., vehicles for transporting goods).
As shown in fig. 1, the shared viewing apparatus includes an image/video capturing unit 1, a storage unit 2, a processing unit 3, and a wireless transmission unit 4.
The processing unit 3, upon recognizing the acquisition instruction, controls the image/video acquisition unit 1 to start acquiring image/video data within its field of view. The image/video data is, for example, video data or image set data. The processing unit 3 may be, for example, a CPU, a microprocessor, or another component with control and processing functions. The wireless transmission unit 4 is, for example, a Wi-Fi transmission module or a Bluetooth transmission module.
In one example, the acquisition instruction may come from a control device such as a remote controller or control keys, and the processing unit 3 may receive the acquisition instruction through the wireless transmission unit 4; the control device may be operated by scenic-spot staff, for example.
In another example, after receiving a transmission instruction from a user device through the wireless transmission unit 4 and establishing a data connection with that device, the processing unit 3 generates an acquisition instruction, and the image/video acquisition unit 1 then acquires the image/video data according to that instruction.
The image/video acquisition unit 1 is, for example, a wide-angle camera module. The image/video data includes a wide-angle image sequence composed of one or more wide-angle images, and each pixel position of each wide-angle image in the sequence has direction information (such as a relative direction). The relative direction corresponding to each pixel position of a wide-angle image is a direction relative to the overall orientation of the image/video acquisition unit 1, i.e., a relative quantity. When the image/video acquisition unit 1 is disposed on a mobile device capable of viewing the scene, the orientation of the unit changes as the orientation of the mobile device changes during travel or flight, while the relative direction of the pixel positions of the wide-angle image (i.e., the direction with respect to the unit) does not change. The relative direction corresponding to each pixel position of the wide-angle image can be obtained using the prior art, for example calculated from the orientation of the image/video acquisition unit 1 and the respective orientations of the lenses it contains.
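The specification leaves the pixel-to-direction mapping to the prior art. As a minimal illustrative sketch only, assuming the wide-angle frame uses an equirectangular projection (an assumption; the invention does not prescribe any particular projection, and the function name is hypothetical), the relative direction of a pixel can be derived from its coordinates alone:

```python
def pixel_relative_direction(u, v, width, height, h_fov=360.0, v_fov=180.0):
    """Map pixel (u, v) of a wide-angle image to a relative direction,
    expressed as (azimuth, elevation) in degrees in the acquisition
    unit's own frame (azimuth 0 = straight ahead of the unit)."""
    azimuth = (u / width - 0.5) * h_fov      # -180..180 deg around the unit
    elevation = (0.5 - v / height) * v_fov   # -90..90 deg, up is positive
    return azimuth, elevation

# The pixel at the image centre looks straight ahead of the unit.
print(pixel_relative_direction(960, 540, 1920, 1080))  # -> (0.0, 0.0)
```

This relative direction is what the first implementation stores per pixel; the second implementation corrects it to an absolute direction as described below.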
In one implementation (hereinafter referred to as a first implementation), the processing unit 3 stores the image/video data captured by the image/video capturing unit 1 in the storage unit 2. In this implementation, in the image/video data stored in the storage unit 2, the direction information corresponding to each pixel position in each wide-angle image is a relative direction.
In another implementation (hereinafter referred to as the second implementation), when the shared viewing apparatus is provided on a mobile device capable of viewing the scene, the apparatus further includes an attitude sensor and a direction correction unit (not shown in the figure). Upon recognizing the above-described acquisition instruction, the attitude sensor captures the attitude data of the shared viewing apparatus in real time (since the image/video acquisition unit 1 is mounted on the shared viewing apparatus, the attitude data of the apparatus corresponds to the attitude data of the unit).
The attitude data of the shared viewing apparatus includes a framing direction information sequence corresponding to the wide-angle image sequence acquired by the image/video acquisition unit 1. That is, each wide-angle frame captured by the unit has one corresponding entry in the framing direction information sequence, and each entry describes the overall orientation of the image/video acquisition unit 1. The unit is mounted on the shared viewing apparatus, and it is assumed here that the overall orientation of the unit coincides with the straight-ahead direction of the shared viewing apparatus (e.g., straight ahead of the viewing-capable mobile device). Thus, when the mobile device is driving or flying due south at the moment the unit captures a wide-angle frame (which is stored in the wide-angle image sequence), the framing direction information corresponding to that frame (captured by the attitude sensor and stored in the framing direction information sequence) is due south, and so on.
For example, assume that the wide-angle image sequence acquired by the image/video acquisition unit 1 is denoted by {I1, I2, I3, ..., IM}, where M is the number of wide-angle images in the sequence, a positive integer; and assume that the framing direction information sequence acquired by the attitude sensor (corresponding to the above wide-angle image sequence) is denoted by {D1, D2, D3, ..., DM}. In this example, wide-angle image I1 corresponds to framing direction information D1 (i.e., D1 describes the overall orientation of the image/video acquisition unit 1 when capturing I1), wide-angle image I2 corresponds to framing direction information D2 (i.e., D2 describes the overall orientation of the unit when capturing I2), and so on.
In this way, the direction correction unit may perform direction correction on the image/video data captured by the image/video capture unit 1 using the attitude data of the shared viewing device so that the direction information of each pixel position in each wide-angle image in the above-described image/video data is corrected from the relative direction to the absolute direction.
Taking the wide-angle image I1 in the above example: before direction correction, the direction information corresponding to each pixel position in I1 is a relative direction. From the corresponding framing direction information D1, the overall orientation of the image/video acquisition unit 1 when capturing I1 is known, and the absolute direction corresponding to each pixel position in I1 can therefore be calculated. Replacing the relative direction of each pixel position in I1 with the corresponding absolute direction completes the direction correction. The processing of the other wide-angle images is similar and is not described again.
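For the azimuth component, the correction can be sketched as follows. This is an illustration only: it reduces the framing direction information Di to a single compass heading and ignores roll and pitch, and the function name is hypothetical.

```python
def to_absolute(az_rel, heading_deg):
    """Correct a relative azimuth (degrees, 0 = straight ahead of the
    acquisition unit) to an absolute compass azimuth, given the framing
    direction Di (the unit's heading when the frame was captured)."""
    return (az_rel + heading_deg) % 360.0

# A pixel 30 deg to the right of a unit heading due south (180 deg)
# points at absolute azimuth 210 deg.
print(to_absolute(30.0, 180.0))  # -> 210.0
```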
The direction correction unit then saves the corrected image/video data in the storage unit 2. That is, in this implementation (i.e., the second implementation), in the image/video data stored in the storage unit 2, the direction information corresponding to each pixel position in each wide-angle image is an absolute direction.
The absolute direction of a pixel position is a direction relative to the earth's absolute coordinate system, whose coordinate axes are invariant; the relative direction of a pixel position is, as described above, a direction relative to the orientation of the image/video acquisition unit 1, and that orientation changes.
Furthermore, the processing unit 3 may also receive at least one transmission instruction from one or more user devices via the wireless transmission unit 4. A transmission instruction sent by a user device contains the sending pose of that device. The sending pose is the pose of the user device at the time it sends the transmission instruction, and may be, for example, the screen normal direction of the device's display module or the normal direction of the back of the screen.
For each of the at least one transmission instruction, the processing unit 3 retrieves from the storage unit 2 the image frame sequence matching the sending pose contained in that instruction and sends it to the user device that issued the instruction. The user device may be a terminal device such as a smartphone, a tablet computer, or a head-mounted VR (virtual reality) system.
When the shared viewing apparatus is provided on a manned vehicle such as a cable car, a sightseeing car, a train, an airplane, or a ship, one or more users ride the vehicle with their respective user devices. When the apparatus is arranged on another viewing-capable mobile device, such as an unmanned aerial vehicle or a mobile robotic arm, that device is at the sightseeing site and the one or more users carrying their respective user devices are also at the site. Likewise, when the apparatus is located at a fixed location, such as at a tourist site, the one or more users carrying their respective user devices are located at that site.
According to one implementation, the processing unit 3 may retrieve from the storage unit 2 the image frame sequence matching the sending pose contained in a transmission instruction through steps 101 to 103, described below.
Step 101 is performed first. In step 101, it is determined whether the received transmission instruction includes a send time. 1) If it does, the wide-angle frames whose acquisition time is at or after the send time are selected from the image/video data stored in the storage unit 2 as candidate images, until a new transmission instruction is received from the user device that issued the instruction. 2) If it does not, the last wide-angle frame currently stored, together with the frames to be stored thereafter, is selected from the image/video data stored in the storage unit 2 as candidate images, again until a new transmission instruction is received from the same user device. The send time is the time at which the corresponding user device sent the transmission instruction.
It should be noted that the frames "whose acquisition time is at or after the send time" in case 1) may number one or more, or zero (for example, when the new transmission instruction is received immediately after a single frame whose acquisition time equals the send time has been selected). Likewise, the "frames to be stored thereafter" selected in case 2) together with the last currently stored frame may number one or more, or zero (for example, when the new transmission instruction is received immediately after the last currently stored frame has been selected).
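The selection logic of step 101 can be sketched as follows. The tuple layout and the `stop_time` parameter, which stands in for "until a new transmission instruction is received", are illustrative assumptions, not part of the specification.

```python
def select_candidates(frames, send_time=None, stop_time=None):
    """frames: list of (capture_time, image) tuples in storage order.
    With a send time, keep frames captured at or after it; without one,
    keep the newest stored frame (and, in a live system, frames stored
    later). stop_time marks the arrival of a new transmission instruction."""
    if send_time is not None:
        picked = [f for f in frames if f[0] >= send_time]
    else:
        picked = frames[-1:]  # last frame currently stored
    if stop_time is not None:
        picked = [f for f in picked if f[0] < stop_time]
    return picked

frames = [(1, "I1"), (2, "I2"), (3, "I3")]
print(select_candidates(frames, send_time=2))  # -> [(2, 'I2'), (3, 'I3')]
print(select_candidates(frames))               # -> [(3, 'I3')]
```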
Next, in step 102, for each wide-angle frame among the candidate images selected in step 101, a screenshot whose direction corresponds to the sending pose in the transmission instruction is cropped from that frame using the direction information corresponding to each of its pixel positions. The size of the screenshot is a preset value, which may be set from an empirical value or, for example, from a screen size parameter included in the transmission instruction sent by the user device.
Then, in step 103, all screenshots cropped in step 102 are assembled in chronological order into an image frame sequence, which is the image frame sequence matching the sending pose included in the transmission instruction.
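Steps 102 and 103 can be sketched together. This is a simplified illustration: the crop is reduced to a fixed-size window centred on the column whose azimuth matches the sending pose, an equirectangular frame layout is assumed, and all names are hypothetical.

```python
def crop_for_pose(wide_image, pose_azimuth, crop_w, crop_h):
    """wide_image: 2D list of pixels of an equirectangular wide-angle frame.
    Return the crop_w x crop_h screenshot centred on the column whose
    direction matches the requested azimuth (degrees)."""
    height = len(wide_image)
    width = len(wide_image[0])
    cx = int(((pose_azimuth % 360.0) / 360.0) * width)
    cy = height // 2
    x0 = max(0, min(cx - crop_w // 2, width - crop_w))
    y0 = max(0, min(cy - crop_h // 2, height - crop_h))
    return [row[x0:x0 + crop_w] for row in wide_image[y0:y0 + crop_h]]

def frame_sequence(candidates, pose_azimuth, crop_w, crop_h):
    """Steps 102-103: crop every candidate frame for the sending pose
    and assemble the screenshots in chronological order."""
    ordered = sorted(candidates, key=lambda c: c[0])
    return [crop_for_pose(img, pose_azimuth, crop_w, crop_h)
            for _, img in ordered]
```

In the first implementation `pose_azimuth` would be interpreted relative to the acquisition unit; in the second, after direction correction, as an absolute compass azimuth.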
Note that in the first implementation, the direction information stored in the storage unit 2 for each pixel position of each wide-angle image is a relative direction. In this case, for a given wide-angle image, the screenshot cropped in step 102 "whose direction corresponds to the sending pose" is in fact one whose relative direction corresponds to the sending pose. The image or video finally viewed on the user device then appears as if the user were sitting on the corresponding viewing-capable mobile device: even if the user device is held pointing in one fixed direction, the displayed image or video changes as the travel or flight direction of the mobile device changes.
Further, in the second implementation, the direction information stored in the storage unit 2 for each pixel position of each wide-angle image is an absolute direction. In this case, for a given wide-angle image, the screenshot cropped in step 102 is one whose absolute direction corresponds to the sending pose. The image or video finally seen on the user device therefore shows the absolute direction pointed to by the screen-back normal (or screen normal) of the device; if the device is held pointing in one direction, the displayed image or video does not change with the travel or flight direction of the viewing-capable mobile device. In other words, after direction correction in the second implementation, the viewing direction of each frame of the image frame sequence displayed on the user device's screen coincides with the screen-back normal direction (or screen normal direction) of the display module of the user device that issued the transmission instruction. The user can switch between the screen-back normal direction and the screen normal direction, quickly switching between shooting the outside scene and self-shooting. It should further be noted that, for a single frame, the viewing direction may be taken as the viewing direction of the image's centre position, i.e., the direction in which that centre was photographed.
According to another implementation, upon receiving at least one transmission instruction from one or more user devices, the processing unit 3 may further determine, for each of the at least one transmission instruction, whether it contains paid information. It establishes a data connection with the user device that issued the instruction if the instruction contains paid information, and does not establish the connection if it does not. The paid information is obtained through an online payment method such as code-scanning payment, for example.
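The paid-information gate can be sketched as follows. The field names `paid` and `device_id` are hypothetical, since the specification does not define the format of a transmission instruction.

```python
def handle_transmission(instruction, connections):
    """Establish a data connection only when the transmission
    instruction carries paid information (hypothetical field names)."""
    if instruction.get("paid"):
        connections.add(instruction["device_id"])
        return True
    return False

conns = set()
print(handle_transmission({"device_id": "phone-1", "paid": True}, conns))   # -> True
print(handle_transmission({"device_id": "phone-2", "paid": False}, conns))  # -> False
print(sorted(conns))  # -> ['phone-1']
```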
Furthermore, according to one implementation, the shared viewing apparatus may further include a user management unit. The user management unit may receive an identity authentication request from each user device through the wireless transmission unit 4, authenticate the corresponding user device based on the request, and, upon successful authentication, send authentication success information to that device so as to establish a data connection with it.
By means of the shared viewing apparatus, a user can photograph, film, or take self-portraits from positions the user cannot reach, through a user device such as a mobile phone. This greatly enhances the user experience; the apparatus serves multiple users simultaneously and is convenient to operate.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.