
CN106657792B - Shared viewing device - Google Patents


Info

Publication number
CN106657792B
CN106657792B (application CN201710017244.4A)
Authority
CN
China
Prior art keywords: image, transmission instruction, unit, wide, sending
Prior art date
Legal status
Active
Application number
CN201710017244.4A
Other languages
Chinese (zh)
Other versions
CN106657792A (en)
Inventor
朱磊 (Zhu Lei)
韩琦 (Han Qi)
李建英 (Li Jianying)
杨晓光 (Yang Xiaoguang)
Current Assignee
Harbin Institute of Technology
Original Assignee
Harbin Yishe Technology Co Ltd
Application filed by Harbin Yishe Technology Co Ltd filed Critical Harbin Yishe Technology Co Ltd
Priority to CN201710017244.4A priority Critical patent/CN106657792B/en
Publication of CN106657792A publication Critical patent/CN106657792A/en
Application granted granted Critical
Publication of CN106657792B publication Critical patent/CN106657792B/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60: Control of cameras or camera modules
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08: Control of attitude, i.e. control of roll, pitch, or yaw
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/76: Television signal recording
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides a shared viewing device that supports multiple users independently selecting their viewing angles. The device can be arranged on a mobile device suitable for sightseeing or at a fixed position in a scenic area. It comprises an image/video acquisition unit, a storage unit, a processing unit, and a wireless transmission unit. After an acquisition instruction is recognized, the image/video acquisition unit captures image/video data within its field of view and stores it in the storage unit; the image/video data comprises a wide-view-angle image sequence of one or more wide-view-angle images. The processing unit receives at least one transmission instruction from one or more user devices through the wireless transmission unit and, for each transmission instruction, retrieves from the storage unit the image frame sequence matching the sending gesture contained in that instruction and sends it to the user device that issued the instruction. The invention provides users with a free-view-angle viewing experience of the sightseeing scene.

Description

Shared viewing device
Technical Field
The invention relates to information processing technology, and in particular to a shared viewing device.
Background
With the growth of the tourism economy, the tourism industry has developed vigorously, and natural landscape is its most important part. The aesthetic appeal of natural scenery often depends on the angle and time of viewing; much beautiful scenery can only be seen from very particular positions and angles, as the saying goes, "the most boundless views are found on perilous peaks." Unfortunately, for reasons of cost and safety, most scenic areas offer viewing only along established routes, so visitors have no opportunity to appreciate the scenery from the optimal viewing angle. Conversely, if users could enjoy scenic spots from any position and at any viewing angle, their sightseeing experience would certainly be greatly improved.
Existing means of expanding the sightseeing viewing angle or field of view are mainly telescopes, cable cars, hot-air balloons, photographs, documentaries (including 3D documentaries), virtual reality (VR), and the like; none of these lets a visitor achieve, at very low cost, a free-view-angle viewing experience of the sightseeing site itself.
Disclosure of Invention
The following presents a simplified summary of the invention in order to provide a basic understanding of some aspects of the invention. It should be understood that this summary is not an exhaustive overview of the invention. It is not intended to identify key or critical elements of the invention or to delineate its scope. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that follows.
In view of this, the present invention provides a shared viewing device so as to at least solve the problem that the prior art cannot provide low-cost, real-time, multi-user shared viewing from arbitrary viewing angles of a scenic spot.
According to an aspect of the present invention, there is provided a shared viewing device disposed on a mobile device suitable for sightseeing or at a fixed position in a viewing area. The shared viewing device includes an image/video acquisition unit, a storage unit, a processing unit, and a wireless transmission unit. The processing unit is configured to, after an acquisition instruction is recognized, start capturing image/video data within the field of view of the image/video acquisition unit and store the captured data in the storage unit, where the image/video data comprises a wide-view-angle image sequence of one or more wide-view-angle images. The processing unit is further configured to receive at least one transmission instruction from one or more user devices through the wireless transmission unit and, for each of the at least one transmission instruction, retrieve from the storage unit the image frame sequence matching the sending gesture contained in that instruction and send it to the user device that issued the instruction.
Further, the shared viewing device further comprises an attitude sensor and a direction correction unit. The attitude sensor is configured to acquire attitude data of the shared viewing device after the acquisition instruction is recognized, the attitude data comprising a viewing-direction information sequence corresponding to the wide-view-angle image sequence. The direction correction unit is configured to perform direction correction on the image/video data using the attitude data of the shared viewing device, so that the direction information of each pixel position in each wide-view-angle image in the image/video data is corrected from a relative direction to an absolute direction.
Further, the processing unit retrieves the image frame sequence matching the sending gesture contained in a transmission instruction from the storage unit as follows. First, it determines whether the received transmission instruction contains a sending time. If it does, it selects from the image/video data stored in the storage unit, as candidate images, the wide-view-angle images whose acquisition time is at or after the sending time, until a new transmission instruction is received from the user device that issued the instruction. If the transmission instruction contains no sending time, it selects the last currently stored wide-view-angle image frame and the frames to be stored thereafter as candidate images, again until a new transmission instruction is received from the same user device. Then, for each frame among all the candidate images, it captures, using the direction information corresponding to each pixel position of that wide-view-angle image, a screenshot whose direction corresponds to the sending gesture, the size of the screenshot being a preset value. Finally, it forms all the captured screenshots into an image frame sequence in chronological order; this sequence matches the sending gesture contained in the transmission instruction.
Further, the shared viewing device further comprises a user management unit. The user management unit is configured to receive an identity authentication request from each user device, authenticate the corresponding user device based on that request, and, upon successful authentication, send authentication-success information to the user device so as to establish a data connection with it.
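The authentication flow of the user management unit might be sketched as follows. The credential scheme, field names, and session-token issuance are illustrative assumptions; the patent only specifies receiving an identity authentication request, verifying it, and returning success information before establishing a data connection.

```python
import hashlib
import secrets

class UserManagementUnit:
    """Sketch of the user management unit's authentication flow.

    Assumption: user devices are identified by a user_id/password pair;
    the patent does not specify the credential scheme.
    """

    def __init__(self, registered_users):
        # registered_users: dict mapping user_id -> SHA-256 password hash
        self.registered_users = registered_users
        self.sessions = {}  # user_id -> session token for the data connection

    def authenticate(self, request):
        """Verify an identity authentication request from a user device."""
        user_id = request.get("user_id")
        password = request.get("password", "")
        expected = self.registered_users.get(user_id)
        if expected is None:
            return {"status": "failure"}
        if hashlib.sha256(password.encode()).hexdigest() != expected:
            return {"status": "failure"}
        # Authentication succeeded: issue a session token so a data
        # connection can be established with this user device.
        token = secrets.token_hex(16)
        self.sessions[user_id] = token
        return {"status": "success", "session_token": token}
```

In use, the device would keep the data connection open only for user devices holding a valid session token.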
Further, the image/video acquisition unit comprises a wide-angle camera module.
Further, the processing unit is further configured to: upon receiving at least one transmission instruction from one or more user devices, determine, for each transmission instruction, whether it contains paid information, and establish a data connection with the user device that issued the instruction only if it does.
Further, the paid information is obtained by means of online payment.
Further, the sending gesture includes: the screen normal direction of the display module in the corresponding user device, or the normal direction of the back of that screen.
Further, the mobile device suitable for sightseeing is: an unmanned aerial vehicle, a mobile robotic arm, a cable car, a sightseeing vehicle, a train, an airplane, or a ship.
By using a mobile device suitable for sightseeing, or a wide viewing angle at a particular position, the shared viewing device of the invention achieves low-cost, low-risk wide-view-angle viewing and multi-user sharing that traditional equipment such as telescopes cannot provide, and offers ordinary users on site a wider and freer viewing angle. It has the following beneficial effects:
1) the shared viewing device is arranged on a mobile device suitable for sightseeing or at a fixed position of a sightseeing site, and through it a user can view, on the user device, unique scenery from positions and viewing angles the user could not otherwise reach;
2) the user can use the shared viewing device to shoot scenery, or self-portrait photos and videos, from unique positions and viewing angles;
3) multiple users can share the device to watch scenes or shoot videos and images at the same time;
4) the device supports multiple users independently selecting their viewing angles, with the viewing direction of the image displayed on each user device's screen matching that device's screen orientation;
in use, a user obtains image frame sequences with different viewing angles by changing the sending gesture (for example, by changing the attitude of the user device, or by setting different sending gestures through touch-screen operations); thus, with multiple users, the image frame sequence returned by the shared viewing device differs for each user because each user device's sending gesture differs, so the images displayed on each user device differ, and the users select their viewing angles independently;
further, after direction correction by the direction correction unit, the viewing direction corresponding to each frame of the image frame sequence displayed on the user device's screen can be made to coincide with the screen-back normal direction (or screen normal direction) of the display module of the user device that issued the transmission instruction; moreover, the user can switch between the screen-back normal direction and the screen normal direction, enabling quick switching between shooting the outside scene and self-portrait shooting.
5) the user can pay the rental fee and viewing fee electronically through a mobile terminal such as a mobile phone;
6) the scenic area or related operators can share the best viewing experiences of the scenic spots with more users by arranging the shared viewing device on mobile equipment usable for sightseeing, such as an unmanned aerial vehicle, a mobile robotic arm, a cable car, a sightseeing vehicle, a train, an airplane, or a ship.
These and other advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments of the present invention, taken in conjunction with the accompanying drawings.
Drawings
The invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like reference numerals are used throughout the figures to indicate like or similar parts. The accompanying drawings, which are incorporated in and form a part of this specification, illustrate preferred embodiments of the present invention and, together with the detailed description, serve to further explain the principles and advantages of the invention. In the drawings:
fig. 1 is a block diagram schematically showing the configuration of one example of the shared viewing apparatus of the present invention.
Skilled artisans appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve the understanding of the embodiments of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described hereinafter with reference to the accompanying drawings. In the interest of clarity and conciseness, not all features of an actual implementation are described in the specification. It will of course be appreciated that in the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the device structures and/or processing steps closely related to the solution according to the present invention are shown in the drawings, and other details not so relevant to the present invention are omitted.
The embodiment of the invention provides a shared viewing device, which is arranged on a mobile device suitable for sightseeing or at a fixed position in a viewing area and comprises an image/video acquisition unit, a storage unit, a processing unit, and a wireless transmission unit. The processing unit is configured to, after an acquisition instruction is recognized, start capturing image/video data within the field of view of the image/video acquisition unit and store the captured data in the storage unit, where the image/video data comprises a wide-view-angle image sequence of one or more wide-view-angle images. The processing unit is further configured to receive at least one transmission instruction from one or more user devices through the wireless transmission unit and, for each of the at least one transmission instruction, retrieve from the storage unit the image frame sequence matching the sending gesture contained in that instruction and send it to the user device that issued the instruction.
An example of the shared viewing device of the present invention is described below with reference to fig. 1. The shared viewing device may be arranged on a mobile device suitable for sightseeing, which may be a sightseeing vehicle (such as a cable car, sightseeing car, train, airplane, or ship) or a sightseeing mobile object such as an unmanned aerial vehicle or a mobile robotic arm. In addition, the shared viewing device of the present invention may also be located at a fixed position in the viewing area, such as somewhere on a cliff. It should be noted that the vehicles referred to in the present invention are not limited to manned vehicles; they may also be unmanned vehicles (e.g., vehicles for transporting goods).
As shown in fig. 1, the shared viewing apparatus includes an image/video capturing unit 1, a storage unit 2, a processing unit 3, and a wireless transmission unit 4.
The processing unit 3, upon recognizing the acquisition instruction, controls the image/video acquisition unit 1 to start acquiring image/video data within its field of view. The image/video data is, for example, video data or image-set data. The processing unit 3 may be, for example, a CPU, a microprocessor, or another component with control and processing functions. The wireless transmission unit 4 is, for example, a WiFi transmission module or a Bluetooth transmission module.
In one example, the acquisition instruction may come from a control device such as a remote controller, a control key, etc., and the processing unit 3 may receive the acquisition instruction through the wireless transmission unit 4, and the control device may be operated by a scenic staff, for example.
In another example, after receiving a transmission instruction from a user device through the wireless transmission unit 4, the processing unit 3 generates an acquisition instruction if a data connection has been established with that user device, and the image/video acquisition unit 1 then captures the image/video data according to the acquisition instruction.
The image/video acquisition unit 1 is, for example, a wide-angle camera module. The image/video data includes a wide-angle image sequence composed of one or more wide-angle images, and each pixel position of each image in the sequence has direction information (such as a relative direction). The relative direction corresponding to each pixel position of a wide-angle image is a direction relative to the overall orientation of the image/video acquisition unit 1, and is a relative quantity. When the image/video acquisition unit 1 is mounted on a mobile device suitable for sightseeing, the orientation of the unit changes as the orientation of the mobile device changes while driving or flying, but the relative direction of each pixel position of the wide-angle image (i.e., the direction with respect to the image/video acquisition unit 1) does not change. The relative direction corresponding to each pixel position can be obtained using prior art, for example, computed from the orientation of the image/video acquisition unit 1 and the respective orientations of the lenses it contains.
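As a rough illustration of how a relative direction can be associated with each pixel position, the following sketch uses a simple pinhole-camera model; a real wide-angle module would instead use its calibrated (e.g., fisheye) projection, and the function name and parameters here are assumptions rather than anything specified by the patent:

```python
import math

def pixel_relative_direction(u, v, width, height, hfov_deg):
    """Relative direction (azimuth, elevation) in degrees of pixel (u, v),
    measured from the optical axis of the acquisition unit.

    Assumes a simple pinhole model with square pixels; a wide-angle module
    would use its calibrated projection model instead.
    """
    # Focal length in pixels, derived from the horizontal field of view.
    f = (width / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    # Offsets from the image center (x to the right, y upward).
    x = u - width / 2.0
    y = height / 2.0 - v
    azimuth = math.degrees(math.atan2(x, f))
    elevation = math.degrees(math.atan2(y, math.hypot(x, f)))
    return azimuth, elevation
```

For a 1920x1080 image with a 120-degree horizontal field of view, the center pixel maps to azimuth 0 and elevation 0, and pixels at the right edge map to an azimuth near half the field of view, as expected.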
In one implementation (hereinafter referred to as a first implementation), the processing unit 3 stores the image/video data captured by the image/video capturing unit 1 in the storage unit 2. In this implementation, in the image/video data stored in the storage unit 2, the direction information corresponding to each pixel position in each wide-angle image is a relative direction.
In another implementation (hereinafter referred to as the second implementation), when the shared viewing device is mounted on a mobile device suitable for sightseeing, it further includes an attitude sensor and a direction correction unit (not shown in the figure). Upon recognizing the above acquisition instruction, the attitude sensor captures the attitude data of the shared viewing device in real time (since the image/video acquisition unit 1 is mounted on the shared viewing device, the attitude data of the shared viewing device corresponds to that of the image/video acquisition unit 1).
The attitude data of the shared viewing device includes a viewing-direction information sequence corresponding to the wide-angle image sequence acquired by the image/video acquisition unit 1. That is, each frame captured by the image/video acquisition unit 1 has one corresponding item of viewing-direction information in the sequence. Each item of viewing-direction information describes the overall orientation of the image/video acquisition unit 1; here it is assumed that this overall orientation coincides with the straight-ahead direction of the shared viewing device (e.g., straight ahead of the mobile device it is mounted on). Thus, if the mobile device is driving or flying due south when the image/video acquisition unit 1 captures a frame (storing it into the wide-angle image sequence), the viewing-direction information corresponding to that frame (captured by the attitude sensor and stored into the viewing-direction information sequence) is due south, and so on.
For example, assume that the wide-angle image sequence acquired by the image/video acquisition unit 1 is denoted by {I_1, I_2, I_3, ..., I_M}, where M is the number of wide-angle images in the sequence and is a positive integer; and assume that the viewing-direction information sequence acquired by the attitude sensor (corresponding to the above wide-angle image sequence) is denoted by {D_1, D_2, D_3, ..., D_M}. In this example, the wide-angle image I_1 corresponds to the viewing-direction information D_1 (that is, D_1 indicates the overall orientation of the image/video acquisition unit 1 at the time it captured I_1), the wide-angle image I_2 corresponds to D_2 (that is, D_2 indicates the overall orientation at the time I_2 was captured), and so on.
In this way, the direction correction unit may perform direction correction on the image/video data captured by the image/video capture unit 1 using the attitude data of the shared viewing device so that the direction information of each pixel position in each wide-angle image in the above-described image/video data is corrected from the relative direction to the absolute direction.
Take the wide-angle image I_1 in the above example. Before direction correction, the direction information corresponding to each pixel position in I_1 is a relative direction. The corresponding viewing-direction information D_1 gives the overall orientation of the image/video acquisition unit 1 at the time it captured I_1, from which the absolute direction corresponding to each pixel position in I_1 can be computed. Replacing the relative direction of each pixel position with the corresponding absolute direction completes the direction correction. The processing of the other wide-angle images is similar and is not described again.
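A minimal sketch of this per-frame correction, considering only the horizontal (yaw) component and treating the viewing direction D as a compass heading, might look like the following; a full implementation would apply the complete 3-D attitude (roll, pitch, and yaw), and the representation of directions as azimuth lists is an assumption for illustration:

```python
def correct_direction(relative_azimuths, viewing_direction_deg):
    """Direction-correct one wide-angle image frame.

    relative_azimuths: per-pixel azimuths relative to the acquisition
    unit's optical axis, in degrees. viewing_direction_deg: the attitude
    sensor's viewing direction for this frame, as a compass heading
    (0 = north, 90 = east). Returns absolute compass azimuths.

    Only the horizontal (yaw) component is corrected here; roll and
    pitch are ignored in this sketch.
    """
    return [(viewing_direction_deg + a) % 360.0 for a in relative_azimuths]
```

For I_1 with viewing direction D_1 due south (heading 180), a pixel whose relative azimuth is 0 (straight ahead) is corrected to the absolute direction 180, and a pixel 30 degrees to the right is corrected to 210.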
Thus, the direction correction unit saves the corrected image/video data in the storage unit 2. That is, in this implementation (i.e., the second implementation), the direction information corresponding to each pixel position in each wide-angle image stored in the storage unit 2 is an absolute direction.
The absolute direction of each pixel position is a direction relative to the Earth's absolute coordinate system, whose coordinate axes are invariant; the relative direction of each pixel position is, as described above, a direction relative to the orientation of the image/video acquisition unit 1, and that orientation changes.
Furthermore, the processing unit 3 may receive at least one transmission instruction from one or more user devices via the wireless transmission unit 4. A transmission instruction sent by a user device includes the sending gesture of that device, i.e., the attitude of the user device at the moment it sends the instruction; this may be, for example, the screen normal direction of the device's display module or the normal direction of the back of the screen.
For each of the at least one transmission instruction, the processing unit 3 obtains the image frame sequence matched with the sending posture contained in the transmission instruction from the storage unit 2 to send to the user equipment sending the transmission instruction. The user equipment may be a terminal device such as a smartphone, a tablet computer, or a head-mounted VR (virtual reality) system.
When the shared viewing device is mounted on a manned vehicle such as a cable car, sightseeing car, train, airplane, or ship, one or more users ride the vehicle carrying their respective user devices. When it is mounted on other sightseeing mobile equipment such as an unmanned aerial vehicle or a mobile robotic arm, that equipment is located at the sightseeing site, and one or more users carrying their user devices are also at the site. Likewise, when the shared viewing device is at a fixed location such as a tourist site, one or more users carrying their user devices are also at that site.
According to one implementation, the processing unit 3 may retrieve from the storage unit 2 a sequence of image frames matching the sending pose contained in the transmission instruction by steps 101 to 103 as will be described below.
Step 101 is performed first. In step 101, it is determined whether the received transmission instruction includes a sending time: 1) if it does, the wide-angle images whose acquisition time is at or after the sending time are selected as candidate images from the image/video data stored in the storage unit 2, until a new transmission instruction is received from the user device that issued the instruction; 2) if it does not, the last currently stored wide-angle image frame and the frames to be stored thereafter are selected as candidate images, again until a new transmission instruction is received from the same user device. The sending time is the time at which the corresponding user device sent the transmission instruction.
It should be noted that the frames "whose acquisition time is at or after the sending time" in case 1) may number one or more, or even zero (for example, if the new transmission instruction arrives immediately after a single wide-angle image whose acquisition time equals the sending time has been selected). Likewise, the "frames to be stored thereafter" in case 2) may number one or more, or zero (for example, if the new transmission instruction arrives immediately after the last currently stored wide-angle image has been selected).
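The candidate-selection rule of step 101 can be sketched offline as follows. The tuple layout of the stored data is an assumption, and the continuing accumulation of later-stored frames until a new transmission instruction arrives is reduced here to returning the currently matching slice:

```python
def select_candidate_frames(stored_frames, send_time=None):
    """Step 101: choose candidate wide-angle frames for one
    transmission instruction.

    stored_frames: list of (acquisition_time, frame) tuples held in the
    storage unit, in chronological order. send_time: the instruction's
    sending time, or None if the instruction carries none.

    With a send_time, candidates are the frames acquired at or after it;
    without one, candidates start from the last currently stored frame.
    In the device, frames stored later keep being appended until a new
    transmission instruction arrives from the same user device; this
    offline sketch just returns the current slice.
    """
    if not stored_frames:
        return []
    if send_time is not None:
        return [(t, f) for (t, f) in stored_frames if t >= send_time]
    # No sending time: start from the last currently stored frame
    # (later frames would be appended as they arrive).
    return [stored_frames[-1]]
```

Either branch can yield zero or more frames, matching the note above about the candidate set possibly being empty.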
Next, in step 102, for each frame among all the candidate images selected in step 101, a screenshot whose direction corresponds to the sending gesture in the transmission instruction is cut from the wide-angle image using the direction information corresponding to each of its pixel positions. The size of the screenshot is a preset value, which may be set empirically or according to a screen-size parameter included in the transmission instruction sent by the user device.
Then, in step 103, all the screenshots cut in step 102 are arranged in chronological order to form an image frame sequence; this is the image frame sequence matching the sending gesture contained in the transmission instruction.
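Steps 102 and 103 together can be sketched as below. Choosing the screenshot center as the pixel whose stored direction is closest to the sending gesture's azimuth, and cutting a plain rectangular crop, are illustrative assumptions; the patent only requires that the screenshot's direction correspond to the sending gesture and that its size be a preset value:

```python
def frames_matching_gesture(candidates, pixel_direction, send_azimuth_deg,
                            crop_w, crop_h):
    """Steps 102-103: from each candidate wide-angle image, cut the
    screenshot whose direction matches the sending gesture, then form
    the chronological image frame sequence.

    candidates: list of (acquisition_time, image) with image a 2-D list
    of pixel values. pixel_direction(u, v): azimuth in degrees of pixel
    (u, v), per the stored direction information (relative or absolute
    depending on the implementation). send_azimuth_deg: azimuth of the
    sending gesture (screen normal or screen-back normal).
    crop_w, crop_h: the preset screenshot size.
    """
    sequence = []
    for t, image in sorted(candidates, key=lambda c: c[0]):
        h, w = len(image), len(image[0])
        # Pixel whose stored direction is closest to the sending
        # gesture's azimuth becomes the screenshot center.
        cu, cv = min(
            ((u, v) for v in range(h) for u in range(w)),
            key=lambda p: abs((pixel_direction(p[0], p[1])
                               - send_azimuth_deg + 180) % 360 - 180),
        )
        # Clamp the crop window to the image bounds.
        u0 = max(0, min(w - crop_w, cu - crop_w // 2))
        v0 = max(0, min(h - crop_h, cv - crop_h // 2))
        crop = [row[u0:u0 + crop_w] for row in image[v0:v0 + crop_h]]
        sequence.append((t, crop))
    return sequence
```

With relative directions stored (first implementation) the crop tracks the vehicle's heading; with absolute directions (second implementation) it tracks a fixed compass bearing, exactly as the two behaviors are contrasted below.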
Note that in the first implementation, the direction information corresponding to each pixel position in each wide-angle image stored in the storage unit 2 is a relative direction. In this case, for a given wide-angle image, the "screenshot whose direction corresponds to the sending gesture" cut in step 102 is actually the screenshot whose relative direction corresponds to the sending gesture. The image or video ultimately seen on the user device then appears as if the user were sitting on the corresponding mobile device: even if the user device is held facing one fixed direction, the displayed image or video changes as the driving or flying direction of the mobile device changes.
Further, in the second implementation, the direction information corresponding to each pixel position in each wide-view-angle image of the image/video data stored in the storage unit 2 is an absolute direction. In this case, for a given wide-view-angle image, the "screenshot whose direction corresponds to the sending gesture in the transmission instruction" captured in step 102 is in fact the "screenshot whose absolute direction corresponds to the sending gesture in the transmission instruction". Therefore, the image or video finally seen on the user equipment corresponds to the absolute direction pointed to by the screen-back normal direction (or the screen normal direction) of the user equipment, and if the user equipment is kept facing one direction, the displayed image or video does not change as the driving or flying direction of the viewing-capable mobile device changes. In other words, after the direction correction performed by the direction correction unit in the second implementation, the viewing direction corresponding to each frame in the image frame sequence displayed on the screen of the display module of the user equipment coincides with the screen-back normal direction (or the screen normal direction) of the display module of the user equipment that issued the transmission instruction. The user can switch between the screen-back normal direction and the screen normal direction, thereby quickly switching between the two functions of shooting an external scene and taking a selfie. It should also be noted that, for one frame of image, the viewing direction may refer to the viewing direction of the center position of the image, that is, the direction toward which the center position was photographed.
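The relative-to-absolute correction performed by the direction correction unit can be sketched under the simplifying assumption, introduced here for illustration, that directions are horizontal bearings in degrees and that the attitude data reduces to a single device heading per frame:

```python
import numpy as np

def correct_direction_map(relative_map: np.ndarray,
                          heading_deg: float) -> np.ndarray:
    """Shift every per-pixel relative direction by the device heading
    taken from the attitude data, yielding absolute directions in
    [0, 360). With absolute directions, a fixed sending gesture always
    selects the same compass direction regardless of the mobile
    device's direction of travel."""
    return (relative_map + heading_deg) % 360.0
```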
According to another implementation, when receiving at least one transmission instruction from one or more user equipments, the processing unit 3 may further judge, for each of the at least one transmission instruction, whether the transmission instruction contains paid information; if the transmission instruction contains paid information, a data connection is established with the user equipment that issued the transmission instruction, and if it does not, no data connection is established with the corresponding user equipment. The paid information is obtained, for example, by an online payment method such as code-scanning payment.
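The paid-information check might look like the following sketch; the dictionary keys `paid_info` and `device_id` and the `connect` callback are hypothetical stand-ins for the actual instruction format and connection mechanism.

```python
def handle_transmission_instructions(instructions, connect):
    """For each received transmission instruction, establish a data
    connection only when it carries paid information (e.g. obtained via
    code-scanning payment); instructions without it are skipped."""
    connected = []
    for instr in instructions:
        if instr.get("paid_info"):
            connect(instr["device_id"])   # establish the data connection
            connected.append(instr["device_id"])
    return connected
```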
Furthermore, according to one implementation, the shared viewing device may further include a user management unit. The user management unit may receive an identity authentication request from each user equipment through the wireless transmission unit 4, authenticate the corresponding user equipment based on the identity authentication request, and, after successful authentication, send authentication success information to the corresponding user equipment so as to establish a data connection with that user equipment.
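The user management unit's handshake could be sketched as below; the token-based check, the `send` callback, and the message format are assumptions for illustration only, since the patent does not specify the authentication scheme.

```python
def handle_auth_request(request, registered_tokens, send):
    """Authenticate a user equipment from its identity authentication
    request; on success, send authentication-success information so
    that a data connection can then be established."""
    if request.get("token") in registered_tokens:
        send(request["device_id"], {"type": "auth_success"})
        return True
    return False
```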
By applying the above shared viewing device, a user can take photographs, shoot video, or take selfies at positions the user cannot reach, through user equipment such as a mobile phone, which greatly enhances the user experience; moreover, the device can serve multiple users simultaneously and is convenient to operate.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (6)

1. A shared viewing device, which is arranged on a mobile device capable of viewing or at a fixed position of a viewing area, and comprises an image/video acquisition unit, a storage unit, a processing unit and a wireless transmission unit;
the processing unit is configured to, after recognizing a collection instruction, start collecting image/video data within the field of view of the image/video acquisition unit through the image/video acquisition unit, and to store the collected image/video data in the storage unit, wherein the image/video data comprises a wide-view-angle image sequence consisting of one or more wide-view-angle images;
the processing unit is further configured to receive at least one transmission instruction from one or more user equipments through the wireless transmission unit and, for each transmission instruction, to acquire from the storage unit an image frame sequence matching the sending gesture contained in the transmission instruction, so as to send the image frame sequence to the user equipment that sent the transmission instruction;
the shared viewing device further comprises an attitude sensor and a direction correction unit; the attitude sensor is configured to collect attitude data of the shared viewing device after the collection instruction is recognized, wherein the attitude data comprises a viewing direction information sequence corresponding to the wide-view-angle image sequence; the direction correction unit is configured to perform direction correction on the image/video data by using the attitude data of the shared viewing device, so as to correct the direction information of each pixel position in each wide-view-angle image of the image/video data from a relative direction to an absolute direction;
the sending gesture comprises the screen normal direction or the screen-back normal direction of a display module in the corresponding user equipment;
the processing unit acquires the image frame sequence matching the sending gesture contained in the transmission instruction from the storage unit in the following manner:
judging whether the received transmission instruction contains a sending time:
in the case that the transmission instruction contains a sending time, selecting, from the image/video data stored in the storage unit, the multiple frames of wide-view-angle images whose collection times are at and after the sending time as candidate images, until a new transmission instruction sent again by the user equipment that sent the transmission instruction is received;
in the case that the transmission instruction does not contain a sending time, selecting, from the image/video data stored in the storage unit, the last frame of wide-view-angle image currently stored and the multiple frames of wide-view-angle images to be stored thereafter as candidate images, until a new transmission instruction sent again by the user equipment that sent the transmission instruction is received;
for each frame of wide-view-angle image among all the selected candidate images, capturing from the wide-view-angle image, by using the direction information corresponding to each pixel position of the wide-view-angle image, a screenshot whose direction corresponds to the sending gesture, wherein the size of the screenshot is a preset value;
and forming an image frame sequence in chronological order from all the captured screenshots, as the image frame sequence matching the sending gesture contained in the transmission instruction.
2. The shared viewing device of claim 1, further comprising a user management unit;
the user management unit is configured to receive an identity authentication request from each user equipment, authenticate the corresponding user equipment based on the identity authentication request, and, after successful authentication, send authentication success information to the corresponding user equipment so as to establish a data connection with the user equipment.
3. The shared viewing device of claim 1 or 2, wherein the image/video capture unit comprises a wide-angle camera module.
4. The shared viewing device of claim 1 or 2, wherein the processing unit is further configured to:
when the at least one transmission instruction from the one or more user equipments is received, judge, for each of the at least one transmission instruction, whether the transmission instruction contains paid information, and establish a data connection with the user equipment that sent the transmission instruction in the case that the transmission instruction contains paid information.
5. The shared viewing device of claim 4, wherein the paid information is obtained by online payment.
6. The shared viewing device of claim 1 or 2, wherein the mobile device capable of viewing is:
an unmanned aerial vehicle, a mobile robotic arm, a cable car, a sightseeing vehicle, a train, an airplane or a ship.
CN201710017244.4A 2017-01-10 2017-01-10 Shared viewing device Active CN106657792B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710017244.4A CN106657792B (en) 2017-01-10 2017-01-10 Shared viewing device

Publications (2)

Publication Number Publication Date
CN106657792A CN106657792A (en) 2017-05-10
CN106657792B true CN106657792B (en) 2020-02-18

Family

ID=58843862

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710017244.4A Active CN106657792B (en) 2017-01-10 2017-01-10 Shared viewing device

Country Status (1)

Country Link
CN (1) CN106657792B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107105168A (en) * 2017-06-02 2017-08-29 哈尔滨市舍科技有限公司 Can virtual photograph shared viewing system
CN107241613A (en) * 2017-07-24 2017-10-10 哈尔滨市舍科技有限公司 Stadiums game situation multi-angle live broadcast system
CN107197165A (en) * 2017-07-28 2017-09-22 哈尔滨市舍科技有限公司 Unmanned plane self-heterodyne system and method
CN111164958A (en) * 2017-09-29 2020-05-15 深圳市大疆创新科技有限公司 System and method for processing and displaying image data based on pose information
CN108924590B (en) * 2018-06-27 2021-09-03 青岛一舍科技有限公司 Video playing and photographing system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101753813A (en) * 2008-12-17 2010-06-23 索尼株式会社 Imaging apparatus, imaging method, and program
CN101937535A (en) * 2010-09-21 2011-01-05 上海杰图度假网络科技有限公司 Panoramic electronic map-based virtual tourism platform system
CN102256154A (en) * 2011-07-28 2011-11-23 中国科学院自动化研究所 Method and system for positioning and playing three-dimensional panoramic video
CN103543831A (en) * 2013-10-25 2014-01-29 梁权富 Head-mounted panoramic player
CN205345350U (en) * 2016-01-26 2016-06-29 高士雅 Outer view system of aircraft passenger compartment
CN106095774A (en) * 2016-05-25 2016-11-09 深圳市创驰蓝天科技发展有限公司 A kind of unmanned plane image panorama methods of exhibiting
CN106162204A (en) * 2016-07-06 2016-11-23 传线网络科技(上海)有限公司 Panoramic video generation, player method, Apparatus and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130053535A (en) * 2011-11-14 2013-05-24 한국과학기술연구원 The method and apparatus for providing an augmented reality tour inside a building platform service using wireless communication device

Similar Documents

Publication Publication Date Title
CN106550182B (en) Shared unmanned aerial vehicle viewing system
CN106657923B (en) Scene switching type shared viewing system based on position
CN106657792B (en) Shared viewing device
CN108495048B (en) Double-camera image acquisition equipment based on holder control
JP5865388B2 (en) Image generating apparatus and image generating method
CN106412439B (en) Image acquisition equipment, image acquisition method and controller
WO2013069047A1 (en) Image generation device, and image generation method
WO2013069049A1 (en) Image generation device, and image generation method
CN108650522B (en) Live broadcast system capable of instantly obtaining high-definition photos based on automatic control
CN108650494B (en) Live broadcast system capable of instantly obtaining high-definition photos based on voice control
CN108924590B (en) Video playing and photographing system
CN115103166A (en) Video processing method and terminal equipment
CN107197165A (en) Unmanned plane self-heterodyne system and method
CN106657922B (en) Scene switching type shared image processing system based on position
CN106357966A (en) Panoramic image photographing device and panoramic image acquiring method
US20240087157A1 (en) Image processing method, recording medium, image processing apparatus, and image processing system
CN108696724B (en) Live broadcast system capable of instantly obtaining high-definition photos
CN113240615A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN107925740A (en) Image management system, image management method and program
JP2017212510A (en) Image management device, program, image management system, and information terminal
CN110291777B (en) Image acquisition method, device and machine-readable storage medium
CN113450254B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN111061123B (en) Rotary panoramic imaging system for tourist landscape display and use method
CN111242107B (en) Method and electronic device for setting virtual object in space
JP7225016B2 (en) AR Spatial Image Projection System, AR Spatial Image Projection Method, and User Terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 150016 Heilongjiang Province, Harbin Economic Development Zone haping Road District Dalian road and Xingkai road junction

Applicant after: HARBIN YISHE TECHNOLOGY Co.,Ltd.

Address before: 150016 Heilongjiang City, Harbin province Daoli District, quiet street, unit 54, unit 2, layer 4, No. 3

Applicant before: HARBIN YISHE TECHNOLOGY Co.,Ltd.

GR01 Patent grant
CP03 Change of name, title or address

Address after: 266100 Block C 200-43, Chuangke Street, Qingdao, 306 Ningxia Road, Laoshan District, Qingdao City, Shandong Province

Patentee after: QINGDAO YISPACE TECHNOLOGY Co.,Ltd.

Address before: 150016 Heilongjiang Province, Harbin Economic Development Zone haping Road District Dalian road and Xingkai road junction

Patentee before: HARBIN YISHE TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20240805

Address after: 150001 No. 92 West straight street, Nangang District, Heilongjiang, Harbin

Patentee after: HARBIN INSTITUTE OF TECHNOLOGY

Country or region after: China

Address before: Room 200-43, block C, Qingdao maker street, 306 Ningxia road, Laoshan District, Qingdao City, Shandong Province 266100

Patentee before: QINGDAO YISPACE TECHNOLOGY Co.,Ltd.

Country or region before: China
