CN114205669B - Free view video playing method and device and electronic equipment - Google Patents
Free view video playing method and device and electronic equipment
- Publication number
- CN114205669B CN202111611869.6A CN202111611869A
- Authority
- CN
- China
- Prior art keywords
- determining
- playing
- video
- screen
- angle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 48
- 210000005252 bulbus oculi Anatomy 0.000 claims abstract description 18
- 230000000007 visual effect Effects 0.000 claims abstract description 6
- 238000004891 communication Methods 0.000 claims description 16
- 238000004519 manufacturing process Methods 0.000 claims description 6
- 230000004424 eye movement Effects 0.000 claims description 5
- 238000012545 processing Methods 0.000 abstract description 3
- 238000004590 computer program Methods 0.000 description 5
- 238000010586 diagram Methods 0.000 description 5
- 210000001508 eye Anatomy 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 2
- 238000010276 construction Methods 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 230000001360 synchronised effect Effects 0.000 description 1
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/4728—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/65—Transmission of management data between client and server
- H04N21/658—Transmission by the client directed to the server
- H04N21/6587—Control parameters, e.g. trick play commands, viewpoint selection
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Databases & Information Systems (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The embodiments of the invention relate to the technical field of data processing, and disclose a free-view video playing method and device and an electronic device. The method comprises the following steps: while a user watches a free-view video, tracking the user's eye movements to determine the position on the playing screen where the user's attention is focused on the free-view video; and switching the playing view angle of the free-view video according to that focus position. In this way, the embodiments of the invention improve the efficiency of view-angle switching and the user's viewing experience.
Description
Technical Field
The embodiments of the invention relate to the technical field of data processing, and in particular to a free-view video playing method and device and an electronic device.
Background
With the continuous development of network technology, video viewing volume keeps growing. Compared with an ordinary single-view video, a free-view video allows a user to watch the same content from a number of different viewing angles, which improves the viewing experience.
In the related art, if a user wants to change the viewing angle while watching a free-view video, the playing device must be operated manually. The playing device determines a new playing view angle from the user's manual operation signal and plays the free-view video from that new view angle. However, the inventors found, when implementing the embodiments of the present invention, that this view-switching process is cumbersome: it distracts the user's attention and degrades the viewing experience.
Disclosure of Invention
In view of the above problems, embodiments of the present invention provide a free-view video playing method, a playing device, and an electronic device, which are intended to solve the problem in the prior art that the view-angle switching process is cumbersome.
According to one aspect of the embodiments of the present invention, there is provided a free-view video playing method, the method comprising:
while a user watches a free-view video, tracking the user's eye movements to determine the focus position of the user on the free-view video on a playing screen;
and switching the playing view angle of the free-view video according to the focus position.
In an optional manner, the focus position comprises the user's gaze point on the playing screen, and switching the playing view angle of the free-view video according to the focus position comprises:
determining, according to the gaze point, a trigger point that triggers the view-angle switching, and determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen;
determining, according to the offset direction, the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position;
and determining a view-angle switching angle according to the sum of adjacent camera position coverage angle differences and the offset ratio, and switching the playing view angle of the free-view video according to the view-angle switching angle.
In an optional manner, determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen comprises:
determining the screen position of the trigger point, and determining the offset direction of the trigger point relative to the center point of the playing screen according to the positional relationship between that screen position and the screen position of the center point of the playing screen;
determining the playing screen boundary point corresponding to the offset direction;
and determining the ratio of the coordinates of the trigger point and of the playing screen boundary point on a preset screen coordinate axis as the offset ratio of the trigger point relative to the center point of the playing screen.
In an optional manner, before determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen, the method comprises:
determining whether the trigger point is located in a preset trigger area;
and when the trigger point is located in the preset trigger area, executing the step of determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen.
In an optional manner, determining the view-angle switching angle according to the sum of adjacent camera position coverage angle differences and the offset ratio comprises:
calculating the product of the sum of adjacent camera position coverage angle differences and the offset ratio;
and determining the product as the view-angle switching angle.
In an optional manner, before tracking the user's eye movements to determine the focus position of the user on the free-view video on the playing screen while the user watches the free-view video, the method further comprises:
acquiring, from a video cloud, the free-view video to be played and the camera position information corresponding to the free-view video to be played;
determining the making host position of the free-view video to be played according to the camera position information;
and determining the making host position as the playing host position for playing the free-view video to be played, and playing the free-view video to be played from the view angle of the playing host position.
In an optional manner, determining, according to the offset direction, the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position comprises:
determining the total number of camera positions and all adjacent camera position coverage angle differences according to the camera position information;
determining the target boundary camera position according to the offset direction and the total number of camera positions;
and determining the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position according to the target boundary camera position and the adjacent camera position coverage angle differences.
According to another aspect of the embodiments of the present invention, there is provided a free-view video playing device, the device comprising:
a determining module, configured to track the user's eye movements while the user watches the free-view video, so as to determine the focus position of the user on the free-view video on the playing screen;
and a switching module, configured to switch the playing view angle of the free-view video according to the focus position.
According to another aspect of the embodiments of the present invention, there is provided an electronic device comprising: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with one another through the communication bus;
the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations of the above free-view video playing method.
According to still another aspect of the embodiments of the present invention, there is provided a computer-readable storage medium having stored therein at least one executable instruction which, when run on an electronic device, causes the electronic device to perform the operations of the above free-view video playing method.
In the embodiments of the invention, while a user watches a free-view video, the projection point of the user's eyes on the playing screen changes as the area of the video the user mainly attends to changes. By tracking the user's eye movements, the user's focus position on the free-view video can therefore be determined on the playing screen, and the playing view angle of the free-view video can be switched according to that focus position. The embodiments of the invention can thus intelligently identify the area of the free-view video the user mainly attends to and switch the playing view angle accordingly, so that the view angle is switched without any manual operation by the user. This greatly simplifies the view-switching process of the free-view video and improves the user's viewing experience.
The foregoing is only an overview of the technical solutions of the embodiments of the present invention. In order that the technical means of the embodiments may be understood more clearly and implemented according to the contents of the specification, specific embodiments of the present invention are described below.
Drawings
The drawings are only for purposes of illustrating embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
fig. 1 is a flow chart illustrating a method for playing a free view video according to an embodiment of the present invention;
FIG. 2 shows a schematic view of a camera position arrangement of a camera provided by an embodiment of the present invention;
FIG. 3 illustrates a schematic view of a user's gaze point route provided by an embodiment of the present invention;
FIG. 4 is a schematic view of a display screen according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a playback device for a video with a free view angle according to an embodiment of the present invention;
fig. 6 shows a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein.
Fig. 1 is a flowchart of a free-view video playing method according to an embodiment of the present invention, the method being performed by an electronic device. The memory of the electronic device stores at least one executable instruction, and the executable instruction causes the processor of the electronic device to perform the operations of the free-view video playing method.
As shown in fig. 1, the method comprises the steps of:
step 110: when a user watches the freeview video, eye movements of the user are tracked to determine the focus position of the user on the freeview video on a playing screen.
A free-view video is a video that can be watched from multiple viewing angles. To produce a free-view video, a certain number of cameras must be deployed at the shooting site to capture video, and the video captured by each camera is then synchronized, aligned, stitched and transcoded. Before capture, the total number of cameras is determined according to the shooting site and the camera position arrangement, and the shooting focus is determined according to the video content. For example, when the video content is a basketball game, the shooting focus may be the basketball. Each camera position can then be numbered in a fixed order, and the current making host position, the total coverage angle of all camera positions, and the coverage angle difference of each pair of adjacent camera positions can be calculated in real time from the arrangement of the camera positions and their distances from the shooting focus. For example, after numbering, the camera positions are C_1, C_2, ..., C_N; the making host position calculated in real time is C_m (C_m being one of C_1 to C_N); the total coverage angle of all camera positions is PVT; and the coverage angle difference of each pair of adjacent camera positions is PV_(i,j), where i and j denote two adjacent camera positions.
Fig. 2 shows a schematic diagram of a camera position arrangement according to an embodiment of the present invention. As shown in Fig. 2, the total number of camera positions is N = 17, an adjacent coverage angle difference PV_(i,j) is shown for i = 8, j = 9, and the making host position is C_9. Further, all adjacent camera position coverage angle differences can be expressed as the following array.
PV_(1,N-1) = [PV_(1,2), PV_(2,3), ..., PV_(N-1,N)]
The total coverage angle PVT of all camera positions is the sum of all adjacent coverage angle differences:

PVT = PV_(1,2) + PV_(2,3) + ... + PV_(N-1,N)
After each camera at the shooting site (acquisition end) completes video capture, the video stream captured at each camera position and the camera position information can be transmitted to a video cloud with low latency. The camera position information CIT at the current time T may comprise the total number of camera positions N, the current making host position C_m, the current total coverage angle PVT of all camera positions, and the coverage angle difference of each pair of adjacent camera positions, as follows:
CIT = CI(N, C_m, PVT, [PV_(1,2), PV_(2,3), ..., PV_(N-1,N)])
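As an illustration only, the following Python sketch shows one way the camera position information CIT could be represented and how PVT follows from the adjacent coverage angle differences. The class name, field names and the example angle values are assumptions for illustration, not part of the patent.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class CameraPositionInfo:
    """Camera position information CIT at time T (illustrative structure)."""
    total_positions: int               # N, total number of camera positions
    host_position: int                 # m, 1-based index of the making host position C_m
    adjacent_angle_diffs: List[float]  # [PV_(1,2), PV_(2,3), ..., PV_(N-1,N)] in degrees

    @property
    def total_coverage_angle(self) -> float:
        """PVT: sum of all adjacent camera position coverage angle differences."""
        return sum(self.adjacent_angle_diffs)


# Example loosely modelled on Fig. 2: N = 17 positions, host position C_9,
# with made-up equal 5-degree gaps between adjacent positions.
cit = CameraPositionInfo(total_positions=17, host_position=9,
                         adjacent_angle_diffs=[5.0] * 16)
print(cit.total_coverage_angle)  # PVT = 80.0
```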
After receiving the video stream and the camera position information returned from the shooting site, the video cloud can synchronize, align, stitch and transcode all the camera position video streams based on video frame timestamps from an NTP (Network Time Protocol) clock, and send the transcoded video stream and the camera position information to the terminal for playing. After receiving the video stream and the camera position information from the video cloud, the terminal can play the video stream according to the camera position information. When doing so, the playing host position defaults to the making host position C_m, and playback starts from the view angle of C_m.
While the user watches the free-view video, the projection point of the user's eyes on the playing screen changes as the area of the video the user mainly attends to changes. Based on tracking of the user's eye movements, the user's focus position on the free-view video can therefore be determined on the playing screen. The focus position may include, for example, the user's gaze point on the playing screen. Furthermore, the playing screen of the playing device may be gridded so that each point on the screen corresponds to a unique screen coordinate.
Step 120: the playing view angle of the free-view video is switched according to the focus position.
According to the user's focus position on the free-view video, the current optimal playing view angle can be determined in real time, and the playing view angle of the free-view video is switched to it. Specifically, when the playing view angle is switched according to the focus position, a trigger point that triggers the view-angle switching can be determined from the user's gaze point on the playing screen. For example, the user's gaze points on the playing screen may be extracted using eye-tracking technology. The basic unit of measurement is the sampling point captured by the eye tracker: a point at which the gaze dwells for longer than a first preset duration is determined to be a gaze point, i.e., a landing point where the eye locks onto the screen, and a gaze point whose duration also exceeds a second preset duration is determined to be the trigger point.
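As a sketch of this two-threshold rule, the snippet below classifies eye-tracker samples into gaze points and trigger points by dwell time. The function name, data layout and threshold values are illustrative assumptions; the patent only requires the two preset durations.

```python
from typing import List, Tuple

Point = Tuple[float, float]   # screen coordinates (x, y)
Sample = Tuple[Point, float]  # (point, dwell duration in seconds)


def classify_points(samples: List[Sample],
                    first_duration: float,
                    second_duration: float) -> Tuple[List[Point], List[Point]]:
    """Return (gaze points, trigger points) from raw eye-tracker samples.

    A sampling point whose dwell time exceeds the first preset duration is a
    gaze point; a gaze point whose dwell time also exceeds the second preset
    duration (second_duration > first_duration) is a trigger point.
    """
    gaze_points = [p for p, dwell in samples if dwell > first_duration]
    trigger_points = [p for p, dwell in samples if dwell > second_duration]
    return gaze_points, trigger_points


# Example mirroring Fig. 3: five gaze points, of which only the first and the
# last dwell long enough to act as trigger points.
samples = [((-0.8, 0.1), 0.9), ((-0.3, 0.2), 0.3), ((0.0, 0.0), 0.3),
           ((0.4, -0.1), 0.3), ((0.8, 0.1), 1.0)]
gaze_points, trigger_points = classify_points(samples,
                                              first_duration=0.2,
                                              second_duration=0.8)
```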
Fig. 3 shows a schematic diagram of a user's gaze point route according to an embodiment of the present invention. As shown in Fig. 3, the route contains five points, (1) to (5), all of which are gaze points; the durations of (1) and (5) exceed the second preset duration, so they serve as trigger points.
After the trigger point that triggers the view-angle switching is determined, the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen are determined; the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position is determined according to the offset direction; the view-angle switching angle is determined according to that sum and the offset ratio; and the playing view angle of the free-view video is switched according to the view-angle switching angle. Before the offset direction and offset ratio are determined, it can first be checked whether the trigger point lies in a preset trigger area, and the offset direction and offset ratio are only determined when it does. Further, according to users' free-view viewing habits, a heat map area G may be defined on the playing screen, and the area of the playing screen outside G may be set as the preset trigger area. The heat map area G is usually a region near the center of the playing screen; a trigger point inside this region does not trigger a view-angle switch of the free-view video.
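A minimal sketch of this gating step, assuming a rectangular heat map area G centred on the playing screen (the patent does not prescribe the shape of G, and the function and parameter names are illustrative):

```python
def in_preset_trigger_area(trigger_point, heat_map_half_width, heat_map_half_height):
    """Return True when the trigger point lies outside the heat map area G,
    i.e. inside the preset trigger area, so a view-angle switch may proceed.

    The screen centre is (0, 0); the rectangle half-sizes are assumed parameters.
    """
    x, y = trigger_point
    inside_heat_map = abs(x) <= heat_map_half_width and abs(y) <= heat_map_half_height
    return not inside_heat_map


# A point near the screen centre does not trigger switching; a point near the
# left edge does.
print(in_preset_trigger_area((0.05, 0.02), 0.2, 0.2))   # False
print(in_preset_trigger_area((-0.85, 0.10), 0.2, 0.2))  # True
```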
When determining the offset direction and offset ratio of the trigger point relative to the center point of the playing screen, the screen position of the trigger point is determined first, and the offset direction is determined from the positional relationship between the trigger point's screen position and the screen position of the center point of the playing screen. The playing screen boundary point corresponding to the offset direction is then determined, and the ratio of the coordinates of the trigger point and of that boundary point on a preset screen coordinate axis is taken as the offset ratio of the trigger point relative to the center point of the playing screen. The preset screen coordinate axis can be the horizontal coordinate axis or the vertical coordinate axis.
For example, suppose the preset screen coordinate axis is the horizontal axis, the coordinates of the trigger point are C(n, m), where n is its horizontal coordinate and m its vertical coordinate, and the coordinates of the center point of the playing screen are (0, 0). If n < 0, the offset direction of the trigger point relative to the center point is D = L (Left); if n > 0, the offset direction is D = R (Right). L means the trigger point is offset to the left of the center point of the playing screen, and R means it is offset to the right. When the coordinate of the playing screen boundary point on the preset screen coordinate axis is determined, the horizontal coordinate of the left boundary point of the playing screen is used if the trigger point is offset to the left of the center point, and the horizontal coordinate of the right boundary point is used if the trigger point is offset to the right.
Fig. 4 shows a schematic diagram of a display screen according to an embodiment of the present invention. As shown in Fig. 4, the X axis is the horizontal coordinate axis, the Y axis is the vertical coordinate axis, x1 is the horizontal coordinate of the right boundary point of the playing screen, and x2 is that of the left boundary point. When the trigger point is offset to the left of the center point of the playing screen, the offset ratio S = n / x2; when the trigger point is offset to the right, the offset ratio S = n / x1.
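The following sketch computes the offset direction D and offset ratio S exactly as described above, using the horizontal axis of Fig. 4 as the preset screen coordinate axis; the function and argument names are assumptions for illustration.

```python
def offset_direction_and_ratio(trigger_point, x1, x2):
    """Offset direction D and offset ratio S of trigger point C(n, m) relative
    to the playing screen centre (0, 0).

    x1 is the horizontal coordinate of the right boundary point (x1 > 0) and
    x2 that of the left boundary point (x2 < 0), as labelled in Fig. 4.
    """
    n, _ = trigger_point
    if n < 0:
        return "L", n / x2  # offset to the left: S = n / x2 (both negative, so S > 0)
    return "R", n / x1      # offset to the right: S = n / x1


# Example on a screen whose boundaries are x2 = -960 and x1 = 960 pixels.
direction, ratio = offset_direction_and_ratio((-480, 120), x1=960, x2=-960)
print(direction, ratio)  # L 0.5
```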
Before the user's eye movements are tracked to determine the focus position while the user watches the free-view video, the free-view video to be played and the camera position information corresponding to it are acquired from the video cloud, the making host position of the video is determined from the camera position information, the making host position is taken as the playing host position, and the video is played from the view angle of the playing host position. When the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position is determined according to the offset direction, the total number of camera positions and all adjacent coverage angle differences can be obtained from the camera position information, the target boundary camera position is determined according to the offset direction and the total number of camera positions, and the sum of adjacent coverage angle differences between the target boundary camera position and the playing host position is then computed from them.
For example, when the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position is determined according to the offset direction: if the trigger point is offset to the left of the center point of the playing screen, the target boundary camera position is C_1, and the sum of adjacent coverage angle differences between the target boundary camera position and the playing host position can be expressed as PVL = PV_(1,2) + PV_(2,3) + ... + PV_(m-1,m); if the trigger point is offset to the right of the center point of the playing screen, the target boundary camera position is C_N, and the sum can be expressed as PVR = PV_(m,m+1) + PV_(m+1,m+2) + ... + PV_(N-1,N). When the view-angle switching angle is determined from this sum and the offset ratio, their product is calculated and taken as the view-angle switching angle: if the trigger point is offset to the left of the center point, the switching angle SA = PVL × S; if it is offset to the right, SA = PVR × S. The terminal playing device can then switch smoothly to the new playing host position based on the view-angle switching angle SA.
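Putting the pieces together, the sketch below derives PVL or PVR and the view-angle switching angle SA from the offset direction and offset ratio. The indexing conventions (1-based host position m, list index i holding PV_(i+1,i+2)) are assumptions for illustration only.

```python
def view_switch_angle(direction, ratio, adjacent_angle_diffs, host_position):
    """Compute the view-angle switching angle SA.

    adjacent_angle_diffs[i] holds PV_(i+1, i+2); host_position is m, the
    1-based index of the playing host position C_m.
    """
    m = host_position
    if direction == "L":
        # Target boundary position C_1: PVL = PV_(1,2) + ... + PV_(m-1,m)
        pvl = sum(adjacent_angle_diffs[:m - 1])
        return pvl * ratio  # SA = PVL * S
    # Target boundary position C_N: PVR = PV_(m,m+1) + ... + PV_(N-1,N)
    pvr = sum(adjacent_angle_diffs[m - 1:])
    return pvr * ratio      # SA = PVR * S


# Continuing the Fig. 2 style example: 17 positions, host C_9, 5-degree gaps,
# trigger point offset halfway towards the left boundary (S = 0.5).
sa = view_switch_angle("L", 0.5, [5.0] * 16, host_position=9)
print(sa)  # 20.0 degrees towards C_1
```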
In the embodiments of the invention, while a user watches a free-view video, the projection point of the user's eyes on the playing screen changes as the area of the video the user mainly attends to changes. By tracking the user's eye movements, the user's focus position on the free-view video can therefore be determined on the playing screen, and the playing view angle of the free-view video can be switched according to that focus position. The embodiments of the invention can thus intelligently identify the area of the free-view video the user mainly attends to and switch the playing view angle accordingly, so that the view angle is switched without any manual operation by the user. This greatly simplifies the view-switching process of the free-view video and improves the user's viewing experience.
Fig. 5 is a schematic structural diagram of a free-view video playing device according to an embodiment of the present invention. As shown in Fig. 5, the device 300 includes a determining module 310 and a switching module 320.
The determining module 310 is configured to track the user's eye movements while the user watches the free-view video, so as to determine the focus position of the user on the free-view video on the playing screen; the switching module 320 is configured to switch the playing view angle of the free-view video according to the focus position.
In an optional manner, the focus position includes the user's gaze point on the playing screen, and the switching module 320 is configured to:
determine, according to the gaze point, the trigger point that triggers the view-angle switching, and determine the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen;
determine, according to the offset direction, the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position;
and determine the view-angle switching angle according to the sum of adjacent camera position coverage angle differences and the offset ratio, and switch the playing view angle of the free-view video according to the view-angle switching angle.
In an optional manner, the switching module 320 is configured to:
determine the screen position of the trigger point, and determine the offset direction of the trigger point relative to the center point of the playing screen according to the positional relationship between that screen position and the screen position of the center point of the playing screen;
determine the playing screen boundary point corresponding to the offset direction;
and determine the ratio of the coordinates of the trigger point and of the playing screen boundary point on a preset screen coordinate axis as the offset ratio of the trigger point relative to the center point of the playing screen.
In an optional manner, the switching module 320 is configured to:
determine, before the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen are determined, whether the trigger point is located in a preset trigger area;
and determine the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen when the trigger point is located in the preset trigger area.
In an optional manner, the switching module 320 is configured to:
calculate the product of the sum of adjacent camera position coverage angle differences and the offset ratio;
and determine the product as the view-angle switching angle.
In an optional manner, the switching module 320 is configured to:
acquire, from the video cloud, the free-view video to be played and the camera position information corresponding to the free-view video to be played, before the user's eye movements are tracked to determine the focus position of the user on the free-view video on the playing screen while the user watches the free-view video;
determine the making host position of the free-view video to be played according to the camera position information;
and determine the making host position as the playing host position for playing the free-view video to be played, and play the free-view video to be played from the view angle of the playing host position.
In an optional manner, the switching module 320 is configured to:
determine the total number of camera positions and all adjacent camera position coverage angle differences according to the camera position information;
determine the target boundary camera position according to the offset direction and the total number of camera positions;
and determine the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position according to the target boundary camera position and the adjacent camera position coverage angle differences.
In the embodiments of the invention, while a user watches a free-view video, the projection point of the user's eyes on the playing screen changes as the area of the video the user mainly attends to changes. By tracking the user's eye movements, the user's focus position on the free-view video can therefore be determined on the playing screen, and the playing view angle of the free-view video can be switched according to that focus position. The embodiments of the invention can thus intelligently identify the area of the free-view video the user mainly attends to and switch the playing view angle accordingly, so that the view angle is switched without any manual operation by the user. This greatly simplifies the view-switching process of the free-view video and improves the user's viewing experience.
Fig. 6 shows a schematic structural diagram of an electronic device according to an embodiment of the present invention; the embodiments of the present invention do not limit the specific implementation of the electronic device.
As shown in fig. 6, the electronic device may include: a processor 402, a communication interface (Communications Interface) 404, a memory 406, and a communication bus 408.
The processor 402, the communication interface 404 and the memory 406 communicate with one another via the communication bus 408. The communication interface 404 is used for communicating with network elements of other devices, such as clients or other servers. The processor 402 is configured to execute the program 410, and may specifically perform the relevant steps in the above embodiments of the free-view video playing method.
In particular, program 410 may include program code including computer-executable instructions.
The processor 402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement the embodiments of the present invention. The one or more processors included in the electronic device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
The memory 406 is used for storing the program 410. The memory 406 may comprise high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory.
The program 410 may specifically be invoked by the processor 402 to cause the electronic device to:
track, while the user watches the free-view video, the user's eye movements to determine the focus position of the user on the free-view video on the playing screen;
and switch the playing view angle of the free-view video according to the focus position.
In an optional manner, the focus position includes the user's gaze point on the playing screen, and the program 410 is invoked by the processor 402 to cause the electronic device to:
determine, according to the gaze point, the trigger point that triggers the view-angle switching, and determine the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen;
determine, according to the offset direction, the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position;
and determine the view-angle switching angle according to the sum of adjacent camera position coverage angle differences and the offset ratio, and switch the playing view angle of the free-view video according to the view-angle switching angle.
In an optional manner, the program 410 is invoked by the processor 402 to cause the electronic device to:
determine the screen position of the trigger point, and determine the offset direction of the trigger point relative to the center point of the playing screen according to the positional relationship between that screen position and the screen position of the center point of the playing screen;
determine the playing screen boundary point corresponding to the offset direction;
and determine the ratio of the coordinates of the trigger point and of the playing screen boundary point on a preset screen coordinate axis as the offset ratio of the trigger point relative to the center point of the playing screen.
In an optional manner, the focus position includes the user's gaze point on the playing screen, and the program 410 is invoked by the processor 402 to cause the electronic device to:
determine, before the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen are determined, whether the trigger point is located in a preset trigger area;
and when the trigger point is located in the preset trigger area, execute the step of determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen.
In an optional manner, the focus position includes the user's gaze point on the playing screen, and the program 410 is invoked by the processor 402 to cause the electronic device to:
calculate the product of the sum of adjacent camera position coverage angle differences and the offset ratio;
and determine the product as the view-angle switching angle.
In an optional manner, the focus position includes the user's gaze point on the playing screen, and the program 410 is invoked by the processor 402 to cause the electronic device to:
acquire, from the video cloud, the free-view video to be played and the camera position information corresponding to the free-view video to be played, before the user's eye movements are tracked to determine the focus position of the user on the free-view video on the playing screen while the user watches the free-view video;
determine the making host position of the free-view video to be played according to the camera position information;
and determine the making host position as the playing host position for playing the free-view video to be played, and play the free-view video to be played from the view angle of the playing host position.
In an optional manner, the focus position includes the user's gaze point on the playing screen, and the program 410 is invoked by the processor 402 to cause the electronic device to:
determine the total number of camera positions and all adjacent camera position coverage angle differences according to the camera position information;
determine the target boundary camera position according to the offset direction and the total number of camera positions;
and determine the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position according to the target boundary camera position and the adjacent camera position coverage angle differences.
In the embodiments of the invention, while a user watches a free-view video, the projection point of the user's eyes on the playing screen changes as the area of the video the user mainly attends to changes. By tracking the user's eye movements, the user's focus position on the free-view video can therefore be determined on the playing screen, and the playing view angle of the free-view video can be switched according to that focus position. The embodiments of the invention can thus intelligently identify the area of the free-view video the user mainly attends to and switch the playing view angle accordingly, so that the view angle is switched without any manual operation by the user. This greatly simplifies the view-switching process of the free-view video and improves the user's viewing experience.
An embodiment of the present invention provides a computer-readable storage medium storing at least one executable instruction which, when run on an electronic device, causes the electronic device to perform the free-view video playing method of any of the above method embodiments.
An embodiment of the present invention provides a free-view video playing device, configured to perform the above free-view video playing method.
An embodiment of the present invention provides a computer program which can be invoked by a processor to cause an electronic device to perform the free-view video playing method of any of the above method embodiments.
An embodiment of the present invention provides a computer program product comprising a computer program stored on a computer-readable storage medium, the computer program comprising program instructions which, when run on a computer, cause the computer to perform the free-view video playing method of any of the above method embodiments.
The algorithms or displays presented herein are not inherently related to any particular computer, virtual system, or other apparatus. Various general-purpose systems may also be used with the teachings herein. The structure required for such a system is apparent from the description above. In addition, embodiments of the present invention are not directed to any particular programming language. It will be appreciated that the teachings described herein may be implemented in a variety of programming languages, and the above description of specific languages is provided to disclose the enablement and best mode of the present invention.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the above description of exemplary embodiments of the invention, various features of the embodiments are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the inventive aspects. This manner of disclosure should not, however, be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim.
Those skilled in the art will appreciate that the modules in the apparatus of the embodiments may be adaptively changed and disposed in one or more apparatuses different from the embodiments. The modules or units or components of the embodiments may be combined into one module or unit or component, and they may be divided into a plurality of sub-modules or sub-units or sub-components. Any combination of all features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be used in combination, except insofar as at least some of such features and/or processes or units are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names. The steps in the above embodiments should not be construed as limiting the order of execution unless specifically stated.
Claims (9)
1. A free-view video playing method, the method comprising:
while a user watches the free-view video, tracking the user's eye movements to determine the focus position of the user on the free-view video on a playing screen, the focus position comprising the user's gaze point on the playing screen;
switching the playing view angle of the free-view video according to the focus position, comprising: determining, according to the gaze point, a trigger point that triggers the view-angle switching, and determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen; determining, according to the offset direction, the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position; and determining a view-angle switching angle according to the sum of adjacent camera position coverage angle differences and the offset ratio, and switching the playing view angle of the free-view video according to the view-angle switching angle.
2. The method of claim 1, wherein determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen comprises:
determining the screen position of the trigger point, and determining the offset direction of the trigger point relative to the center point of the playing screen according to the positional relationship between that screen position and the screen position of the center point of the playing screen;
determining the playing screen boundary point corresponding to the offset direction;
and determining the ratio of the coordinates of the trigger point and of the playing screen boundary point on a preset screen coordinate axis as the offset ratio of the trigger point relative to the center point of the playing screen.
3. The method of claim 1 or 2, wherein before determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen, the method comprises:
determining whether the trigger point is located in a preset trigger area;
and when the trigger point is located in the preset trigger area, executing the step of determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen.
4. The method of claim 1, wherein determining the view-angle switching angle according to the sum of adjacent camera position coverage angle differences and the offset ratio comprises:
calculating the product of the sum of adjacent camera position coverage angle differences and the offset ratio;
and determining the product as the view-angle switching angle.
5. The method of claim 1, wherein before tracking the user's eye movements while the user watches the free-view video to determine the focus position of the user on the free-view video on the playing screen, the method further comprises:
acquiring, from a video cloud, the free-view video to be played and the camera position information corresponding to the free-view video to be played;
determining the making host position of the free-view video to be played according to the camera position information;
and determining the making host position as the playing host position for playing the free-view video to be played, and playing the free-view video to be played from the view angle of the playing host position.
6. The method of claim 5, wherein determining the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position according to the offset direction comprises:
determining the total number of camera positions and all adjacent camera position coverage angle differences according to the camera position information;
determining the target boundary camera position according to the offset direction and the total number of camera positions;
and determining the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position according to the target boundary camera position and the adjacent camera position coverage angle differences.
7. A free-view video playing device, the device comprising:
a determining module, configured to track the user's eye movements while the user watches the free-view video, so as to determine the focus position of the user on the free-view video on the playing screen, the focus position comprising the user's gaze point on the playing screen;
a switching module, configured to switch the playing view angle of the free-view video according to the focus position, including: determining, according to the gaze point, a trigger point that triggers the view-angle switching, and determining the offset direction and the offset ratio of the trigger point relative to the center point of the playing screen; determining, according to the offset direction, the sum of adjacent camera position coverage angle differences between the target boundary camera position and the playing host position; and determining a view-angle switching angle according to the sum of adjacent camera position coverage angle differences and the offset ratio, and switching the playing view angle of the free-view video according to the view-angle switching angle.
8. An electronic device, comprising: a processor, a memory, a communication interface and a communication bus, wherein the processor, the memory and the communication interface communicate with one another through the communication bus;
the memory is configured to store at least one executable instruction, and the executable instruction causes the processor to perform the operations of the free-view video playing method of any one of claims 1-6.
9. A computer-readable storage medium, wherein at least one executable instruction is stored in the storage medium, and the executable instruction, when run on an electronic device, causes the electronic device to perform the operations of the free-view video playing method of any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111611869.6A CN114205669B (en) | 2021-12-27 | 2021-12-27 | Free view video playing method and device and electronic equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111611869.6A CN114205669B (en) | 2021-12-27 | 2021-12-27 | Free view video playing method and device and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114205669A CN114205669A (en) | 2022-03-18 |
CN114205669B true CN114205669B (en) | 2023-10-17 |
Family
ID=80656616
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111611869.6A Active CN114205669B (en) | 2021-12-27 | 2021-12-27 | Free view video playing method and device and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114205669B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114979732B (en) * | 2022-05-12 | 2023-10-20 | 咪咕数字传媒有限公司 | Video stream pushing method and device, electronic equipment and medium |
CN115103213B (en) * | 2022-06-10 | 2023-10-17 | 咪咕视讯科技有限公司 | Information processing method, apparatus, device and computer readable storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102342100A (en) * | 2009-03-09 | 2012-02-01 | 思科技术公司 | System and method for providing three dimensional imaging in network environment |
CN102347043A (en) * | 2010-07-30 | 2012-02-08 | 腾讯科技(北京)有限公司 | Method for playing multi-angle video and system |
CN103108126A (en) * | 2013-01-21 | 2013-05-15 | Tcl集团股份有限公司 | Video interactive system, method, interactive glasses and terminals |
CN104740874A (en) * | 2015-03-26 | 2015-07-01 | 广州博冠信息科技有限公司 | Method and system for playing videos in two-dimension game scene |
CN106447788A (en) * | 2016-09-26 | 2017-02-22 | 北京疯景科技有限公司 | Watching angle indication method and device |
CN106791794A (en) * | 2016-12-30 | 2017-05-31 | 重庆卓美华视光电有限公司 | A kind of display device, image processing method and device |
KR20180079051A (en) * | 2016-12-30 | 2018-07-10 | 엘지전자 주식회사 | Mobile terninal and method for controlling the same |
CN109799899A (en) * | 2017-11-17 | 2019-05-24 | 腾讯科技(深圳)有限公司 | Interaction control method, device, storage medium and computer equipment |
CN111866525A (en) * | 2020-09-23 | 2020-10-30 | 腾讯科技(深圳)有限公司 | Multi-view video playing control method and device, electronic equipment and storage medium |
CN113170231A (en) * | 2019-04-11 | 2021-07-23 | 华为技术有限公司 | Method and device for controlling playing of video content following user motion |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170186234A1 (en) * | 2015-12-27 | 2017-06-29 | Le Holdings (Beijing) Co., Ltd. | Method and device for free viewing of three-dimensional video |
-
2021
- 2021-12-27 CN CN202111611869.6A patent/CN114205669B/en active Active
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102342100A (en) * | 2009-03-09 | 2012-02-01 | 思科技术公司 | System and method for providing three dimensional imaging in network environment |
CN102347043A (en) * | 2010-07-30 | 2012-02-08 | 腾讯科技(北京)有限公司 | Method for playing multi-angle video and system |
CN103108126A (en) * | 2013-01-21 | 2013-05-15 | Tcl集团股份有限公司 | Video interactive system, method, interactive glasses and terminals |
CN104740874A (en) * | 2015-03-26 | 2015-07-01 | 广州博冠信息科技有限公司 | Method and system for playing videos in two-dimension game scene |
CN106447788A (en) * | 2016-09-26 | 2017-02-22 | 北京疯景科技有限公司 | Watching angle indication method and device |
CN106791794A (en) * | 2016-12-30 | 2017-05-31 | 重庆卓美华视光电有限公司 | A kind of display device, image processing method and device |
KR20180079051A (en) * | 2016-12-30 | 2018-07-10 | 엘지전자 주식회사 | Mobile terninal and method for controlling the same |
CN109799899A (en) * | 2017-11-17 | 2019-05-24 | 腾讯科技(深圳)有限公司 | Interaction control method, device, storage medium and computer equipment |
CN113170231A (en) * | 2019-04-11 | 2021-07-23 | 华为技术有限公司 | Method and device for controlling playing of video content following user motion |
CN111866525A (en) * | 2020-09-23 | 2020-10-30 | 腾讯科技(深圳)有限公司 | Multi-view video playing control method and device, electronic equipment and storage medium |
Non-Patent Citations (1)
Title |
---|
Improving eyeball control precision based on digital image processing; Yan Desai et al.; Computer Applications, No. 10; pp. 267-270 *
Also Published As
Publication number | Publication date |
---|---|
CN114205669A (en) | 2022-03-18 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP2022119858A5 (en) | ||
JP2019512769A5 (en) | ||
CN114205669B (en) | Free view video playing method and device and electronic equipment | |
CN110675506B (en) | System, method and equipment for realizing three-dimensional augmented reality of multi-channel video fusion | |
CN107027042B (en) | Multi-GPU-based panoramic real-time video stream processing method and device | |
CN108932051A (en) | augmented reality image processing method, device and storage medium | |
CN107646126A (en) | Camera Attitude estimation for mobile device | |
CN108537721A (en) | Processing method, device and the electronic equipment of panoramic picture | |
US11272153B2 (en) | Information processing apparatus, method for controlling the same, and recording medium | |
JP2020086983A (en) | Image processing device, image processing method, and program | |
CN105933665B (en) | A kind of method and device for having access to camera video | |
EP3595323A1 (en) | Video playing method for synchronously displaying ar information | |
CN108280851B (en) | Depth map generating device | |
JP2019036791A (en) | Image processing apparatus, image processing system, control method, and program | |
CN111327876A (en) | Target tracking display method and device, electronic equipment and machine-readable storage medium | |
CN110263615A (en) | Interaction processing method, device, equipment and client in vehicle shooting | |
CN108449545B (en) | Monitoring system and application method thereof | |
CN112702643B (en) | Barrage information display method and device and mobile terminal | |
CN115278193A (en) | Panoramic video distribution method, device, equipment and computer storage medium | |
CN113515187B (en) | Virtual reality scene generation method and network side equipment | |
CN116523962B (en) | Visual tracking method, device, system, equipment and medium for target object | |
KR102176805B1 (en) | System and method for providing virtual reality contents indicated view direction | |
CN108985275B (en) | Augmented reality equipment and display tracking method and device of electronic equipment | |
CN118052867A (en) | Positioning method, terminal equipment, server and storage medium | |
CN111949114B (en) | Image processing method, device and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |