
CN104375626B - Processing method, device and system for displaying an indication identifier - Google Patents

Processing method, device and system for displaying an indication identifier

Info

Publication number
CN104375626B
CN104375626B CN201310354098.6A CN201310354098A
Authority
CN
China
Prior art keywords
mobile terminal
determining
dimensional
coordinate system
central axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201310354098.6A
Other languages
Chinese (zh)
Other versions
CN104375626A (en)
Inventor
刘兆祥
胡伟
张爱东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Migu Cultural Technology Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN201310354098.6A priority Critical patent/CN104375626B/en
Priority to PCT/CN2014/070140 priority patent/WO2015021746A1/en
Publication of CN104375626A publication Critical patent/CN104375626A/en
Application granted granted Critical
Publication of CN104375626B publication Critical patent/CN104375626B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72403 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality
    • H04M1/72409 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories
    • H04M1/72415 User interfaces specially adapted for cordless or mobile telephones with means for local support of applications that increase the functionality by interfacing with external accessories for remote control of appliances

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the present invention provides a processing method, device and system for displaying an indication identifier. The method includes: acquiring a current scene image that includes a mobile terminal held by a human hand, the human hand, and a presentation area; determining, according to the current scene image, the indication direction in which the mobile terminal points at the presentation area; determining, according to the indication direction and the range of the presentation area in the current scene image, the associated position, located in the presentation area, between the indication direction and the presentation area; and displaying an indication identifier at the associated position according to a display signal provided by the mobile terminal. The method makes it easy for a user to make an indication on the presentation area with any mobile terminal at any time, has good universality, is simple to operate, and brings convenience to the user's work and study.

Description

Processing method, device and system for displaying an indication identifier
Technical Field
The embodiment of the invention relates to the technical field of electronic information, in particular to a method, a device and a system for processing a display indication identifier.
Background
In a typical meeting scene, a projector and a projection screen are the most commonly used means of presenting information. When the participants explain and discuss the information displayed on the projection screen, a laser dot is often projected onto the screen with a laser pen to indicate the projected content. However, in some meeting scenes the laser pen is forgotten or available only in limited numbers, which causes inconvenience. Therefore, if a mobile terminal that people carry in daily life, such as a mobile phone or a PDA (Personal Digital Assistant), had an indication function similar to that of a laser pen, it would greatly facilitate people's work.
In order to make the mobile terminal have an indication function similar to a laser pen, in the prior art, a laser diode is usually built in the mobile terminal, and a corresponding driving circuit is configured, so that the laser diode emits laser light, and an indication effect appears on a projection screen.
However, when the above scheme is implemented, the existing hardware structure and software program of the mobile terminal need to be changed, and the hardware structure and software program of the current different types of mobile terminals are different, so that different changing schemes need to be designed for the different types of mobile terminals, which consumes manpower and material resources, and has poor universality.
Disclosure of Invention
The embodiment of the invention provides a processing method, a device and a system for displaying an indication identifier, which are used for indicating a presentation area by using a mobile terminal.
In a first aspect, an embodiment of the present invention provides a processing method for displaying an indication identifier, including:
acquiring a mobile terminal held by a hand, the hand and a current scene image of a presentation area;
determining the indication direction of the mobile terminal pointing to the presentation area according to the current scene image;
determining, according to the indication direction and the range of the presentation area in the current scene image, an associated position, located in the presentation area, between the indication direction and the presentation area;
and displaying an indication mark at the associated position according to a display signal provided by the mobile terminal.
With reference to the first aspect, in a first implementation, the determining, according to the current scene image, a pointing direction of the mobile terminal to the presentation area includes:
determining the position of the human hand in the current scene image according to preset human hand recognition characteristics;
determining, based on the position of the hand and by using an edge detection operator, a feature plane of the mobile terminal held by the hand, wherein the feature plane is a geometric plane figure, having a central axis, of the mobile terminal itself, or the feature plane is a geometric plane figure with a central axis displayed on the mobile terminal;
and determining a central axis of the characteristic plane of the mobile terminal as the indication direction according to the characteristic plane of the mobile terminal.
With reference to the first aspect or the first implementation manner of the first aspect, in a second implementation manner, before displaying an indication identifier at the associated position according to a display signal provided by the mobile terminal, the processing method further includes:
and receiving the display signal sent by the mobile terminal.
With reference to the second implementation manner of the first aspect, in a third implementation manner, determining, according to the feature plane of the mobile terminal, a central axis of the feature plane of the mobile terminal as the indication direction includes:
determining a two-dimensional area equation of the geometric plane graph with the central axis by adopting a linear detection algorithm;
according to the two-dimensional area equation, acquiring a three-dimensional area equation of the geometric plane figure with the central axis in a standard three-dimensional coordinate system;
and determining a three-dimensional linear equation of the central axis of the geometric planar graph with the central axis in the standard three-dimensional coordinate system as the indication direction according to the three-dimensional region equation of the geometric planar graph with the central axis in the standard three-dimensional coordinate system.
With reference to the third implementation manner of the first aspect, in a fourth implementation manner, determining, as the indication direction, a three-dimensional linear equation of the central axis of the geometric planar figure with the central axis in a standard three-dimensional coordinate system according to a three-dimensional region equation of the geometric planar figure with the central axis in the standard three-dimensional coordinate system includes:
when the geometric plane figure with the central axis has at least two central axes, acquiring the current indication direction of an electronic compass built into the mobile terminal;
selecting a first central axis parallel to the current indication direction of the electronic compass from at least two central axes of the geometric planar graph with the central axes, and determining a three-dimensional linear equation of the first central axis in a standard three-dimensional coordinate system as the indication direction according to a three-dimensional area equation of the geometric planar graph with the central axes in the standard three-dimensional coordinate system.
With reference to the third or fourth implementation manner of the first aspect, in a fifth implementation manner, the determining, according to the indication direction and a range of the presentation area in the current scene image, a relevant position between the indication direction and the presentation area in the presentation area includes:
and calculating according to a three-dimensional linear equation of the central axis of the geometric plane figure with the central axis in the standard three-dimensional coordinate system and a three-dimensional region equation of the imaging region in the standard three-dimensional coordinate system to obtain an intersection point coordinate in the standard three-dimensional coordinate system, and determining the intersection point coordinate as the associated position.
With reference to any one of the third to fifth implementation manners of the first aspect, in a sixth implementation manner, before the determining, according to the current scene image, the indication direction of the mobile terminal, the processing method further includes:
acquiring calibration parameter information according to a preset sampling detection point;
determining the standard three-dimensional coordinate system according to the calibration parameter information;
and determining a three-dimensional region equation of the imaging region in the standard three-dimensional coordinate system according to the calibration parameter information and the standard three-dimensional coordinate system.
In a second aspect, an embodiment of the present invention provides a processing apparatus for displaying an indication identifier, including:
an acquisition module, configured to acquire a current scene image including a mobile terminal held by a human hand, the human hand, and a presentation area;
a first determining module, configured to determine, according to the current scene image, an indication direction in which the mobile terminal points to the presentation area;
a second determining module, configured to determine, according to the indication direction and a range of the presentation area in the current scene image, an associated position, in the presentation area, between the indication direction and the presentation area;
and the control module is used for displaying the indication mark at the associated position according to the display signal provided by the mobile terminal.
With reference to the second aspect, in a first embodiment, the first determining module includes:
the human hand recognition unit is used for determining the position of a human hand in the current scene image according to preset human hand recognition characteristics;
a feature plane determining unit, configured to determine, based on the position of the hand and by using an edge detection operator, a feature plane of the mobile terminal held by the hand, where the feature plane is a geometric plane figure, having a central axis, of the mobile terminal itself, or the feature plane is a geometric plane figure with a central axis displayed on the mobile terminal;
and the indication direction determining unit is used for determining a central axis of the characteristic plane of the mobile terminal as the indication direction according to the characteristic plane of the mobile terminal.
With reference to the second aspect or the first embodiment of the second aspect, in a second embodiment, the processing apparatus further includes:
and the receiving module is used for receiving the display signal sent by the mobile terminal.
With reference to the second aspect, in a third embodiment, the indication direction determination unit includes:
the first equation determining subunit is used for determining a two-dimensional area equation of the geometric planar graph with the central axis by adopting a linear detection algorithm;
the second equation determining subunit is used for acquiring a three-dimensional area equation of the geometric plane figure with the central axis in a standard three-dimensional coordinate system according to the two-dimensional area equation;
and the indication direction determining subunit is configured to determine, according to the three-dimensional region equation of the geometric planar graph with the central axis in the standard three-dimensional coordinate system, a three-dimensional linear equation of the central axis of the geometric planar graph with the central axis in the standard three-dimensional coordinate system as the indication direction.
With reference to the third embodiment of the second aspect, in a fourth embodiment, when the geometric planar figure with central axes has at least two central axes, the indication direction determining subunit is specifically configured to obtain the current indication direction of an electronic compass built into the mobile terminal; select a first central axis parallel to the current indication direction of the electronic compass from the at least two central axes of the geometric plane figure with the central axes; and determine, according to a three-dimensional area equation of the feature plane in the standard three-dimensional coordinate system, a three-dimensional linear equation of the first central axis in the standard three-dimensional coordinate system as the indication direction.
With reference to the third or fourth embodiment of the second aspect, in a fifth embodiment, the second determining module is specifically configured to perform an operation according to a three-dimensional straight-line equation of the central axis of the geometric planar graph with the central axis in the standard three-dimensional coordinate system and a three-dimensional area equation of the imaging area in the standard three-dimensional coordinate system, obtain intersection coordinates in the standard three-dimensional coordinate system, and determine the intersection coordinates as the associated position.
With reference to any one of the third to fourth embodiments of the second aspect, in a sixth embodiment, the processing apparatus further includes:
the third determining module is used for acquiring calibration parameter information according to a preset sampling detection point before determining the indication direction of the mobile terminal according to the current scene image; determining the standard three-dimensional coordinate system according to the calibration parameter information; and determining a three-dimensional region equation of the imaging region in the standard three-dimensional coordinate system according to the calibration parameter information and the standard three-dimensional coordinate system.
In a third aspect, an embodiment of the present invention provides a processing system for displaying an indication identifier, including a display device, a mobile terminal, and a control device; the control device comprises the processing means for displaying the indication mark according to any one of the second to sixth embodiments of the second aspect; the display device has a presentation area for displaying information; the mobile terminal and the display device can respectively carry out data transmission with the control device through a communication interface.
According to the processing method, device and system for displaying an indication identifier, the indication direction of the mobile terminal and the associated position between the indication direction and the information presentation area are determined from the acquired current scene image, so that an indication identifier is displayed on the presentation area by means of the mobile terminal. A user can therefore conveniently use any mobile terminal to make an indication on the presentation area at any time; the method has good universality, is convenient to operate, and brings convenience to the user's work and study.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a flowchart of a first embodiment of a processing method for displaying an indication identifier according to the present invention;
FIG. 2 is a flowchart of a second embodiment of a processing method for displaying an indication identifier according to the present invention;
fig. 3 is a schematic view of an application scenario in the second embodiment of the present invention;
fig. 4 is a schematic diagram of a mobile terminal according to a second embodiment of the present invention;
fig. 5 is a schematic diagram of another mobile terminal according to a second embodiment of the present invention;
FIG. 6 is a flowchart of a third embodiment of a processing method for displaying an indication identifier according to the present invention;
fig. 7 is a schematic view of an application scenario in the third embodiment of the present invention;
FIG. 8 is a diagram of a first embodiment of a processing device displaying an indicator according to the present invention;
FIG. 9 is a diagram showing a second embodiment of a processing apparatus for displaying indication marks according to the present invention;
FIG. 10 is a diagram of a first embodiment of a processing system displaying an indicator according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a flowchart of a first embodiment of a processing method for displaying an indication identifier according to the present invention. The method of the present embodiment can be applied to cases like projection display. In projection displays, a projector and a projection screen are generally provided, the area on which information is displayed being the presentation area, and the projector is generally connected to a control device having a control function, such as a computer. A live image capture device and a mobile terminal are also provided. The field image capturing device is, for example, a camera, and may be a stand-alone camera whose captured image is supplied to the control device, or the control device may be one that integrates an image capturing function. The mobile terminal is typically a mobile phone, the user uses the mobile phone to perform an indication action, and then the control device may perform display of the indication mark in the imaging area according to the indication action performed by the user using the mobile phone.
The method of this embodiment may be specifically implemented by a processing device that displays the indication identifier, where the processing device may be implemented in a hardware and/or software manner, and is preferably arranged in the control device to implement display of the indication identifier. As shown in fig. 1, the method includes:
S101, acquiring a current scene image including a mobile terminal held by a human hand, the human hand, and a presentation area.
Taking the above-mentioned case of projection display as an example, the user holds the mobile terminal in the hand and makes an indication action (for example, if the instructor needs to explain to the attendees a part of a machine currently displayed on the projection screen, the user holds the mobile terminal and points it at that part, i.e. makes an indication action in the expectation that an indication mark will be displayed at that part). At this time, the field image capturing device is controlled by the processing device that displays the indication mark, and the acquired current scene image includes the mobile terminal held by the hand, the hand, and the projection screen serving as the imaging area.
S102, determining the indication direction of the mobile terminal pointing to the imaging area according to the current scene image;
The field image capturing equipment provides the current scene image to the processing device for displaying the indication mark, and the processing device performs image processing and analysis on the collected current scene image to determine the indication direction in which the mobile terminal points at the imaging area. When determining the indication direction, the processing device may obtain the current position of the mobile terminal through image analysis and perform feature analysis; specifically, it may determine the current position of the mobile terminal in a preset coordinate system, perform operations in that coordinate system, and determine the direction representing the indication of the mobile terminal. For example, the mobile terminal is a mobile phone and the imaging area for displaying information is a projection screen that currently displays a mechanical drawing; there is a certain distance between the instructor and the projection screen, and while explaining the drawing the instructor needs to indicate each part of the drawing in coordination with the explanation. The instructor can, by visual observation, hold the mobile phone and point it in the direction of the part currently being explained, and the processing device displaying the indication mark performs image processing and analysis on the acquired current scene image to determine the current indication direction of the mobile phone, that is, the direction currently indicated by the user through the mobile phone.
S103, determining a related position between the indication direction and the presenting area in the presenting area according to the range of the indication direction and the presenting area in the current scene image;
the presentation area for displaying information may be various areas available for displaying information, for example, in a meeting room where a projector and a projection screen are arranged, the presentation area is the projection screen, and the content displayed by the projection screen is obtained by projecting a computer display screen facing the projection screen; if the electronic screen is disposed in a studio, the image area is the electronic screen, such as a Light Emitting Diode (LED) display. The range of the presentation area, which can be defined by coordinates, can be made known to the processing means by presetting, manual adjustment or automatic identification.
The imaging area and the mobile terminal have an association in spatial position, such as the distance and angle between the mobile terminal and the imaging area, and the associated position between them can be determined from this association. For example, a mobile terminal usually has a regular geometric shape, so the processing device may take the direction of a central axis of a face of the mobile terminal (e.g. the face of a mobile phone on which its display screen is located) as the indication direction; an intersection point necessarily exists between that central axis and the plane in which the imaging area lies, and the intersection point is the associated position.
And S104, displaying the indication mark at the associated position according to the display signal provided by the mobile terminal.
The indication mark is formed by the processing device according to a display signal provided by the mobile terminal. For example, a human hand holds the mobile terminal and points it at the imaging area in the expectation that an indication mark will be displayed at a certain position on the imaging area; the processing device then displays the indication mark on the imaging area according to the associated position between the mobile terminal and the imaging area. Suppose the presentation area of the information is the projection screen and the instructor currently wants an indicator, such as a red dot, at the central point of the projection screen. The instructor makes an indicating action towards that point with the handheld mobile terminal, and the processing device determines an accurate associated position through S101 to S103 (which may coincide with the center point of the projection screen, or may differ from it by some distance because of human visual error) and displays a red dot at the associated position, thereby realizing the indication mark. If the indication mark is offset from the center point the instructor expects, the instructor moves the handheld mobile terminal to adjust; the processing device then executes S101 to S104 again, and the indication mark is displayed at the center point.
In the processing method for displaying the indication identifier provided in this embodiment, the indication direction of the mobile terminal and the associated position between the indication direction and the information presentation area are determined by performing image analysis on the acquired current scene image, and the indication identifier is displayed on the information presentation area through the mobile terminal, so that a user can conveniently use any mobile terminal to make an indication on the information presentation area at any time, that is, the method has good universality and is convenient and fast to operate, and convenience is brought to the work and study of the user.
Fig. 2 is a flowchart of a second embodiment of a processing method for displaying an indication identifier according to the present invention. As shown in fig. 2, the present embodiment further optimizes the process of image processing based on the above embodiments, and the method includes:
S201, determining a standard three-dimensional coordinate system.
Fig. 3 is a schematic view of an application scenario in the second embodiment of the present invention. As shown in fig. 3, the presentation area for displaying information is an electronic display screen 20, and the live image capturing device 50 is a stereoscopic vision device that captures live images using dual cameras. The processing device for displaying the indication mark can control the display of the electronic display screen, and can also receive and process the images acquired by the stereoscopic vision device. Since the processing device analyzes the orientation of the mobile terminal 30 according to the scene image collected by the field image capturing device, it is necessary to determine a standard three-dimensional coordinate system Oc-XcYcZc from the perspective of the live image capture device, together with the relevant parameters for processing the image. The specific determination method is as follows:
S1a, acquiring calibration parameter information according to a preset sampling detection point;
In practical application, the Zhang Zhengyou calibration method can be used to acquire the calibration parameter information. Specifically, a sampling detection point is preset in the scene, for example a black-and-white chessboard of known, fixed size is placed in the scene, and the scene image capturing device is made to capture a scene image including the sampling detection point. The calibration parameter information is obtained by performing image analysis processing on the image captured by the field image capturing device, and it includes internal parameters and structural parameters. The internal parameters include the focal length, imaging distortion parameters, etc. of the field image capture device; the structural parameters include a three-dimensional translation matrix and a three-dimensional rotation matrix between the two cameras, the translation matrix being T = (tx, ty, tz)^T and the rotation matrix being the 3 × 3 matrix R with elements r1 to r9, where the values of the elements tx, ty, tz of the translation matrix and r1 to r9 of the rotation matrix are obtained by subjecting the captured image to image analysis processing.
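For illustration only, the following Python sketch (using NumPy and OpenCV, neither of which is mentioned in the patent) shows one way the internal parameters and the structural parameters, i.e. the rotation matrix R and translation matrix T between the two cameras, might be estimated from images of the black-and-white chessboard with Zhang's calibration method; the pattern size, square size and function names are assumptions of the sketch, not values from the patent.

```python
# Illustrative sketch only: estimating the calibration parameter information
# (per-camera intrinsics plus the inter-camera rotation R and translation T)
# from chessboard images. Assumes OpenCV/NumPy; all names are hypothetical.
import cv2
import numpy as np

PATTERN = (9, 6)      # assumed inner-corner count of the known chessboard
SQUARE_MM = 25.0      # assumed physical size of one chessboard square

def calibrate_stereo(left_imgs, right_imgs):
    """left_imgs / right_imgs: lists of grayscale images from the two cameras."""
    # 3-D corner coordinates of the sampling detection points in the board plane
    objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

    obj_pts, l_pts, r_pts = [], [], []
    for li, ri in zip(left_imgs, right_imgs):
        ok_l, cl = cv2.findChessboardCorners(li, PATTERN)
        ok_r, cr = cv2.findChessboardCorners(ri, PATTERN)
        if ok_l and ok_r:
            obj_pts.append(objp)
            l_pts.append(cl)
            r_pts.append(cr)

    size = left_imgs[0].shape[::-1]
    # Internal parameters (focal length, distortion) of each camera, Zhang's method
    _, K_l, D_l, _, _ = cv2.calibrateCamera(obj_pts, l_pts, size, None, None)
    _, K_r, D_r, _, _ = cv2.calibrateCamera(obj_pts, r_pts, size, None, None)
    # Structural parameters: rotation R (3x3) and translation T (3x1) between cameras
    _, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
        obj_pts, l_pts, r_pts, K_l, D_l, K_r, D_r, size,
        flags=cv2.CALIB_FIX_INTRINSIC)
    return K_l, D_l, K_r, D_r, R, T
```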
S1b, determining the standard three-dimensional coordinate system according to the calibration parameter information;
in particular, the standard three-dimensional coordinate system can be determined using the above-described structural parameters.
And S1c, determining a three-dimensional area equation of the imaging area in the standard three-dimensional coordinate system according to the calibration parameter information and the standard three-dimensional coordinate system.
The position of the imaging area, such as an electronic display screen, is fixed in the application scene, so that after the standard three-dimensional coordinate system is determined, the three-dimensional area equation of the imaging area in the standard three-dimensional coordinate system can be determined. Specifically, when the field image capturing device shoots the electronic display screen, the processing device performs distortion correction on the shot image according to the internal parameters, and then performs image processing on the corrected image to obtain a three-dimensional area equation of the electronic display screen in a standard three-dimensional coordinate system.
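As a minimal sketch of this step, assuming the three-dimensional coordinates of several points on the electronic display screen (for example its corners) have already been reconstructed in the standard coordinate system, the three-dimensional region (plane) equation could be fitted by least squares roughly as follows; the helper name is hypothetical.

```python
# Illustrative sketch: fitting the plane a*x + b*y + c*z + d = 0 of the imaging
# region from 3-D points known to lie on it. Assumes NumPy; not the patent's code.
import numpy as np

def fit_plane(points_3d):
    """points_3d: (N, 3) array of points on the presentation area, N >= 3."""
    centroid = points_3d.mean(axis=0)
    # The singular vector for the smallest singular value of the centered points
    # is the least-squares plane normal (a, b, c).
    _, _, vt = np.linalg.svd(points_3d - centroid)
    normal = vt[-1]
    d = -float(normal @ centroid)
    return normal, d

# Usage sketch: normal, d = fit_plane(screen_corner_coords)
# A point p lies in the plane of the imaging region when normal @ p + d == 0.
```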
It should be noted that S201 is executed when the processing method for displaying an indication identifier of the present invention is implemented in the application scene for the first time; once the standard three-dimensional coordinate system and the relevant parameters have been determined, subsequent executions of the method start directly from S202.
S202, acquiring a current scene image including the mobile terminal held by the human hand, the human hand, and the imaging area.
And S203, determining a characteristic plane of the mobile terminal held by the hand according to the current scene image.
Specifically, the processing device for displaying the indication identifier identifies the position of the hand in the image according to preset hand identification features, such as skin color features of a person and the outline of the hand, in the acquired current scene image; or the field image capturing equipment integrates the function of the infrared thermal imager, senses the imaging of the human body temperature at the hand position in the image, determines the position of the human hand in the image and further identifies the mobile terminal held by the human hand.
Based on the position of the human hand, an edge detection operator can be used to determine the feature plane of the mobile terminal held by the hand. Specifically, fig. 4 is a schematic diagram of the mobile terminal in the second embodiment of the present invention. As shown in fig. 4, the mobile terminal is an ordinary mobile phone, and the spatial region occupied by the phone overlaps the spatial region occupied by the hand, so the hand can be used as a detection reference point and the phone can be detected within a certain detection range around it. For example, after the position of the hand has been determined, step changes of the image are sought near the finger region of the hand, i.e. an edge detection operator (such as the Canny edge operator, a gradient operator, the Marr operator, etc.) is used to detect the outline of the mobile phone, and the indication direction of the phone is then determined from that outline.
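A hedged sketch of this step, assuming an OpenCV pipeline: the hand is located by a preset skin-color feature and a Canny edge operator is then applied within a detection window around the hand. The color thresholds and window padding are illustrative assumptions, not values given in the patent.

```python
# Illustrative sketch: locate the human hand by skin color, then run a Canny edge
# operator in a detection range around it to find candidate outlines of the
# handheld terminal. Thresholds and padding are assumptions.
import cv2
import numpy as np

def hand_region_and_edges(bgr_frame, pad=80):
    # Rough skin segmentation in YCrCb space (a preset hand recognition feature)
    ycrcb = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2YCrCb)
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    ys, xs = np.nonzero(skin)
    if xs.size == 0:
        return None, None
    x0, x1, y0, y1 = xs.min(), xs.max(), ys.min(), ys.max()
    # Detection window around the hand, used as the reference for edge detection
    roi = bgr_frame[max(0, y0 - pad):y1 + pad, max(0, x0 - pad):x1 + pad]
    edges = cv2.Canny(cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY), 50, 150)
    return (x0, y0, x1, y1), edges
```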
Generally, a mobile phone has a rectangular cuboid structure, and its thickness is smaller than the length and width of its front or back face, so that during image processing the phone can be regarded as a rectangular flat plate, and the detected outline of the phone is an approximately rectangular figure. That is, for the mobile phone, the feature plane of the mobile terminal is a geometric plane figure, having a central axis, of the mobile terminal itself, namely the rectangular face of the phone.
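Continuing the same assumed OpenCV pipeline, the sketch below treats the handset as a rectangular flat plate and takes the minimum-area bounding rectangle of the largest contour in the edge image as the feature plane; the function name and the use of minAreaRect are illustrative choices, not taken from the patent.

```python
# Illustrative sketch: approximate the phone's outline as a rectangle (the feature
# plane, a geometric plane figure with a central axis). Assumes OpenCV >= 4.
import cv2
import numpy as np

def feature_plane_rectangle(edge_image):
    contours, _ = cv2.findContours(edge_image, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    rect = cv2.minAreaRect(largest)     # ((cx, cy), (w, h), angle)
    return cv2.boxPoints(rect)          # 4 corner points of the rectangular figure
```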
More flexibly, fig. 5 is a schematic diagram of another mobile terminal according to a second embodiment of the present invention. As shown in fig. 5, the feature plane is a geometric plane figure with a central axis displayed by the mobile terminal; for example, if the instructor wishes to use a mobile phone to indicate information on an electronic display screen in a conference room, an application program in the mobile phone may be started, the application program may display a geometric planar figure with a central axis, such as a rectangle, on the display screen of the mobile phone, and the edge detection operator may be used to detect the outline of the rectangular image displayed on the display screen of the mobile phone.
In conclusion, the geometric figure with the central axis on the mobile terminal can be flexibly selected to be regarded as the characteristic plane of the mobile terminal.
And S204, determining a central axis of the characteristic plane of the mobile terminal as an indication direction according to the characteristic plane of the mobile terminal.
Taking the feature plane shown in fig. 4, i.e. a geometric plane figure, having a central axis, of the mobile terminal itself, as an example, straight-line equations L1, L2 and L3 of the phone's outline are detected by a straight-line detection algorithm. The straight-line detection algorithm can be the randomized Hough transform or another line detection algorithm derived from the Hough transform. Taking the Hough transform as an example, its principle is to use the point-line duality between the image space and the Hough parameter space to convert the detection problem from the image space into the parameter space, and thereby detect straight lines in the image. It should be noted that the current scene image acquired in S202 is a planar image, so L1, L2 and L3 are at this stage two-dimensional straight-line equations in that planar image, and together they represent the two-dimensional area equation of the feature plane. Finally, a least-squares fitting operation is performed on the three-dimensional coordinates of several feature points selected on each two-dimensional straight line, yielding the three-dimensional area equation of the feature plane in the standard three-dimensional coordinate system; that is, the three-dimensional area equation of the feature plane in the standard three-dimensional coordinate system is obtained from the two-dimensional area equation.
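For illustration, a probabilistic Hough transform (one member of the family of Hough-based line detectors mentioned above) could yield the two-dimensional straight-line equations of the outline roughly as follows; the threshold values and the helper name are assumptions.

```python
# Illustrative sketch: detect straight edge segments of the terminal's outline with
# a probabilistic Hough transform and express each as a 2-D line a*x + b*y + c = 0.
import cv2
import numpy as np

def detect_outline_lines(edge_image, max_lines=3):
    segs = cv2.HoughLinesP(edge_image, rho=1, theta=np.pi / 180,
                           threshold=60, minLineLength=40, maxLineGap=5)
    lines = []
    for x1, y1, x2, y2 in (segs[:, 0] if segs is not None else [])[:max_lines]:
        a, b = y2 - y1, x1 - x2              # line through the two endpoints
        c = -(a * x1 + b * y1)
        lines.append((float(a), float(b), float(c)))
    return lines                              # e.g. equations for L1, L2, L3
```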
Specifically, in the present embodiment the field image capturing apparatus employs two cameras, and the same point in the scene appears at different positions in the two images taken by the two cameras. Therefore, 5 feature points are selected on each two-dimensional straight line (the number of feature points may also be set according to the accuracy requirement; the specific number is not limited in this embodiment). Each of the 5 feature points likewise appears at different positions in the two images respectively captured by the two cameras. The 5 feature points in the two images are put into one-to-one correspondence using a gray-based region matching method, and the three-dimensional coordinates of the feature points on each straight line in the standard three-dimensional coordinate system are determined in combination with the three-dimensional rotation matrix and the three-dimensional translation matrix in the structural parameters. The calculation formulas for determining the three-dimensional coordinates (x, y, z) include:
x = z·Xl / fl
y = z·Yl / fl
with z determined from the corresponding coordinates of the feature point in the two images together with the structural parameters,
wherein fl is the effective focal length of the left camera of the binocular camera, fr is the effective focal length of the right camera, (Xr, Yr) are the two-dimensional coordinates of the feature point in the right camera image, and (Xl, Yl) are the two-dimensional coordinates of the feature point in the left camera image. For example, 5 feature points A, B, C, D, E are selected on the two-dimensional straight line represented by the two-dimensional straight-line equation L1; their positions in the image taken by one camera are A1, B1, C1, D1 and E1, and their positions in the image taken by the other camera are A2, B2, C2, D2 and E2. A1 and A2, B1 and B2, C1 and C2, D1 and D2, and E1 and E2 are matched one-to-one by the gray-based region matching method, and the three-dimensional coordinates of A, B, C, D, E in the three-dimensional coordinate system are then determined with the above formulas for (x, y, z). After the three-dimensional coordinates of every feature point on each two-dimensional straight line in the standard three-dimensional coordinate system have been determined, a least-squares fitting operation on the three-dimensional coordinates of the feature points belonging to the same two-dimensional straight line gives the three-dimensional straight-line equation of that line in the standard three-dimensional coordinate system; that is, the three-dimensional straight-line equations L1', L2' and L3' of L1, L2 and L3 in the standard three-dimensional coordinate system are obtained, and L1', L2' and L3' together represent the three-dimensional region equation of the above geometric figure with a central axis in the standard three-dimensional coordinate system.
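A hedged sketch of this reconstruction: here the three-dimensional coordinates of the matched feature points are recovered by triangulation from projection matrices built out of the intrinsics and the structural parameters (R, T). This is one common realization under the stated assumptions, not necessarily the exact closed-form expressions used in the patent.

```python
# Illustrative sketch: recover 3-D coordinates of feature points matched between the
# left and right camera images, using the calibration results. Assumes OpenCV/NumPy.
import cv2
import numpy as np

def triangulate(K_l, K_r, R, T, pts_left, pts_right):
    """pts_left / pts_right: (N, 2) matched pixel coordinates of the same points."""
    P_l = K_l @ np.hstack([np.eye(3), np.zeros((3, 1))])   # left camera as reference
    P_r = K_r @ np.hstack([R, T.reshape(3, 1)])            # right camera from R, T
    hom = cv2.triangulatePoints(P_l, P_r,
                                pts_left.T.astype(np.float64),
                                pts_right.T.astype(np.float64))
    return (hom[:3] / hom[3]).T     # (N, 3) points in the standard coordinate system

# Usage sketch: the feature points A..E selected on L1 would be matched between the
# two views (e.g. by gray-based region matching), passed through triangulate(), and
# a least-squares line fit through the resulting 3-D points then gives L1'.
```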
According to the three-dimensional region equation of the geometric figure with the central axis in the standard three-dimensional coordinate system, namely the three-dimensional straight-line equations L1', L2' and L3', the three-dimensional straight-line equation Lm of the central axis of that geometric figure in the standard three-dimensional coordinate system is determined as the indication direction of the mobile terminal.
It should be noted that, geometrically, there is only one central axis for a graphic such as an isosceles triangle, but there are two central axes for a rectangle, so when the geometric graphic with central axes has at least two central axes, the processing device needs to further obtain the current indication direction of the built-in electronic compass of the mobile terminal, so as to select a first central axis parallel to the current indication direction of the electronic compass from the at least two central axes of the feature plane of the mobile terminal according to the current indication direction of the electronic compass, and determine a three-dimensional linear equation of the first central axis in the standard three-dimensional coordinate system as the indication direction according to the three-dimensional area equation of the feature plane in the standard three-dimensional coordinate system.
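As a small illustrative sketch of this choice between the two candidate axes, assuming the compass's current indication direction has been expressed as a direction vector in the standard three-dimensional coordinate system:

```python
# Illustrative sketch: pick the central axis most nearly parallel to the current
# indication direction of the built-in electronic compass. Names are hypothetical.
import numpy as np

def pick_axis(axis_dirs, compass_dir):
    """axis_dirs: 3-D direction vectors of the candidate central axes;
    compass_dir: 3-D vector of the compass's current indication direction."""
    c = compass_dir / np.linalg.norm(compass_dir)
    # Largest |cosine| means most nearly parallel (sign ignored: an axis has no sense)
    scores = [abs(np.dot(d / np.linalg.norm(d), c)) for d in axis_dirs]
    return axis_dirs[int(np.argmax(scores))]
```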
The method for determining the three-dimensional linear equation Ln of the central axis of the feature plane in the standard three-dimensional coordinate system as the indication direction based on the feature plane shown in fig. 5 is consistent with the above process, and is not repeated here.
S205, according to the indication direction and the range of the information presenting area, the related position between the indication direction and the presenting area in the presenting area is determined.
Since the three-dimensional region equation of the imaging region in the standard three-dimensional coordinate system is known in S201; calculating according to a three-dimensional linear equation (Lm or Ln) of the central axis of the characteristic plane in a standard three-dimensional coordinate system and a three-dimensional area equation of the imaging area in the standard three-dimensional coordinate system to obtain an intersection point coordinate in the standard three-dimensional coordinate system, determining the intersection point coordinate as a correlation position, namely performing solution operation on the three-dimensional linear equation of the central axis of the characteristic plane in the standard three-dimensional coordinate system and the three-dimensional area equation of the imaging area in the standard three-dimensional coordinate system to obtain a solution value as the three-dimensional coordinate of the intersection point.
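The solution operation described here can be sketched as a standard line-plane intersection, assuming the central axis is given by a point p0 and a direction vector and the imaging region by a plane equation; the parametric form and the names are assumptions of the sketch.

```python
# Illustrative sketch: intersect the central axis p(t) = p0 + t*direction with the
# plane normal·p + d = 0 of the imaging region to obtain the associated position.
import numpy as np

def line_plane_intersection(p0, direction, normal, d):
    denom = float(np.dot(normal, direction))
    if abs(denom) < 1e-9:
        return None                      # axis parallel to the screen plane
    t = -(np.dot(normal, p0) + d) / denom
    return p0 + t * direction            # intersection point coordinates
```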
S206, judging whether a display signal sent by the mobile terminal is received or not; if yes, go to step S207, otherwise stop.
When a plurality of mobile terminals are present in the scene (for example, several participants each hold a mobile phone), the device can determine, for each mobile terminal, the associated position between that terminal's indication direction and the imaging area; but if an indication mark were displayed for every mobile terminal, indication confusion would easily arise on the imaging area. Therefore, before controlling an indication mark to be displayed at an associated position, the processing device needs to receive the display signal sent by the mobile terminal. When it is judged that the display signal sent by a mobile terminal has been received, an indication mark is displayed at the associated position corresponding to that terminal; if no display message is received, the indication mark is not displayed. It can be understood that the processing device keeps tracking the indication direction of the mobile terminal and determines, in real time, the associated position between the terminal's current indication direction and the presentation area, so that the indication identifier can be displayed at the associated position immediately after the display signal sent by the mobile terminal is received.
Taking a conference scene as an example, suppose each of 5 participants carries a mobile phone but only one of them acts as the information interpreter; only the indication mark based on the interpreter's mobile phone needs to be displayed. The interpreter therefore starts an application program, loaded on the mobile phone and adapted to the processing method of this embodiment, and a display button appears on the display screen of the mobile phone. Clicking the button triggers the mobile phone to send a signal wave to the processing device; this signal wave can be understood as the display signal sent by the mobile phone to the processing device (more flexibly, it can also be triggered by pressing a physical key on the mobile phone). The processing device then controls the indication identifier to be displayed, i.e. a red dot is displayed at the associated position between the indication direction of the interpreter's mobile phone and the projection screen (the red dot is only used to explain the method of this embodiment; the specific display shape or color is not limited). If the interpreter finds that the red dot is not at the desired position, the interpreter keeps pressing the display button and moves the mobile phone; the processing device acquires the associated position between the indication direction of the mobile phone and the imaging area in the current scene in real time, so that the red dot is displayed at the current associated position.
It can also be understood that, if only the instructor holds a mobile phone in the conference scene, the processing device may determine the number of mobile phones in the scene from received feedback messages sent by the mobile phones. For example, the processing device sends out, over some communication channel, a detection wave receivable by the mobile phones in the conference scene; alternatively, by starting the application program loaded on it, a mobile terminal may make its indicator lamp flash, signaling to the processing device that this terminal is about to make an indication action pointing at the imaging area (other mobile terminals that have not started the application program and whose indicator lamps are not flashing are then not tracked in real time). After detecting that there is only one mobile phone in the field, the processing device may by default display the indication identifier at the associated position between that phone's indication direction and the imaging area; in that case S207 may be executed directly after S205, without performing the judgment in S206 of whether the display message sent by the mobile terminal has been received.
And S207, displaying an indication mark at the associated position according to the display signal provided by the mobile terminal.
And controlling the electronic display screen to display the indication marks at the corresponding positions according to the three-dimensional coordinates of the intersection points determined in the S205.
According to the indication mark control method provided by the embodiment, the indication direction of the mobile terminal and the associated position between the indication direction and the imaging area of the information are determined through the acquired current scene image, the indication mark is displayed on the imaging area through the mobile terminal, and a user can conveniently use any mobile terminal to indicate on the imaging area at any time, namely the method is good in universality and convenient to operate, and brings convenience to work and study of the user.
Fig. 6 is a flowchart of a third embodiment of a processing method for displaying an indication identifier according to the present invention. As shown in fig. 6, on the basis of the first embodiment, this embodiment may further optimize the image processing process, where the method includes:
S301, determining a standard three-dimensional coordinate system.
Specifically, fig. 7 is a schematic view of an application scenario in the third embodiment of the present invention. As shown in fig. 7, the information is displayed by a projection screen 21 and a computer, wherein the content on the projection screen is obtained by projecting a computer screen 22 of the computer, the live image capture device 50 is a stereoscopic vision device, wherein the processing device for displaying the indication mark is responsible for processing the image obtained by the stereoscopic vision device, and the display of the content on the display of the projection screen is controlled by the computer, but the processing device for displaying the indication mark can control the display screen of the computer to display.
Similar to S201, the standard three-dimensional coordinate system and the related parameters of the processed image under the viewing angle of the field image capturing device need to be determined, and the specific determination method is as follows:
S3a, acquiring calibration parameter information according to the preset sampling detection points;
the method of acquiring the scaling parameter information in the present embodiment is similar to step S1a in S201 of the second embodiment, but in the present embodiment, the scaling parameter information includes parameters for coordinate system conversion in addition to intrinsic parameters and structural parameters, and specifically, the standard three-dimensional coordinate system O based on the viewing angle of the live image capture device is determined according to the zhangnyou scaling methodc-XcYcZcAnd a relative three-dimensional coordinate system O based on the viewing angle of the projection screenps-XpsYpsZpsThe conversion parameter between, and the two-dimensional coordinate system O of the computer display screens-XsYsAnd a relative three-dimensional coordinate system O of the projection screenps-XpsYpsZpsThe conversion parameter between.
S3b, determining a standard three-dimensional coordinate system according to the calibration parameter information;
step S3b in this embodiment is similar to step S1b in embodiment S201, and is not repeated here.
And S3c, determining a three-dimensional area equation of the imaging area in the standard three-dimensional coordinate system according to the calibration parameter information and the standard three-dimensional coordinate system.
Step S3c in this embodiment is similar to step S1c in embodiment S201, and is not repeated here.
And S3d, determining the conversion relation between the coordinate systems according to the conversion parameters.
Specifically, the conversion relation between the standard three-dimensional coordinate system and the relative three-dimensional coordinate system is determined as [Xps, Yps, Zps]^T = R·[Xc, Yc, Zc]^T + T, in which Xc, Yc and Zc represent the X, Y and Z axes in the standard coordinate system, Xps, Yps and Zps represent the X, Y and Z axes in the above-mentioned relative coordinate system, R is a 3 × 3 rotation matrix and T is a 3 × 1 translation matrix, and the assignment of the elements in the matrices is determined on the basis of field measurements, i.e. determined by step S3a. The conversion relation between the two-dimensional coordinate system of the computer screen and the relative three-dimensional coordinate system of the projection screen is further determined; it is parameterized by the conversion parameters A1 to A8 determined in step S3a, where XS and YS are the X-axis and Y-axis coordinates in the two-dimensional coordinate system of the computer screen.
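A hedged sketch of this conversion chain; the rigid transform follows the relation above, while the 8-parameter projective form assumed for the second step is an illustration only (the patent states only that conversion parameters A1 to A8 exist):

```python
# Illustrative sketch: convert the associated position from the standard (camera)
# coordinate system to the projection screen's relative coordinate system via (R, T),
# then map it to the computer screen's 2-D coordinate system. The projective form
# using A1..A8 is an assumption of this sketch, not the patent's stated formula.
import numpy as np

def camera_to_screen(p_c, R, T, A):
    """p_c: (3,) point in the standard coordinate system; R: (3, 3); T: (3,);
    A: (8,) conversion parameters A1..A8 obtained during calibration."""
    p_ps = R @ p_c + T                       # relative coords (Xps, Yps, Zps)
    x, y = p_ps[0], p_ps[1]                  # screen assumed planar in its own frame
    w = A[6] * x + A[7] * y + 1.0            # assumed projective normalization
    X_s = (A[0] * x + A[1] * y + A[2]) / w
    Y_s = (A[3] * x + A[4] * y + A[5]) / w
    return X_s, Y_s                          # 2-D computer-screen coordinates
```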
It should be noted that the above-mentioned S301 is executed when the processing method of the present invention is first implemented in an application scenario, and when the processing method of displaying the indication mark is implemented again after the relevant parameters of the standard three-dimensional coordinate system are determined, the execution is directly started from S302.
S302, acquiring a current scene image including the mobile terminal held by the human hand, the human hand, and the imaging area.
And S303, determining a feature plane of the mobile terminal according to the current scene image.
S304, determining a central axis of the characteristic plane of the mobile terminal held by the hand as an indication direction according to the characteristic plane of the mobile terminal.
S305, determining the related position between the indication direction and the presentation area in the presentation area according to the indication direction and the range of the presentation area of the information.
S306, judging whether a display signal sent by the mobile terminal is received; if yes, go to step S307, otherwise stop.
In this embodiment, S302 to S306 are similar to the second embodiment, and refer to S202 to S206, which are not described herein again.
S307, displaying an indication mark at the relevant position according to the display signal provided by the mobile terminal.
The associated position determined in S305 is determined in the standard three-dimensional coordinate system; that is, the processing device determines, in the standard three-dimensional coordinate system, the intersection point between the central axis of the feature plane of the mobile terminal and the projection screen. However, since what the projection screen shows is the content of the computer display screen, displaying the indication mark at the associated position on the projection screen requires the indication mark to be displayed on the computer display screen at the position corresponding to that associated position. Therefore, when displaying the indication mark at the associated position, the processing device for displaying the indication mark further performs the following steps:
and S4a, converting the coordinate system of the related position.
Specifically, in S305 the associated position in the standard three-dimensional coordinate system is determined; that is, the coordinates of the intersection point between the central axis of the feature plane of the mobile terminal and the projection screen are (Xc1, Yc1, Zc1). According to the conversion relation between the standard three-dimensional coordinate system and the relative three-dimensional coordinate system, (Xc1, Yc1, Zc1) is converted to obtain the coordinates (Xps1, Yps1, Zps1) of the associated position in the relative coordinate system. Because the computer display screen adopts a two-dimensional coordinate system, the conversion relation between the two-dimensional coordinate system of the computer screen and the relative three-dimensional coordinate system of the projection screen determined above is used to convert (Xps1, Yps1) into (Xs1, Ys1).
S4b, sending a control display message to the display device.
When sending the control display message to the computer in the display device, the processing device carries a position indication message, namely the coordinates (Xs1, Ys1). The computer determines the indicated position from (Xs1, Ys1); for example, if (Xs1, Ys1) is the center point of the computer display screen, the computer controls the computer display screen to display a red dot at the center point, so that, after projection, the display position of the red dot on the projection screen coincides with the associated position determined in S305, thereby realizing the display of the indication mark.
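As a rough illustration of the computer side of S4b, the sketch below draws a red dot at the received (Xs1, Ys1) position on the current screen frame. The message format and the use of OpenCV for drawing are assumptions made for illustration, not part of the described system.

```python
import numpy as np
import cv2  # assumed available; any rendering API could be used instead

SCREEN_W, SCREEN_H = 1920, 1080

def handle_control_display_message(frame, message):
    """Draw the indication mark (a filled red dot) at the position carried in
    the control display message; `message` is assumed to be {'xs': .., 'ys': ..}."""
    xs, ys = int(round(message["xs"])), int(round(message["ys"]))
    if 0 <= xs < SCREEN_W and 0 <= ys < SCREEN_H:        # ignore off-screen positions
        cv2.circle(frame, (xs, ys), 8, (0, 0, 255), -1)  # BGR red, filled
    return frame

frame = np.zeros((SCREEN_H, SCREEN_W, 3), dtype=np.uint8)  # stand-in for the screen content
frame = handle_control_display_message(frame, {"xs": 960, "ys": 540})  # center point
```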
More flexibly, the processing device for displaying the indication mark may skip S4a and directly execute S4b; that is, the intersection point coordinates (Xc1, Yc1, Zc1) between the central axis of the feature plane of the mobile terminal and the projection screen are sent directly to the computer, and the computer then converts (Xc1, Yc1, Zc1) into (Xs1, Ys1) according to the coordinate system conversion relations and displays the indication mark at the corresponding position of the computer display screen, so that the indication mark is likewise projected to the associated position in the projection screen determined in S305.
The processing method for displaying an indication mark provided in this embodiment determines, from the acquired current scene image, the indication direction of the mobile terminal and the associated position between the indication direction and the presentation area of the information, so that the indication mark is displayed on the presentation area by means of the mobile terminal. A user can therefore conveniently use any mobile terminal to make an indication on the presentation area of the information at any time; that is, the method has good universality, is convenient and fast to operate, and brings convenience to the work and study of the user.
FIG. 8 is a schematic diagram of a first embodiment of a processing device for displaying an indication mark according to the present invention. As shown in FIG. 8, the apparatus includes:
an acquisition module 10, configured to acquire a current scene image that includes the mobile terminal held by a human hand, the human hand, and the presentation area;
a first determining module 11, configured to determine, according to the current scene image, a pointing direction in which the mobile terminal points to the presentation area;
a second determining module 12, configured to determine, according to the indication direction and a range of the presentation area in the current scene image, an associated position between the indication direction and the presentation area, where the associated position is located in the presentation area;
and the control module 13 is configured to display an indication identifier at the associated position according to the display signal provided by the mobile terminal.
The processing device for displaying the indication mark determines, from the acquired current scene image, the indication direction of the mobile terminal and the associated position between the indication direction and the presentation area of the information, so that the indication mark is displayed on the presentation area by means of the mobile terminal. A user can therefore conveniently use any mobile terminal to make an indication on the presentation area at any time; that is, the device has good universality, is convenient to operate, and brings convenience to the work and study of the user.
FIG. 9 is a block diagram of a second embodiment of a processing device for displaying an indication mark according to the present invention. As shown in FIG. 9, the first determining module 11 specifically includes:
the hand recognition unit 111 is used for determining the position of a hand in the current scene image according to preset hand recognition characteristics;
a feature plane determining unit 112, configured to determine, based on the position of the human hand and by using an edge detection operator, a feature plane of the mobile terminal held by the human hand, where the feature plane is a geometric planar figure of the mobile terminal having a central axis, or the feature plane is a geometric planar figure having a central axis that is displayed under the control of the mobile terminal;
and an indication direction determining unit 113, configured to determine, according to the feature plane of the mobile terminal, a central axis of the feature plane of the mobile terminal as the indication direction.
Further, the processing apparatus further includes:
and the receiving module 14 is configured to receive a display signal sent by the mobile terminal.
The indication direction determining unit 113 specifically includes: a first equation determining subunit 113a, configured to determine a two-dimensional area equation of the geometric planar figure with the central axis by using a straight-line detection algorithm;
a second equation determining subunit 113b, configured to obtain, according to the two-dimensional area equation, a three-dimensional area equation of the geometric planar figure with the central axis in the standard three-dimensional coordinate system;
and an indication direction determining subunit 113c, configured to determine, according to the three-dimensional area equation of the geometric planar figure with the central axis in the standard three-dimensional coordinate system, a three-dimensional linear equation of the central axis of that figure in the standard three-dimensional coordinate system as the indication direction.
Furthermore, when the geometric planar figure with the central axis has at least two central axes, the indication direction determining subunit 113c is further configured to obtain the current indication direction of a built-in electronic compass of the mobile terminal,
to select, from the at least two central axes of the geometric planar figure with the central axes, a first central axis parallel to the current indication direction of the electronic compass, and to determine, according to the three-dimensional area equation of the feature plane in the standard three-dimensional coordinate system, a three-dimensional linear equation of the first central axis in the standard three-dimensional coordinate system as the indication direction.
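One way to picture the work of units 112 and 113 is the sketch below: inside a region of interest around the detected hand, Canny edge detection and a minimum-area rectangle stand in for the edge detection operator and the straight-line detection step, the two symmetry axes of the rectangle play the role of the central axes in image coordinates, and the axis most nearly parallel to the terminal's compass direction is selected. This is a simplified 2D sketch under assumed inputs; the embodiment itself further lifts the chosen axis into the standard three-dimensional coordinate system.

```python
import numpy as np
import cv2  # assumed available

def central_axes_of_feature_plane(roi_gray):
    """Approximate the terminal's feature plane inside a hand-centred ROI and
    return the two symmetry axes of its minimum-area rectangle, each given as
    (unit_direction, centre) in image coordinates."""
    edges = cv2.Canny(roi_gray, 50, 150)                  # edge detection operator
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return []
    largest = max(contours, key=cv2.contourArea)          # assume the terminal dominates the ROI
    (cx, cy), _, angle_deg = cv2.minAreaRect(largest)
    theta = np.deg2rad(angle_deg)
    axis_a = np.array([np.cos(theta), np.sin(theta)])     # along the rectangle's width
    axis_b = np.array([-np.sin(theta), np.cos(theta)])    # along the rectangle's height
    return [(axis_a, (cx, cy)), (axis_b, (cx, cy))]

def pick_axis_by_compass(axes, compass_dir):
    """Select the axis most nearly parallel to the terminal's reported compass
    direction (both given as 2D vectors in the same frame)."""
    c = np.asarray(compass_dir, dtype=float)
    c /= np.linalg.norm(c)
    return max(axes, key=lambda a: abs(np.dot(a[0], c)))
```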
Further, the second determining module 12 is specifically configured to obtain intersection point coordinates in the standard three-dimensional coordinate system by operating on the three-dimensional linear equation of the central axis of the geometric planar figure with the central axis in the standard three-dimensional coordinate system and the three-dimensional region equation of the presentation area in the standard three-dimensional coordinate system, and to determine the intersection point coordinates as the associated position.
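The operation performed by the second determining module 12 amounts to intersecting the 3D line of the central axis with the plane of the presentation area. A minimal sketch, assuming the line is given as a point plus a direction and the plane by the equation n·x + d = 0:

```python
import numpy as np

def line_plane_intersection(p0, d, n, dist):
    """Intersect the line x = p0 + t*d with the plane n . x + dist = 0.
    Returns the intersection point, or None if no forward intersection exists."""
    p0, d, n = (np.asarray(v, dtype=float) for v in (p0, d, n))
    denom = np.dot(n, d)
    if abs(denom) < 1e-9:                 # line (nearly) parallel to the plane
        return None
    t = -(np.dot(n, p0) + dist) / denom
    if t < 0:                             # axis points away from the presentation area
        return None
    return p0 + t * d

# Toy example: axis through the origin along +Z, presentation-area plane at Z = 2.
print(line_plane_intersection([0, 0, 0], [0, 0, 1], [0, 0, 1], -2.0))  # -> [0. 0. 2.]
```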
Flexibly, corresponding to the third embodiment of the method, the processing apparatus further includes a sending module 15, configured to send a control display message to the display device.
Further, the processing apparatus further includes:
the third determining module 16 is configured to, before the indication direction of the mobile terminal is determined according to the current scene image, acquire calibration parameter information according to preset sampling detection points; determine the standard three-dimensional coordinate system according to the calibration parameter information; and determine a three-dimensional region equation of the presentation area in the standard three-dimensional coordinate system according to the calibration parameter information and the standard three-dimensional coordinate system.
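The calibration performed by the third determining module 16 can be pictured as fitting the plane (the three-dimensional region equation) of the presentation area from a handful of sampled detection points measured in the standard three-dimensional coordinate system. A least-squares sketch using SVD, with made-up sample coordinates:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit through 3D calibration points.
    Returns (n, d) such that the plane satisfies n . x + d = 0 with |n| = 1."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]                       # direction of least variance = plane normal
    d = -np.dot(n, centroid)
    return n, d

# Made-up sampling detection points lying roughly on the plane Z = 2.
samples = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.01), (0.0, 1.0, 1.99), (1.0, 1.0, 2.0)]
n, d = fit_plane(samples)
print(n, d)   # normal close to (0, 0, 1), d close to -2
```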
The modules in the device embodiments correspond to the method embodiments, and are not described herein again.
The processing device for displaying the indication mark determines, from the acquired current scene image, the indication direction of the mobile terminal and the associated position between the indication direction and the presentation area of the information, so that the indication mark is displayed on the presentation area by means of the mobile terminal. A user can therefore conveniently use any mobile terminal to make an indication on the presentation area at any time; that is, the device has good universality, is convenient to operate, and brings convenience to the work and study of the user.
Further, FIG. 10 is a schematic diagram of a first embodiment of a processing system for displaying an indication mark according to the present invention. As shown in FIG. 10, the system includes a display device 60, a mobile terminal 30 and a control device 40. The display device 60 has a presentation area for displaying information and may be composed of the projection curtain 21 and the computer 22, or may be a separate electronic display screen 20 whose displayed content is controlled by the control device 40. The control device 40 includes the processing apparatus for displaying an indication mark described in the first and second embodiments of the processing apparatus for displaying an indication mark. In addition, the control device 40 may integrate the functions of the above-described scene image capture device 50, or may receive the current scene image transmitted by a separate scene image capture device 50. After analyzing the current scene image and determining the associated position between the indication direction of the mobile terminal 30 and the presentation area, the control device 40 controls the display device to display the indication mark at the associated position.
Flexibly, after the control device 40 determines the associated position and before it controls the display device to display the indication mark at the associated position, the control device 40 determines whether a display message of the indication mark sent by the mobile terminal 30 has been received; if so, it controls the display device to display the indication mark at the associated position, and otherwise the indication mark is not displayed. When a plurality of mobile terminals exist in the scene, controlling the display of the indication mark according to whether the display message sent by a particular mobile terminal 30 has been received avoids disordered display of indication marks in the presentation area of the information.
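A toy sketch of that gating logic, assuming each mobile terminal is identified by an ID carried in its display message (the message structure is hypothetical):

```python
from typing import Optional

class IndicatorGate:
    """Display the indication mark only for the terminal whose display
    message has actually been received."""
    def __init__(self) -> None:
        self.active_terminal: Optional[str] = None

    def on_display_message(self, terminal_id: str) -> None:
        self.active_terminal = terminal_id          # this terminal may now indicate

    def should_display(self, terminal_id: str) -> bool:
        return self.active_terminal == terminal_id  # other terminals are ignored

gate = IndicatorGate()
gate.on_display_message("terminal-A")
print(gate.should_display("terminal-A"), gate.should_display("terminal-B"))  # True False
```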
The processing system for displaying an indication mark provided in this embodiment determines, from the acquired current scene image, the indication direction of the mobile terminal and the associated position between the indication direction and the presentation area of the information, so that the indication mark is displayed on the presentation area by means of the mobile terminal. A user can therefore conveniently use any mobile terminal to make an indication on the presentation area of the information at any time; that is, the system has good universality, is convenient and fast to operate, and brings convenience to the work and study of the user.
Those of ordinary skill in the art will understand that all or a portion of the steps of the above method embodiments may be implemented by program instructions controlling relevant hardware. The program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the method embodiments described above. The aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disc.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solution of the present invention, and not to limit the same; while the invention has been described in detail and with reference to the foregoing embodiments, it will be understood by those skilled in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (13)

1. A processing method for displaying an indication mark is characterized by comprising the following steps:
acquiring a current scene image that includes a mobile terminal held by a human hand, the human hand, and a presentation area;
determining the indication direction of the mobile terminal pointing to the presentation area according to the current scene image;
determining an associated position, located in the presentation area, between the indication direction and the presentation area according to the indication direction and the range of the presentation area in the current scene image;
displaying an indication mark at the associated position according to a display signal provided by the mobile terminal;
the determining, according to the current scene image, the pointing direction of the mobile terminal to the presentation area includes:
determining the position of the human hand in the current scene image according to preset human hand recognition characteristics;
determining a feature plane of the mobile terminal held by the hand based on the position of the hand by using an edge detection operator, wherein the feature plane is a geometric plane figure of the mobile terminal with a central axis, or the feature plane is a geometric plane figure with a central axis that is displayed under the control of the mobile terminal;
and determining a central axis of the characteristic plane of the mobile terminal as the indication direction according to the characteristic plane of the mobile terminal.
2. The processing method according to claim 1, wherein before displaying the indication mark at the associated position according to the display signal provided by the mobile terminal, the processing method further comprises:
and receiving the display signal sent by the mobile terminal.
3. The processing method according to claim 1, wherein determining a central axis of the feature plane of the mobile terminal as the indication direction according to the feature plane of the mobile terminal comprises:
determining a two-dimensional area equation of the geometric plane figure with the central axis by adopting a straight-line detection algorithm;
according to the two-dimensional area equation, acquiring a three-dimensional area equation of the geometric plane figure with the central axis in a standard three-dimensional coordinate system;
and determining a three-dimensional linear equation of the central axis of the geometric planar graph with the central axis in the standard three-dimensional coordinate system as the indication direction according to the three-dimensional region equation of the geometric planar graph with the central axis in the standard three-dimensional coordinate system.
4. The processing method according to claim 3, wherein determining a three-dimensional linear equation of the central axis of the geometric planar figure with the central axis in a standard three-dimensional coordinate system as the indication direction according to a three-dimensional region equation of the geometric planar figure with the central axis in the standard three-dimensional coordinate system comprises:
when the geometric plane figure with the central axis has at least two central axes, acquiring the current indication direction of a built-in electronic compass of the mobile terminal;
selecting a first central axis parallel to the current indication direction of the electronic compass from at least two central axes of the geometric planar graph with the central axes, and determining a three-dimensional linear equation of the first central axis in a standard three-dimensional coordinate system as the indication direction according to a three-dimensional area equation of the geometric planar graph with the central axes in the standard three-dimensional coordinate system.
5. The processing method according to claim 3 or 4, wherein determining the associated position between the indication direction and the presentation area in the presentation area according to the indication direction and the range of the presentation area in the current scene image comprises:
and calculating according to a three-dimensional linear equation of the central axis of the geometric plane figure with the central axis in the standard three-dimensional coordinate system and a three-dimensional region equation of the presentation area in the standard three-dimensional coordinate system to obtain an intersection point coordinate in the standard three-dimensional coordinate system, and determining the intersection point coordinate as the associated position.
6. The processing method according to claim 3, wherein before determining the pointing direction of the mobile terminal according to the current scene image, the method further comprises:
acquiring calibration parameter information according to a preset sampling detection point;
determining the standard three-dimensional coordinate system according to the calibration parameter information;
and determining a three-dimensional region equation of the presentation area in the standard three-dimensional coordinate system according to the calibration parameter information and the standard three-dimensional coordinate system.
7. A processing apparatus for displaying an indicator, comprising:
an acquisition module, configured to acquire a current scene image that includes a mobile terminal held by a human hand, the human hand, and a presentation area;
a first determining module, configured to determine, according to the current scene image, an indication direction in which the mobile terminal points to the presentation area;
a second determining module, configured to determine, according to the indication direction and a range of the presentation area in the current scene image, an associated position, in the presentation area, between the indication direction and the presentation area;
the control module is used for displaying an indication mark at the associated position according to a display signal provided by the mobile terminal;
the first determining module includes:
the human hand recognition unit is used for determining the position of a human hand in the current scene image according to preset human hand recognition characteristics;
a feature plane determining unit, configured to determine, based on a position of the hand, a feature plane of the mobile terminal held by the hand by using an edge detection operator, where the feature plane is a geometric plane figure of the mobile terminal having a central axis, or the feature plane is a geometric plane figure having a central axis that is displayed under the control of the mobile terminal;
and the indication direction determining unit is used for determining a central axis of the characteristic plane of the mobile terminal as the indication direction according to the characteristic plane of the mobile terminal.
8. The processing apparatus according to claim 7, characterized in that the processing apparatus further comprises:
and the receiving module is used for receiving the display signal sent by the mobile terminal.
9. The processing apparatus according to claim 7, wherein the indication direction determining unit includes:
the first equation determining subunit is used for determining a two-dimensional area equation of the geometric plane figure with the central axis by adopting a straight-line detection algorithm;
the second equation determining subunit is used for acquiring a three-dimensional area equation of the geometric plane figure with the central axis in a standard three-dimensional coordinate system according to the two-dimensional area equation;
and the indication direction determining subunit is configured to determine, according to the three-dimensional region equation of the geometric planar graph with the central axis in the standard three-dimensional coordinate system, a three-dimensional linear equation of the central axis of the geometric planar graph with the central axis in the standard three-dimensional coordinate system as the indication direction.
10. The processing apparatus according to claim 9, wherein, when the geometric plane figure with the central axis has at least two central axes, the indication direction determining subunit is specifically configured to obtain a current indication direction of a built-in electronic compass of the mobile terminal; select, from the at least two central axes of the geometric plane figure with the central axes, a first central axis parallel to the current indication direction of the electronic compass; and determine, according to a three-dimensional area equation of the feature plane in the standard three-dimensional coordinate system, a three-dimensional linear equation of the first central axis in the standard three-dimensional coordinate system as the indication direction.
11. The processing apparatus according to claim 9 or 10, wherein the second determining module is specifically configured to obtain intersection coordinates in the standard three-dimensional coordinate system according to a three-dimensional straight-line equation of the central axis of the geometric plane figure with the central axis in the standard three-dimensional coordinate system and a three-dimensional region equation of the presentation area in the standard three-dimensional coordinate system, and determine the intersection coordinates as the associated position.
12. The processing apparatus according to claim 9, characterized in that the processing apparatus further comprises:
the third determining module is used for acquiring calibration parameter information according to a preset sampling detection point before determining the indication direction of the mobile terminal according to the current scene image; determining the standard three-dimensional coordinate system according to the calibration parameter information; and determining a three-dimensional region equation of the imaging region in the standard three-dimensional coordinate system according to the calibration parameter information and the standard three-dimensional coordinate system.
13. A processing system for displaying an indication mark, characterized by comprising a display device, a mobile terminal and a control device; the control device comprises the processing apparatus for displaying an indication mark according to any one of claims 7 to 12; the display device has a presentation area for displaying information; and the mobile terminal and the display device can each perform data transmission with the control device through a communication interface.
CN201310354098.6A 2013-08-14 2013-08-14 Show processing method, the device and system of sign Active CN104375626B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201310354098.6A CN104375626B (en) 2013-08-14 2013-08-14 Show processing method, the device and system of sign
PCT/CN2014/070140 WO2015021746A1 (en) 2013-08-14 2014-01-06 Processing method, apparatus and system for displaying indicator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310354098.6A CN104375626B (en) 2013-08-14 2013-08-14 Show processing method, the device and system of sign

Publications (2)

Publication Number Publication Date
CN104375626A CN104375626A (en) 2015-02-25
CN104375626B true CN104375626B (en) 2017-10-17

Family

ID=52467986

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310354098.6A Active CN104375626B (en) 2013-08-14 2013-08-14 Show processing method, the device and system of sign

Country Status (2)

Country Link
CN (1) CN104375626B (en)
WO (1) WO2015021746A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105354820B (en) 2015-09-30 2018-05-22 深圳多新哆技术有限责任公司 Adjust the method and device of virtual reality image

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101639747A (en) * 2009-08-31 2010-02-03 广东威创视讯科技股份有限公司 Spatial three-dimensional positioning method
CN102354345A (en) * 2011-10-21 2012-02-15 北京理工大学 Medical image browse device with somatosensory interaction mode

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8591039B2 (en) * 2008-10-28 2013-11-26 Smart Technologies Ulc Image projection methods and interactive input/projection systems employing the same
CN102402680B (en) * 2010-09-13 2014-07-30 株式会社理光 Hand and indication point positioning method and gesture confirming method in man-machine interactive system
CN102568357A (en) * 2010-12-30 2012-07-11 鸿富锦精密工业(深圳)有限公司 Electronic device and method for conducting automatic exhibition guide in exhibition hall by utilizing same
CN202422028U (en) * 2011-12-28 2012-09-05 广州市唯昕电子科技有限公司 Laser writing device

Also Published As

Publication number Publication date
CN104375626A (en) 2015-02-25
WO2015021746A1 (en) 2015-02-19

Similar Documents

Publication Publication Date Title
CN108886582B (en) Image pickup apparatus and focus control method
US9007400B2 (en) Image processing device, image processing method and computer-readable medium
CN101566875B (en) Image processing apparatus, and image processing method
CN106846410B (en) Driving environment imaging method and device based on three dimensions
WO2013146269A1 (en) Image capturing device, image processing method, and program
US20170076477A1 (en) Image Display Device, Image Display Method and Storage Medium
CN110035218B (en) Image processing method, image processing device and photographing equipment
JP6464281B2 (en) Information processing apparatus, information processing method, and program
CN110930463B (en) Method and device for calibrating internal reference of monitoring camera and electronic equipment
JP2012256110A (en) Information processing apparatus, information processing method, and program
WO2018028152A1 (en) Image acquisition device and virtual reality device
JP2013217662A (en) Length measuring device, length measuring method, and program
US20140168375A1 (en) Image conversion device, camera, video system, image conversion method and recording medium recording a program
US20180220066A1 (en) Electronic apparatus, operating method of electronic apparatus, and non-transitory computer-readable recording medium
JP2015194367A (en) Temperature measurement device, display processing program of measurement result and temperature measurement system
CN104375626B (en) Show processing method, the device and system of sign
JP5152281B2 (en) Image processing apparatus, method, and program
TWI603225B (en) Viewing angle adjusting method and apparatus of liquid crystal display
KR100690172B1 (en) method for extracting 3-dimensional coordinate information from 3-dimensional image using mobile phone with multiple cameras and terminal thereof
CN110120062B (en) Image processing method and device
CN113538700A (en) Augmented reality device calibration method and device, electronic device and storage medium
JP2012147059A (en) Image information processor and image information processing system
US10372287B2 (en) Headset device and visual feedback method and apparatus thereof
JP2020187557A (en) Temperature image display device, temperature image display system and temperature image display program
JP6067040B2 (en) Information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210118

Address after: Room 400, building 5, No.11 Deshengmenwai street, Xicheng District, Beijing 100032 (Desheng Park)

Patentee after: Migu cultural technology Co., Ltd.

Address before: 518129 Bantian HUAWEI headquarters office building, Longgang District, Guangdong, Shenzhen

Patentee before: HUAWEI TECHNOLOGIES Co.,Ltd.

TR01 Transfer of patent right