
CN104933241A - Evaluation method for discomfort glare of train driving interface illumination - Google Patents

Evaluation method for discomfort glare of train driving interface illumination

Info

Publication number
CN104933241A
Authority
CN
China
Prior art keywords
glare
glare source
source region
driving interface
Prior art date
Legal status
Pending
Application number
CN201510317586.9A
Other languages
Chinese (zh)
Inventor
郭北苑
詹自翔
方卫宁
Current Assignee
Beijing Jiaotong University
Original Assignee
Beijing Jiaotong University
Priority date
Filing date
Publication date
Application filed by Beijing Jiaotong University
Priority to CN201510317586.9A
Publication of CN104933241A
Legal status: Pending


Landscapes

  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The invention discloses an evaluation method for discomfort glare of train driving interface illumination. The method comprises the following steps: establishing a three-dimensional model of the train driving interface and obtaining, through an optical simulation tool, a simulated image of the driver's view together with the parameters of every pixel in the image; identifying glare source pixels among all pixels according to a preset judgment condition, and computing the average luminance of the non-glare-source pixels in the image as the background luminance; merging adjacent glare source pixels into glare source regions, yielding a plurality of glare source regions; dividing all glare source regions by area into general glare source regions and small glare source regions, and collecting data for each kind separately; and computing the unified glare rating (UGR) of the image according to an evaluation model and evaluating the discomfort glare of the train driving interface illumination from it. The method allows the illumination glare of a train driving interface to be evaluated in the early design phase, which reduces cost and shortens the design cycle.

Description

Discomfort glare evaluation method for train driving interface illumination
Technical Field
The invention relates to the field of train driving interface lighting systems, and more particularly to a discomfort glare evaluation method for train driving interface lighting.
Background
The design of the lighting system is an important part of the design of a driving interface: it helps the driver acquire visual information in a timely and accurate way, relieves visual fatigue, and contributes to driving safety. The factors that affect the lighting quality of the driving interface include the illuminance level, the luminance distribution, illuminance uniformity, illuminance stability and glare, of which glare is the most significant. The driving interface design and the lighting are closely coupled; for example, the relative position of the lighting devices and the driving interface, the curvature, shape, inclination and size of the windshield, the arrangement of the vehicle instruments and display devices, and the choice of surface materials can all cause glare. Evaluating lighting glare is therefore one of the preconditions for judging the lighting quality of a driving interface, and it is of great significance for guiding the design and optimization of the train driving interface.
Building on earlier glare evaluation methods, the International Commission on Illumination (CIE) defines glare as a visual condition, caused by an unsuitable luminance distribution or range or by excessive contrast, that produces discomfort or reduces the ability to distinguish details, and classifies it by visual state into discomfort glare and disability glare. Discomfort glare is glare that causes discomfort without necessarily impairing the visibility of the target, and controlling discomfort glare is generally sufficient to control disability glare as well. Research on discomfort glare evaluation models began in the 1920s. In the following decades, researchers in several countries established, through experiments, models relating the factors that influence discomfort glare to subjective sensation, producing indoor-lighting discomfort glare evaluation models such as the BGI model of Petherbridge and Hopkinson in the UK and the VCP model of the American scholar Guth. Experimental studies in the literature on the correlation between subjective sensation and several discomfort glare evaluation models, including BGI and VCP, showed poor correlation for both; that is, these systems do not reflect the subjective discomfort caused by glare well. The CIE adopted the glare index CGI proposed by the South African scholar Einhorn in 1983, and in 1995 introduced a new unified glare rating formula for calculating the unified glare value UGR. The correlation coefficient between the UGR value and the subjective discomfort obtained in subjective evaluation experiments reported in the literature reaches 0.95, so this formula is regarded as the indoor-lighting discomfort glare evaluation model with the best evaluation performance. The establishment of this discomfort glare evaluation model makes it possible to evaluate driving interface lighting glare by measuring the calculation parameters required by the UGR model.
In recent years, display devices in train driving interfaces have moved toward centralized parameters, intelligent displays and glass cockpits, making the lighting environment more complex and greatly increasing both the likelihood and the complexity of discomfort glare. Evaluating lighting discomfort glare with subjective experiments, or by measuring the calculation parameters of the UGR model on site, faces many problems. When glare is evaluated with an objective model, the boundary of a glare source is difficult to determine, so the model parameters cannot be determined; even when the boundary can be determined, measurement cannot guarantee accurate parameter values for glare regions of uneven luminance. A more serious problem is that traditional lighting glare evaluation methods depend on a physical model and therefore cannot be carried out at the initial stage of driving interface design; improving the lighting environment afterwards then entails a series of changes to the car body structure, the electrical equipment layout and so on, which increases manufacturing cost and prolongs the design cycle.
Therefore, it is desirable to provide a method for evaluating discomfort glare in train driving interface lighting.
Disclosure of Invention
The invention aims to provide a discomfort glare evaluation method for train driving interface illumination that solves problems of the prior art such as the difficulty of defining the boundary of a glare source produced by train driving interface illumination in a complex lighting environment and the dependence of traditional experimental evaluation methods on a physical model.
In order to achieve the purpose, the invention adopts the following technical scheme:
a discomfort glare evaluation method for train driving interface illumination comprises the following steps:
s1, establishing a train driving interface three-dimensional model, and obtaining a driver visual simulation image and coordinate values, brightness values and illumination value parameters of each pixel point in the image through an optical simulation tool;
s2, judging glare source pixel points from the pixel points according to the set judgment condition, and calculating the average value of the brightness values of the non-glare source pixel points in the image to be used as a background brightness value;
s3, integrating adjacent glare source pixel points in each glare source pixel point into one glare source area to obtain a plurality of glare source areas;
s5, dividing each glare source region into a general glare source region and a small glare source region according to the area size, collecting the brightness value, the solid angle and the position index of a geometric center of each general glare source region, and collecting the luminous intensity, the distance from the geometric center to the eye point of a train driver and the position index of the geometric center of each small glare source region;
s6, calculating a unified glare value UGR of the image according to the train driving interface illumination discomfort glare evaluation model, and evaluating discomfort glare of train driving interface illumination according to the unified glare value, wherein the model formula of the train driving interface illumination discomfort glare evaluation model is as follows:
$$UGR = 8\log\frac{0.25}{L_b}\left(\sum_{h=1}^{H}\frac{L_h^2\,\omega_h}{P_h^2}+\sum_{q=1}^{Q}\frac{200\,I_q^2}{r_q^2\,P_q^2}\right)$$
In the formula, H is the total number of general glare source regions; Q is the total number of small glare source regions; L_h is the luminance (cd/m²) of the h-th general glare source region; L_b is the background luminance (cd/m²) of the image; ω_h is the solid angle (sr) of the h-th general glare source region; P_h is the position index of the geometric center of the h-th general glare source region; I_q is the luminous intensity (cd) of the q-th small glare source region in the train driver's sight direction; r_q is the distance (m) from the q-th small glare source region to the eyes; and P_q is the position index of the geometric center of the q-th small glare source region.
Preferably, the method further comprises the following steps after the step S3 and before the step S5:
s4, screening out the glare source region formed by no more than 2-10 glare source pixel points.
Preferably, the judgment condition for glare source pixel points in step S2 is: pixel points whose luminance exceeds 4 times the average luminance of all pixel points in the image are taken as glare source pixel points.
Preferably, the method of dividing the glare source regions into general glare source regions and small glare source regions by area in step S5 is as follows: glare source regions with a projection area of 0.005 m² or more are taken as general glare source regions, and glare source regions with a projection area of less than 0.005 m² are taken as small glare source regions.
The invention has the following beneficial effects:
the technical scheme of the invention defines the boundary of the glare source generated by the train driving interface illumination in the complex illumination environment, is based on digital model simulation and does not depend on a physical model, so that the train driving interface illumination glare can be evaluated at the initial stage of design, the cost is reduced, and the design period is shortened.
Drawings
The following describes embodiments of the present invention in further detail with reference to the accompanying drawings.
Fig. 1 shows a flow chart of the discomfort glare evaluation method for train driving interface lighting.
Fig. 2 shows the coordinate definition of the position index expression in the model calculation parameters.
Detailed Description
In order to more clearly illustrate the invention, the invention is further described below with reference to preferred embodiments and the accompanying drawings. Similar parts in the figures are denoted by the same reference numerals. It is to be understood by persons skilled in the art that the following detailed description is illustrative and not restrictive, and is not to be taken as limiting the scope of the invention.
As shown in fig. 1, the discomfort glare evaluation method for train driving interface lighting provided by this embodiment includes the following steps:
step1, establishing a three-dimensional model of a train driving interface, and obtaining a driver visual simulation image and parameters such as coordinate values, brightness values, illumination intensity and the like of each pixel point in the image through an optical simulation tool;
step2, setting the judgment condition of the glare source pixel points, traversing each pixel point in the visual simulation image, judging the glare source pixel points from each pixel point according to the judgment condition of the glare source pixel points, collecting the brightness values of all other pixel points except the pixel points judged to be the glare source in the image, and calculating the average value of the brightness values as the background brightness value;
step3, integrating each independent glare source pixel point according to whether the glare source pixel points are adjacent or not, integrating adjacent glare source pixel points in each glare source pixel point into a glare source region to obtain a plurality of glare source regions,
step4, setting screening conditions, and screening out the glare source area meeting the screening conditions after integration according to the screening conditions, wherein the Step4 is an optional Step;
step5, judging the area of each glare source region, and judging the area as a general glare source region (the projection area is more than or equal to 0.005 m)2) The glare source region is judged as a small glare source region (the projection area is less than 0.005 m) according to UGR model acquisition and calculation parameters, namely the brightness value, the solid angle and the position index of the geometric center of a general glare source region2) The glare source area collects and calculates parameters, namely the luminous intensity of the small glare source area, the distance from the geometric center to the eye point of the train driver and the position index of the geometric center according to a small light source UGR correction model;
and Step6, inputting the parameters collected above into the train driving interface illumination discomfort glare evaluation model to calculate the UGR value of the image, and evaluating the discomfort glare of the train driving interface illumination according to the UGR value and the UGR versus subjective feeling relation table.
Wherein,
the specific process of Step1 is as follows:
taking a SPEOS CAA V5Based V16.1.1 optical simulation analysis module as an example, a train driving interface three-dimensional model is established in a CATIA tool, the model is simplified on the premise of ensuring the completeness of interior decoration, the optical properties of the driving interface model material are defined, and the optical properties of the material are collected by an OMS2 optical property measuring instrument and comprise the color of the material and information such as reflectivity, transmissivity, absorptivity and scattering of light rays with different wavelengths. And defining parameters of the illumination light source, including a spotlight, a console display, an instrument and the like. And constructing a driver vision detector to simulate driver vision according to the maximum direct vision of human eyes in the horizontal direction and the vertical direction, wherein the horizontal visual angle of a driver is 120 degrees, the vertical visual angle is 90 degrees, the sight line is kept horizontal, and the position of the eye point of the driver of the train is determined by referring to the UIC651 standard. The method comprises the steps of obtaining a simulation calculation result through a SPEOS CAA V5Based V16.1.1 optical simulation analysis module and outputting an XMP file, wherein the file records parameters such as a visual simulation image and coordinate values, brightness, illumination and the like of each pixel point in the image, and the coordinate values of each pixel point on the image reflect information of a three-dimensional space object after a driver eye point projects to an image plane through light of the pixel point, namely projection of the three-dimensional space on a plane image.
The specific process of Step2 is as follows:
in order to consider the adaptation of human eyes to the environment brightness, the average brightness value in the whole simulation image visual field is calculated, and then pixel points which exceed the brightness value by more than 4 times are taken as the glare source pixel points, so that the judgment condition of the glare source pixel points is established. In the embodiment, the XMP file is read through Visual Basic language programming, the brightness value of each pixel point in the simulation image is compared with the average brightness of all the pixel points of the whole simulation image, and a plurality of independent glare source pixel points are judged according to the glare source judgment condition.
For the background luminance L_b of the image, all pixel points other than those judged to be glare source pixel points are traversed and their average luminance is taken as the background luminance L_b; the expression is as follows:
$$L_b = \sum_{g=1}^{G} L_g / G \qquad (1)$$
wherein: l isgFor the luminance (cd/m) of the g-th non-glare source pixel point in the simulated image2) (ii) a G is the number of non-glare light source pixel points in the image.
The specific process of Step3 is as follows:
according to the characteristics of UGR model and small light source UGR correction model calculation parameters, the independent glare source pixels are integrated into a glare source area, so that the calculation factors such as the boundary and the type of the glare source can be judged, and the adjacent independent glare source pixels are integrated and applied with the following algorithm:
The t glare source pixel points determined in Step 2 are traversed from the upper-left pixel to the lower-right pixel of the image. If a pixel adjacent to a glare source pixel point is also a glare source pixel point (only the left and upper neighbours need to be checked because of the traversal order), the two belong to the same glare region, and the number of the current glare source pixel point is set to the number of the first pixel point of that region; in other words, the numbers of all glare source pixel points belonging to one glare region are unified. Proceeding in this way yields a plurality of glare regions, each being a set of glare source pixel points sharing one number, which completes the integration process.
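The original code listing is not reproduced in the text above (the embodiment used Visual Basic). As an illustration only, the following is a minimal Python sketch of the described raster scan over the left and upper neighbours, with a union-find table added to handle the case, implicit in the description, where the two neighbours already carry different region numbers:

```python
import numpy as np

def label_glare_regions(glare_mask):
    """Integrate adjacent glare-source pixels into regions by a raster scan
    from the top-left to the bottom-right pixel, checking only the left and
    upper neighbours. Returns a label image in which 0 is background and
    equal positive labels form one glare region."""
    h, w = glare_mask.shape
    labels = np.zeros((h, w), dtype=int)
    parent = [0]                               # union-find parent table

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]      # path halving
            a = parent[a]
        return a

    next_label = 1
    for i in range(h):
        for j in range(w):
            if not glare_mask[i, j]:
                continue
            up = labels[i - 1, j] if i > 0 else 0
            left = labels[i, j - 1] if j > 0 else 0
            if up == 0 and left == 0:          # start of a new region
                parent.append(next_label)
                labels[i, j] = next_label
                next_label += 1
            elif up and left:                  # pixel bridges two regions: merge
                ru, rl = find(up), find(left)
                labels[i, j] = min(ru, rl)
                parent[max(ru, rl)] = min(ru, rl)
            else:                              # copy the single labelled neighbour
                labels[i, j] = up if up else left
    # second pass: unify the numbers of merged regions
    for i in range(h):
        for j in range(w):
            if labels[i, j]:
                labels[i, j] = find(labels[i, j])
    return labels
```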
the specific process of Step4 is as follows:
after the integration of the independent glare source pixel points is completed, some independent glare source pixel points (scattered points) may still exist in the visual simulation image, and since the area of the scattered point glare sources is too small, the scattered point glare sources often do not exist even in practice. The influence of the discomfort glare model on the evaluation result can be ignored according to the discomfort glare model, so the screening condition is set to screen out the glare source region formed by not more than 2-10 glare source pixel points, and the unreasonable glare source region meeting the screening condition is screened out, wherein the specific number in the screening condition is 2 or 3 or 4, and the like according to the actual requirements.
The specific process of Step5 is as follows:
After the extraction, judgment, integration and screening of Step 1 to Step 4 are completed, the obtained glare regions are divided according to their projection area into general glare source regions (projection area of 0.005 m² or more) and small glare source regions (projection area of less than 0.005 m²), and the relevant parameters are collected for each as follows:
For the luminance L_{i'} of a general glare source region, the average luminance of all pixel points in the region is taken as the luminance value of the region; the expression is as follows:
$$L_{i'} = \sum_{k=1}^{K} L_k / K \qquad (2)$$
wherein: l iskThe luminance (cd/m) of the k-th pixel in the general glare source region2) (ii) a K is the number of pixel points in the general glare source area.
For the solid angle ω of the general glare source region, the following formula can be used to calculate:
$$\omega = \frac{A_p}{r^2} \qquad (3)$$
wherein: a. thepThe geometric center of the general glare source region andprojected area (m) on normal plane of train driver eyepoint connecting line2) (ii) a r is the distance (m) from the geometric center of the general glare source area to the eyepoint of the train driver.
For the light intensity I of the small glare source region, according to the inverse square law, the light intensity and the illumination intensity have the following relationship:
$$I = E \times r^2 \qquad (4)$$
wherein I is the luminous intensity (cd) of a reflecting surface or small glare source region on the instrument panel in the train driver's sight direction; E is the illuminance (lx) produced by the small glare source region at the eye; and r is the distance (m) from the geometric center of the small glare source region to the eye. The inverse square law strictly applies to point sources, but it holds whenever the distance between the calculation point and the source exceeds 4 times the source diameter, so the illuminance produced by the source at the illuminated surface (the driver's eye point) is collected and converted into luminous intensity.
For the distance r from the geometric center of a small glare source region to the eye point, the coordinates of the geometric center of the light source are calculated, and r is obtained with the GetDepth(X, Y) method in SPEOS CAA V5Based V16.1.1; the expression is as follows:
$$r = \sqrt{(X_q - X_e)^2 + (Y_q - Y_e)^2 + (Z_q - Z_e)^2} \qquad (5)$$
wherein (X_q, Y_q, Z_q) are the coordinates of the geometric center of the q-th small glare source region and (X_e, Y_e, Z_e) are the coordinates of the driver's eye point.
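As an illustration only, a minimal sketch of equations (4) and (5) for a single small glare source region, assuming the geometric-center coordinates, the eye-point coordinates and the illuminance E at the eye have already been read from the simulation output:

```python
import math

def small_region_parameters(center_xyz, eye_xyz, illuminance_at_eye_lx):
    """Compute r (eq. 5) as the Euclidean distance from the region's geometric
    center to the driver's eye point, and the luminous intensity toward the eye
    (eq. 4) from the inverse square law I = E * r^2."""
    r = math.dist(center_xyz, eye_xyz)
    intensity_cd = illuminance_at_eye_lx * r ** 2
    return intensity_cd, r
```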
For the position index P of the geometric center of a general glare source region or a small glare source region relative to the driver's eye point, the position index based on the research of Luckiesh and Guth is adopted. The position index can be expressed in various forms, such as a position index table or a position index expression with the angle between the eye and the light source as the variable. This embodiment adopts a position index expression that is convenient for computer calculation:
$$\frac{1}{P} = \frac{d^2\,\mathrm{EXP}}{d^2 + 1.5d + 4.6} + 0.12\,(1 - \mathrm{EXP}) \qquad (6)$$
wherein: p is a position index of the geometric center of the glare source region;d ═ Y/Z |; s ═ X/Z |; x, Y and Z are distances from the geometric center of the glare source region to three coordinate axes, as shown in FIG. 2.
The specific process of Step6 is as follows:
The parameter values collected in the above processes and calculated by formulas (1) to (6) are input into the train driving interface illumination discomfort glare evaluation model, which is established in this embodiment on the basis of the UGR model and the small-source UGR correction model and takes the glare distribution characteristics of the train driving interface into account, to obtain the UGR value of the driving interface illumination discomfort glare. The model formula is as follows:
$$UGR = 8\log\frac{0.25}{L_b}\left(\sum_{h=1}^{H}\frac{L_h^2\,\omega_h}{P_h^2}+\sum_{q=1}^{Q}\frac{200\,I_q^2}{r_q^2\,P_q^2}\right) \qquad (7)$$
wherein: h is the total number of general glare source regions; q is the total number of the small glare source area; l ishLuminance value (cd/m2) of h-th general glare source region; l isbIs the background luminance value (cd/m2) of the image; omegahA solid angle (sr) being the h-th general glare source region; phIs the position index of the geometric center of the h-th general glare source area; i isqThe light intensity (cd) of the qth small glare source region in the direction of the eyepoint of the train driver, rqDistance (m) from eyes for the qth small-glare source region; pqIs an index of the position of the geometric center of the qth small-glare source region.
The calculated UGR value is then compared with the UGR versus subjective feeling relation table, which completes the evaluation of the lighting glare of the train driving interface.
It should be understood that the above embodiments of the present invention are only examples given to illustrate the invention clearly and are not intended to limit its embodiments. Other variations or modifications may be made by those skilled in the art on the basis of the above description; not all embodiments can be listed exhaustively here, and all obvious variations or modifications derived from the technical scheme of the invention fall within the scope of the invention.

Claims (4)

1. A discomfort glare evaluation method for train driving interface illumination is characterized by comprising the following steps:
s1, establishing a train driving interface three-dimensional model, and obtaining a driver visual simulation image and coordinate values, brightness values and illumination value parameters of each pixel point in the image through an optical simulation tool;
s2, judging glare source pixel points from the pixel points according to the set judgment condition, and calculating the average value of the brightness values of the non-glare source pixel points in the image to be used as a background brightness value;
s3, integrating adjacent glare source pixel points in each glare source pixel point into one glare source area to obtain a plurality of glare source areas;
s5, dividing each glare source region into a general glare source region and a small glare source region according to the area size, collecting the brightness value, the solid angle and the position index of a geometric center of each general glare source region, and collecting the luminous intensity, the distance from the geometric center to the eye point of a train driver and the position index of the geometric center of each small glare source region;
s6, calculating a unified glare value UGR of the image according to the train driving interface illumination discomfort glare evaluation model, and evaluating discomfort glare of train driving interface illumination according to the unified glare value, wherein the model formula of the train driving interface illumination discomfort glare evaluation model is as follows:
$$UGR = 8\log\frac{0.25}{L_b}\left(\sum_{h=1}^{H}\frac{L_h^2\,\omega_h}{P_h^2}+\sum_{q=1}^{Q}\frac{200\,I_q^2}{r_q^2\,P_q^2}\right)$$
In the formula, H is the total number of general glare source regions; Q is the total number of small glare source regions; L_h is the luminance (cd/m²) of the h-th general glare source region; L_b is the background luminance (cd/m²) of the image; ω_h is the solid angle (sr) of the h-th general glare source region; P_h is the position index of the geometric center of the h-th general glare source region; I_q is the luminous intensity (cd) of the q-th small glare source region in the train driver's sight direction; r_q is the distance (m) from the q-th small glare source region to the eyes; and P_q is the position index of the geometric center of the q-th small glare source region.
2. The method for evaluating discomfort glare from train driving interface lighting of claim 1, further comprising the following steps after step S3 and before step S5:
s4, screening out the glare source region formed by no more than 2-10 glare source pixel points.
3. The discomfort glare evaluation method for train driving interface lighting according to claim 1, wherein the judgment condition for glare source pixel points in step S2 is: pixel points whose luminance exceeds 4 times the average luminance of all pixel points in the image are taken as glare source pixel points.
4. The discomfort glare evaluation method for train driving interface lighting according to claim 1, wherein the method of dividing the glare source regions into general glare source regions and small glare source regions by area in step S5 is as follows: glare source regions with a projection area of 0.005 m² or more are taken as general glare source regions, and glare source regions with a projection area of less than 0.005 m² are taken as small glare source regions.
CN201510317586.9A 2015-06-11 2015-06-11 Evaluation method for discomfort glare of train driving interface illumination Pending CN104933241A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510317586.9A CN104933241A (en) 2015-06-11 2015-06-11 Evaluation method for discomfort glare of train driving interface illumination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510317586.9A CN104933241A (en) 2015-06-11 2015-06-11 Evaluation method for discomfort glare of train driving interface illumination

Publications (1)

Publication Number Publication Date
CN104933241A true CN104933241A (en) 2015-09-23

Family

ID=54120407

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510317586.9A Pending CN104933241A (en) 2015-06-11 2015-06-11 Evaluation method for discomfort glare of train driving interface illumination

Country Status (1)

Country Link
CN (1) CN104933241A (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102151120A (en) * 2011-02-24 2011-08-17 复旦大学 System for dynamically estimating glare
CN102967442A (en) * 2011-08-31 2013-03-13 株式会社东芝 Method for evaluating discomfort glare and discomfort glare evaluation program

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JAN WIENOLD 等: "Evaluation methods and development of a new glare prediction model for daylight environments with the use of CCD cameras", 《ENERGY AND BUILDINGS》 *
张炜 et al.: "Quantitative evaluation method for aircraft cockpit glare based on SPEOS/CATIA", 《系统工程理论与实践》 (Systems Engineering - Theory & Practice) *
杨公侠: "UGR: the Unified Glare Rating System (Part 1)", 《光源与照明》 *

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105042399A (en) * 2015-07-30 2015-11-11 苏州玄禾物联网科技有限公司 Image-analysis-based desk lamp brightness self-adaptive adjustment method
CN105758624A (en) * 2016-04-12 2016-07-13 上海科涅迩光电技术有限公司 Glare testing method and system
CN106897535A (en) * 2017-03-09 2017-06-27 同济大学 A kind of high ferro illuminates the evaluation method to closing on Parallel Freeways glare effects
CN106897535B (en) * 2017-03-09 2020-06-02 同济大学 Method for evaluating influence of high-speed rail illumination on glare of adjacent parallel highway
CN109131076A (en) * 2017-06-27 2019-01-04 上海蔚兰动力科技有限公司 Driving assistance system and driving assistance method
CN107977509A (en) * 2017-11-29 2018-05-01 中国直升机设计研究所 A kind of night helicopter cockpit dazzle quantitative estimation method
CN110487559A (en) * 2018-05-15 2019-11-22 上汽通用汽车有限公司 In-vehicle reflection and the measuring device dazzle the eyes and measurement method
CN108990200A (en) * 2018-06-06 2018-12-11 西南交通大学 A kind of bullet train illumination non-comfort glare simulation system control method
CN109283855A (en) * 2018-11-05 2019-01-29 哈尔滨工业大学 A kind of architecture indoor dazzle immersion emulation mode based on local sky model
CN110135235A (en) * 2019-03-13 2019-08-16 北京车和家信息技术有限公司 A kind of dazzle processing method, device and vehicle
CN110135235B (en) * 2019-03-13 2022-04-19 北京车和家信息技术有限公司 Glare processing method and device and vehicle
CN111141497A (en) * 2020-01-26 2020-05-12 昆山适途模型科技有限公司 Optical verification equipment with simulated sunlight and verification method
CN112539919A (en) * 2020-11-03 2021-03-23 神龙汽车有限公司 Night-vision illumination effect evaluation method for car lamp

Similar Documents

Publication Publication Date Title
CN104933241A (en) Evaluation method for discomfort glare of train driving interface illumination
Kim et al. Real-time daylight glare control using a low-cost, window-mounted HDRI sensor
KR102188282B1 (en) Visual function inspection and optical property calculation system
CN104870967A (en) Method for checking the compliance of an optical characteristic of an ophthalmic lens and associated device
CN105787989A (en) Measurement texture geometric feature reconstruction method based on photometric stereo
JP7100500B2 (en) Visual function test and optical characteristic calculation system
US20030076479A1 (en) Method for evaluating binocular performance of spectacle lenses, method for displaying binocular performance, and apparatus therefore
CN108401318A (en) Intelligent lighting system and method based on object surface three-dimensional morphology analysis
CN110446967A (en) For determining the computer implemented method of the expression at spectacle frame edge or the expression at eyeglass edge
EP4264556A1 (en) Method for assessing the physically based simulation quality of a glazed object
CN112839414A (en) Lighting device and control method thereof
Budak et al. Evaluation of illumination quality based on spatial-angular luminance distribution
CN113125127A (en) Optical scene simulation method and device based on human eye vision
CN103969029A (en) Digital-camera-based simple glaring testing method
Kim et al. The scope of the glare light source of the window with non-uniform luminance distribution
Meseth et al. Verification of rendering quality from measured BTFs
CN106935182B (en) The brightness adjusting method and device of display screen
CN105223763B (en) Brightness automatic equalization device and the method in 3D projection systems is reclaimed for light
CN109143758A (en) The technology of automatic enhancing optical projection effect
CN115331614A (en) Brightness debugging method and system for LED display screen
CN115688426A (en) Perceptual brightness characterization method for flat panel display device display
CN116306319B (en) Cultural relic illumination glare quantitative evaluation method and system based on genetic algorithm
CN109283855B (en) Building indoor glare immersive simulation method based on local sky model
Liu et al. How bright should a virtual object be to appear opaque in optical see-through AR?
CN114255641B (en) Manufacturing method and system of simulated light source in virtual machine vision system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20150923)