
CN114257732A - Detection method of shooting equipment and related device - Google Patents

Detection method of shooting equipment and related device

Info

Publication number
CN114257732A
Authority
CN
China
Prior art keywords
target
image
parameter
shooting
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110016897.7A
Other languages
Chinese (zh)
Inventor
徐天宇
闫冬升
李纪楷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to PCT/CN2021/093015 (published as WO2022062421A1)
Publication of CN114257732A
Legal status: Pending

Classifications

    • H  ELECTRICITY
    • H04  ELECTRIC COMMUNICATION TECHNIQUE
    • H04N  PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00  Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60  Control of cameras or camera modules
    • H04N23/64  Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H04N23/67  Focus control based on electronic image sensor signals

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The method determines whether the target captured by the shooting device meets the requirements by comparing the effect parameter of the target with a preset range, determines how the relative position of the shooting device should be adjusted according to the result of that comparison, and provides corresponding suggestions, so that an inexperienced installer can adjust the device following the suggested adjustment mode, or the adjustment mode can be applied directly for automatic adjustment.

Description

Detection method of shooting equipment and related device
The present application claims priority to the Chinese patent application entitled "A method for determining camera tuning", filed with the Chinese Patent Office on September 22, 2020 under application number 202011003383.X, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a detection method for a shooting device and a related apparatus.
Background
With the development of technology, traditional cameras are gradually being replaced by intelligent cameras. Intelligent cameras are no longer limited to the original video recording and basic alarm functions, and have evolved intelligent capabilities such as face detection, face recognition, vehicle detection, and license plate recognition.
Face recognition and license plate recognition on an intelligent camera require clear and accurate snapshots, which places strict requirements on how the camera is erected. However, intelligent cameras are mostly installed by personnel without a technical research-and-development background, and if an inexperienced installer mounts and adjusts the camera, the final imaging effect is generally poor. A detection method is therefore needed to check whether the snapshots taken by the intelligent camera meet the requirements and to provide relevant suggestions that help inexperienced installers adjust the installation.
Disclosure of Invention
The present application provides a detection method for a shooting device and a related apparatus, which can examine the pictures taken by an intelligent camera and provide relevant suggestions.
In a first aspect, the present application provides a detection method for a shooting device and a related apparatus. The method can acquire an image taken by the shooting device and detect a target in the image; determine effect parameters of the target, the effect parameters mainly comprising the relative position of the target with respect to the shooting device; and, if an effect parameter is not within a preset range, determine an adjustment mode for the relative position of the shooting device according to the comparison result between the effect parameter and the preset range. By comparing the effect parameter of the target with the preset range, the method determines whether the target captured by the shooting device meets the requirements, derives an adjustment mode for the relative position of the shooting device from the comparison result, and provides corresponding suggestions, so that an inexperienced installer can adjust the device following the adjustment mode, or the adjustment mode can be applied directly for automatic adjustment.
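The method described above amounts to a compare-and-suggest loop over the effect parameters. The following Python sketch is only an illustration of that loop with a stand-in detector; the parameter names, preset ranges, and helper functions are assumptions, not the patented implementation.

```python
# Illustrative sketch of the detection flow; detector and thresholds are stand-ins.
def detect_target(image):
    """Stand-in for a face or license-plate detector: returns a bounding box or None."""
    return {"x": 120, "y": 340, "w": 48, "h": 48}   # fixed box, for illustration only

def effect_parameters(box):
    """Stand-in for the effect-parameter computation (relative position, size, ...)."""
    return {"center_x": box["x"] + box["w"] / 2,
            "center_y": box["y"] + box["h"] / 2,
            "resolution_px": box["w"]}

def run_detection(image, preset_ranges):
    box = detect_target(image)
    if box is None:
        return ["no target detected; check the field of view and capture again"]
    params = effect_parameters(box)
    # any parameter outside its preset range yields one adjustment suggestion
    return [f"{name} = {value} is outside {preset_ranges[name]}"
            for name, value in params.items()
            if not preset_ranges[name][0] <= value <= preset_ranges[name][1]]

ranges = {"center_x": (800, 1120), "center_y": (400, 680), "resolution_px": (80, 400)}
print(run_detection(image=None, preset_ranges=ranges))
```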
In combination with the first aspect, in one implementation form of the present application, the relative position of the object with respect to the photographing apparatus includes an angle parameter of the object with respect to the photographing apparatus, the angle parameter including a pitch angle and/or a yaw angle and/or a roll angle. In this implementation, the pitch angle and/or yaw angle and/or roll angle can be used to accurately determine the orientation of the target relative to the capture device, to determine an appropriate adjustment, to perform an automatic adjustment, or to provide an adjustment recommendation.
With reference to the first aspect, in an implementation manner of the present application, the determining, according to a comparison result between an effect parameter of a target and a preset range, an adjustment manner of a relative position of a shooting device includes: if the pitch angle in the angle parameters is not within the preset range, determining one of the adjustment modes as upward or downward rotation of the shooting equipment; if the yaw angle in the angle parameter is not within the preset range, determining one of the adjusting modes as rotating the shooting equipment leftwards or rightwards; and if the roll angle in the angle parameters is not in the preset range, determining one of the adjusting modes as clockwise or anticlockwise rotating the shooting equipment. In this implementation, the pitch angle and/or yaw angle and/or roll angle can be used to accurately determine the orientation of the target relative to the capture device, to determine an appropriate adjustment, to perform an automatic adjustment, or to provide an adjustment recommendation.
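As a concrete illustration of this mapping from out-of-range angles to rotation directions, the sketch below uses assumed angle ranges and an assumed sign convention; both would depend on the actual device.

```python
# Illustrative only: the ranges and the sign convention for each angle are assumptions.
def angle_adjustments(pitch, yaw, roll,
                      pitch_range=(-15.0, 15.0),
                      yaw_range=(-20.0, 20.0),
                      roll_range=(-5.0, 5.0)):
    hints = []
    if not pitch_range[0] <= pitch <= pitch_range[1]:
        hints.append("rotate the shooting device downward" if pitch > pitch_range[1]
                     else "rotate the shooting device upward")
    if not yaw_range[0] <= yaw <= yaw_range[1]:
        hints.append("rotate the shooting device to the left" if yaw > yaw_range[1]
                     else "rotate the shooting device to the right")
    if not roll_range[0] <= roll <= roll_range[1]:
        hints.append("rotate the shooting device counterclockwise" if roll > roll_range[1]
                     else "rotate the shooting device clockwise")
    return hints

print(angle_adjustments(pitch=22.0, yaw=0.0, roll=-9.0))
# ['rotate the shooting device downward', 'rotate the shooting device clockwise']
```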
With reference to the first aspect, in one implementation manner of the present application, a relative position of the target with respect to the shooting device includes a position parameter of the target in the image, and the position parameter is a coordinate of the target in the image. In the implementation mode, the position of the target relative to the shooting device can be accurately determined by adopting the coordinates of the target in the image, so that a proper adjusting mode is determined, and automatic adjustment is executed or adjustment suggestions are provided.
With reference to the first aspect, in an implementation manner of the present application, the determining, according to a comparison result between an effect parameter of a target and a preset range, an adjustment manner of a relative position of a shooting device includes: if the abscissa parameter in the position parameters is not within the preset range, determining one of the adjustment modes as translating the shooting device transversely; and if the ordinate parameter in the position parameters is not within the preset range, determining one of the adjustment modes as translating the shooting device longitudinally. In this implementation, the coordinates of the target in the image can be used to accurately determine the position of the target relative to the shooting device, so that an appropriate adjustment mode is determined and automatic adjustment is performed or an adjustment suggestion is provided.
With reference to the first aspect, in an implementation manner of the present application, the effect parameter includes a resolution of the target, and the determining, according to a comparison result between the effect parameter of the target and the preset range, an adjustment manner of the relative position of the shooting device includes: if the resolution of the target in the image is smaller than a first preset threshold, determining one of the adjustment modes as reducing the distance between the shooting device and its subject or increasing the focal length of the shooting device; if the resolution of the target in the image is larger than the first preset threshold, determining one of the adjustment modes as increasing the distance between the shooting device and its subject or reducing the focal length of the shooting device. In this implementation, the resolution of the target in the image accurately reflects the size of the area the target occupies in the image, which is related to the shooting distance between the target and the shooting device, so that an appropriate adjustment mode can be determined accordingly and automatic adjustment can be performed or an adjustment suggestion can be provided.
With reference to the first aspect, in an implementation manner of the present application, the effect parameter includes a definition parameter of the target, and the determining, according to a comparison result between the effect parameter of the target and the preset range, an adjustment manner of the relative position of the shooting device includes: if the definition parameter of the target is not within the preset range, determining one of the adjustment modes as adjusting the focal length of the shooting device. In this implementation, the definition parameter accurately reflects how sharp the target is in the image, which is related to whether the shooting device is accurately focused on the target, so that an appropriate adjustment mode can be determined from the focus state and automatic adjustment can be performed or an adjustment suggestion can be provided.
With reference to the first aspect, in an implementation manner of the present application, after determining the adjustment mode of the relative position of the shooting device according to the comparison result of the effect parameter of the target and the preset range, the method further includes: adjusting the shooting device according to the adjustment mode and/or displaying the adjustment mode. In this implementation, the adjustment mode may be displayed to the user or used as the basis for automatic adjustment.
With reference to the first aspect, in an implementation manner of the present application, after determining the effect parameter of the target, before determining an adjustment manner of the relative position of the shooting device according to a comparison result between the effect parameter of the target and the preset range, the method further includes: acquiring parameter information of shooting equipment; and synthesizing the parameter information of the shooting equipment, the effect parameters of the target and the image into a display image and displaying the display image. In the implementation mode, the current information can be clearly displayed through one display image, so that an installer can acquire key information when observing the display image, and the shooting equipment is adjusted more easily.
In a second aspect, the present application provides a detection apparatus for a shooting device, including: an acquisition module, configured to acquire an image; a processing module, configured to detect a target in the image; the processing module is further configured to determine effect parameters of the target, the effect parameters including the relative position of the target with respect to the shooting device; and if an effect parameter of the target is not within the preset range, the processing module is further configured to determine an adjustment mode of the relative position of the shooting device according to a comparison result of the effect parameter of the target and the preset range.
In combination with the second aspect, in one implementation of the present application, the relative position of the target with respect to the photographing apparatus includes angular parameters of the target with respect to the photographing apparatus, the angular parameters including a pitch angle and/or a yaw angle and/or a roll angle.
With reference to the second aspect, in an implementation manner of the present application, the processing module is further configured to: if the pitch angle in the angle parameters is not within the preset range, determining one of the adjustment modes as upward or downward rotation of the shooting equipment; if the yaw angle in the angle parameter is not within the preset range, determining one of the adjusting modes as rotating the shooting equipment leftwards or rightwards; and if the roll angle in the angle parameters is not in the preset range, determining one of the adjusting modes as clockwise or anticlockwise rotating the shooting equipment.
With reference to the second aspect, in one implementation manner of the present application, the relative position of the target with respect to the shooting device includes a position parameter of the target in the image, and the position parameter is the coordinates of the center of the target in the image.
With reference to the second aspect, in an implementation manner of the present application, the processing module is further configured to: if the abscissa parameter in the position parameters is not within the preset range, determine one of the adjustment modes as translating the shooting device transversely; and if the ordinate parameter in the position parameters is not within the preset range, determine one of the adjustment modes as translating the shooting device longitudinally.
With reference to the second aspect, in an implementation manner of the present application, the effect parameter includes a resolution of the target, and the processing module is further configured to: if the resolution of the target in the image is smaller than a first preset threshold, determine one of the adjustment modes as reducing the distance between the shooting device and its subject or increasing the focal length of the shooting device; and if the resolution of the target in the image is larger than the first preset threshold, determine one of the adjustment modes as increasing the distance between the shooting device and its subject or reducing the focal length of the shooting device.
With reference to the second aspect, in an implementation manner of the present application, the effect parameter includes a definition parameter of the target, and the processing module is further configured to: and if the definition parameter of the target is not in the preset range, determining one of the adjusting modes as adjusting the focal length of the shooting equipment.
With reference to the second aspect, in an implementation manner of the present application, the processing module is further configured to: adjust the shooting device according to the adjustment mode and/or display the adjustment mode.
With reference to the second aspect, in an implementation manner of the present application, the obtaining module is further configured to obtain parameter information of the shooting device; a processing module further configured to: and synthesizing the parameter information of the shooting equipment, the effect parameters of the target and the image into a display image and displaying the display image.
In a third aspect, the present application provides an image pickup apparatus comprising a lens, a sensor, and a processor, wherein: the lens is used for receiving light, the sensor is used for performing photoelectric conversion on the light received by the lens to generate an image, and the processor is configured to implement the method of the first aspect.
Drawings
FIG. 1 is a block diagram of an embodiment of the present application;
fig. 2 is a flowchart of a detection method of a shooting device according to an embodiment of the present disclosure;
FIG. 3 is an interface diagram of a client according to an embodiment of the present disclosure;
FIG. 4 is a diagram of a diagnostic result feedback interface provided by an embodiment of the present application;
FIG. 5 is an exemplary diagram of a presentation image provided by an embodiment of the present application;
FIG. 6 is a diagram illustrating an effect of the present application after adjustment;
FIG. 7 is a schematic flow chart diagram of an embodiment of the present application;
FIG. 8 is a signaling diagram of an embodiment of the present application;
fig. 9 is a schematic diagram of a detection device of a shooting device according to an embodiment of the present disclosure;
fig. 10 is a schematic diagram of an image capturing apparatus according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide a detection method for a shooting device and a related apparatus, which can examine the pictures taken by an intelligent camera and either automatically adjust the camera or provide relevant suggestions.
The terms "first," "second," "third," "fourth," and the like in the description and in the claims of the present application and in the drawings described above, if any, are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "corresponding" and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
With the development of technology, traditional cameras are gradually being replaced by intelligent cameras, which are no longer limited to the original video recording and basic alarm functions and have evolved intelligent capabilities such as face detection, face recognition, vehicle detection, and license plate recognition. Face recognition and license plate recognition require clear and accurate snapshots, which places strict requirements on how the camera is erected and installed. Most cameras are currently installed by personnel without a technical research-and-development background, and the experience of installers varies widely, so the final imaging effect differs greatly from camera to camera. Although cameras ship with installation guidance such as mounting height and mounting angle, the on-site environment is often very complex and the recommended erection conditions cannot always be met, resulting in a low face recognition rate or license plate recognition rate.
In view of this, embodiments of the present application provide a method and a related apparatus for detecting a shooting device, which can detect a picture shot by an intelligent camera, automatically adjust the camera, or provide related suggestions. For clarity and conciseness of the following descriptions of the embodiments, first, an architectural diagram of an embodiment of the present application is given:
Fig. 1 is a schematic architecture diagram according to an embodiment of the present application. The architecture includes a shooting device and a client/server. The shooting device may be any device capable of capturing images, such as a fixed camera, a pan-tilt camera, or a monitoring camera, which is not limited in the embodiments of the present application. The client may be a client installed on a terminal device, and the shooting device may communicate with the terminal device through a wired or wireless connection. The terminal device may be a mobile phone, a computer, a tablet, or the like, and the client may be an application (APP) installed on the mobile phone or tablet, or a web client accessed through a webpage, which is not limited in this embodiment of the present application. When the detection method of the shooting device provided by the embodiment of the application is implemented, one implementation is as follows: the user clicks a trigger button on the client, causing the terminal device to send a trigger signal to the shooting device. The shooting device then acquires an image, executes the detection method provided in the embodiment of the present application, and transmits the resulting adjustment mode to the client through the corresponding interface so that the client can present it. The user can manually adjust the shooting device according to the presented adjustment mode, or click the automatic adjustment button so that the terminal device sends an automatic adjustment instruction to the shooting device, which then adjusts itself automatically. For example, a pan-tilt camera may rotate up, down, left, or right according to the automatic adjustment instruction. In another implementation, the user clicks a trigger button on the client, causing the terminal device to send a trigger signal to the shooting device. The shooting device then sends the acquired image to the client, the client executes the detection method of the shooting device provided by the embodiment of the application, and the resulting adjustment mode is presented to the user. The subsequent situation is similar to the previous implementation and is not described here again.
In other embodiments, the capture device may communicate with the server via a wired or wireless connection, the server may be connected to a plurality of terminal devices, and the plurality of terminal devices may access the server to obtain information about the capture device. When the detection method of the shooting device provided by the embodiment of the application is realized, one of the realization modes is as follows: the shooting device can execute the detection method of the shooting device provided by the embodiment of the application after acquiring the image, the obtained adjustment mode is transmitted to the server through the corresponding interface, and the server provides the corresponding interface, so that a user can inquire the adjustment mode through the terminal device. Then, the user can manually adjust the shooting device according to the inquired adjustment mode, and can also send an automatic adjustment instruction to the shooting device through the terminal device and the server, so that the shooting device can be automatically adjusted. In another implementation manner, the server may receive image data of the shooting device, and then execute the detection method of the shooting device provided in the embodiment of the present application to obtain a related adjustment manner, where other cases are similar to the previous implementation manner, and are not described here again.
To sum up, the detection method for the shooting device provided in the embodiment of the present application may be executed by the shooting device, the client, or the server, and the embodiment of the present application is described with the shooting device as an example, and the execution situations of other devices are similar, which are not described herein again.
Fig. 2 is a flowchart of a detection method of a shooting device according to an embodiment of the present application. The process comprises the following steps:
201. Acquiring an image shot by the shooting device;
In this embodiment of the application, after an installer has preliminarily mounted the shooting device and connected the power, the shooting device can capture images. In practical applications, the shooting device may convert the data format of the captured image into a format suitable for subsequent processing. Illustratively, the shooting device may convert the captured image into the YUV data format. In practice, because shooting devices adopt different systems, algorithms, programming languages, and so on, other suitable image data formats may be used; the embodiment of the present application does not limit the data format of the image.
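One common way to obtain the YUV format mentioned above is a BT.601-style conversion from RGB; the coefficients below are the usual full-range (JPEG) ones and are shown only as an example of this preprocessing step, not as the conversion the device necessarily uses.

```python
# Example RGB -> YUV conversion (full-range BT.601 coefficients), illustration only.
import numpy as np

def rgb_to_yuv(rgb):
    """rgb: H x W x 3 array with values in [0, 255]; returns stacked Y, U, V planes."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.169 * r - 0.331 * g + 0.500 * b + 128.0
    v = 0.500 * r - 0.419 * g - 0.081 * b + 128.0
    return np.stack([y, u, v], axis=-1)

frame = np.random.randint(0, 256, size=(4, 4, 3)).astype(np.float64)
print(rgb_to_yuv(frame).shape)   # (4, 4, 3)
```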
Fig. 3 is an operation interface diagram of a client in the embodiment of the present application. In this interface, the upper right side is the detection control interface. The type of image detection can be selected in the first column, that is, whether the shooting device is mainly used to detect a human face or a license plate. In practical applications, the shooting device may also be used to detect other types of targets, and corresponding buttons may be added to this column. It can be understood that when the user selects one of the buttons in the detection control interface, the shooting device selects the target detection algorithm corresponding to the selected button to detect the target in step 202. Therefore, the embodiment of the application can perform snapshot analysis on faces and license plates separately, and can also perform snapshot analysis on other types of targets as needed, which greatly expands the application scenarios of the present application. The second column is the detection area list. The user may select "polygon area drawing", "rectangle area drawing", "full screen drawing", or "clear drawing". Clicking "polygon area drawing" adds a preset polygonal area to the detection area, clicking "rectangle area drawing" adds a preset rectangular area to the detection area, clicking "full screen drawing" sets the detection area to the full screen, and clicking "clear drawing" cancels all of these settings. The third column contains the "manual snapshot" button and the "turn on continuous tuning" button. Clicking the "manual snapshot" button triggers the snapshot function of the shooting device: the shooting device executes step 201 to capture an image and then executes the subsequent steps to obtain the relevant adjustment mode, which is presented in the diagnostic result feedback interface at the lower right. Clicking the "turn on continuous tuning" button lets the user select the time interval between captures; the example in fig. 3 uses continuous tuning at 300-second intervals. After this button is clicked, the shooting device executes step 201 to capture an image every 300 seconds and executes the subsequent steps to obtain the relevant adjustment mode, which is presented in the diagnostic result feedback interface at the lower right.
In the interface diagram shown in fig. 3, the lower right side is a diagnosis result feedback interface for displaying the detected information and the related adjustment manner. In fig. 3, the user temporarily does not click the "manual snapshot" button or the "turn on continuous tuning" button, and therefore the diagnostic result feedback interface temporarily does not present the diagnostic result. In the interface diagram shown in fig. 3, the left side is an image photographed by the photographing apparatus and an adjustment button, a mode option, and a speed option of the photographing apparatus. The number of the adjusting buttons is not limited in the embodiment of the application. The adjusting button can be an up-down rotating button, a left-right rotating button, a focal length adjusting button and the like of the shooting device, and the adjusting button is not limited in the embodiment of the application. The mode options include a continuous mode, a pause mode, and the like for controlling whether the photographing apparatus performs continuous photographing or pauses photographing. If the user selects the continuous mode, the user can continue to select the speed of continuous shooting on the interface. In practical application, other buttons may be further provided on the operation interface diagram, which is not limited in this embodiment of the application.
202. Detecting an object in the image according to the image;
in the embodiment of the application, the shooting device can detect the target in the image through a target detection algorithm. The target may be a human face, a license plate, or other objects, or may also be a pattern, a number, or the like, and the specific type of the target is not limited in the embodiments of the present application.
The shooting equipment can be configured with different target detection algorithms to detect the target in the image according to different types of targets. For example, if a face in an image needs to be detected, the shooting device may detect the face in the image through a face detection algorithm. If the license plate in the image needs to be detected, the shooting equipment can detect the license plate in the image through a license plate recognition algorithm. The target detection algorithms such as the face detection algorithm, the license plate recognition algorithm and the like can be realized by adopting a neural network model, and the embodiment of the application does not limit which target detection algorithm is adopted.
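A minimal way to express the per-type dispatch described above is a lookup from target type to detector; the detector functions below are empty placeholders standing in for whatever face-detection or license-plate models the device actually uses.

```python
# Placeholder detectors standing in for the device's actual detection models.
def detect_faces(image):
    return []   # would run a face-detection network and return bounding boxes

def detect_plates(image):
    return []   # would run a license-plate recognition network

DETECTORS = {"face": detect_faces, "license_plate": detect_plates}

def detect(image, target_type):
    if target_type not in DETECTORS:
        raise ValueError(f"unsupported target type: {target_type}")
    return DETECTORS[target_type](image)

print(detect(image=None, target_type="face"))   # []
```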
Fig. 4 is a diagnostic result feedback interface diagram provided in the embodiment of the present application. The left side of the figure is the image captured by the capture device. And, after the photographing apparatus performs step 202, the detected object is circled in the image by a rectangular frame area. It will be appreciated that the capture device may detect multiple objects in the image, and the capture device may frame those objects. In the subsequent step 204, the shooting device may randomly select one of the targets to determine the adjustment mode, or may select the target with the largest rectangular frame selection area to determine the adjustment mode, which is not limited in the embodiment of the present application.
203. Determining an effect parameter of the target;
in an embodiment of the application, the effect parameter of the object may comprise a relative position of the object with respect to the photographing apparatus. The relative position includes parameters such as a direction and a position of the target relative to the shooting device, which are not limited in the embodiment of the present application.
Illustratively, the effect parameter of the target is an angle parameter of the target relative to the shooting device, such as Euler angles, including a pitch angle and/or a yaw angle and/or a roll angle. The shooting device may determine the Euler angles of the target with respect to the shooting device through an angle detection algorithm. The embodiment of the application does not limit which angle detection algorithm is adopted.
It is understood that when the photographing apparatus detects a plurality of objects from the image, one of the objects may be selected to calculate the effect parameters, and the adjustment manner is determined according to the effect parameters of the object in step 204, or the effect parameters of all the objects may be calculated, and then one of the objects is selected according to the effect parameters to perform step 204. In practical applications, the shooting device may also select the target through other selection manners, which is not limited in the embodiment of the present application.
In some embodiments, the effect parameters of the target further comprise a position parameter of the target in the image, which may be the coordinates of (the center of) the target in the image, including an abscissa parameter and an ordinate parameter. The shooting device may establish a rectangular coordinate system with the lower-left corner of the image as the origin to determine the position parameter of the target in the image. In practical applications, the shooting device may also determine the position parameter of the target in the image in other ways, which is not limited in this embodiment of the application.
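The sketch below shows one way to obtain such a position parameter, assuming the detector returns a bounding box in the usual top-left-origin convention and converting its center into the lower-left-origin coordinate system described above; the box format is an assumption.

```python
# Convert a detector bounding box (top-left origin, y downward - an assumed convention)
# into the center coordinates of the target with the image's lower-left corner as origin.
def target_center_lower_left(box, image_height):
    x, y, w, h = box                               # top-left corner plus width and height
    center_x = x + w / 2.0
    center_y_top_down = y + h / 2.0
    return center_x, image_height - center_y_top_down   # flip the vertical axis

print(target_center_lower_left(box=(100, 80, 50, 60), image_height=1080))   # (125.0, 970.0)
```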
In some embodiments, the effect parameter of the target further comprises a resolution of the target. The resolution of the target may refer to the number of pixels in the rectangular area corresponding to the target; for example, a resolution of 25 px means 25 pixels in each of the horizontal and vertical directions, i.e. 625 pixels in total. Within the same image the pixels are generally the same size, so the larger the resolution of the target, the larger the rectangular area corresponding to the target. Whether the size of the rectangular area corresponding to the target meets the requirement can therefore be determined from the resolution of the target.
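The arithmetic in the example above is simply the area of the target's rectangular region in pixels, as this small snippet shows.

```python
# Pixel count of the target's rectangular region: width times height.
def target_pixel_count(width_px, height_px):
    return width_px * height_px

print(target_pixel_count(25, 25))     # 625, the 25 px example above
print(target_pixel_count(100, 100))   # 10000, e.g. the [100px : 100px] target in Fig. 4
```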
In some embodiments, the effect parameter of the target comprises a sharpness parameter of the target. The sharpness parameter of the object may refer to a parameter related to sharpness such as a bit rate. The capture device may read these parameters from the image.
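The description leaves the exact definition measure open; the variance of the Laplacian over the target crop is one commonly used sharpness proxy and is shown here purely as an assumed example, not as the measure the device uses.

```python
# Variance of the Laplacian over the target crop: a common (assumed) sharpness proxy.
import numpy as np

def laplacian_variance(gray):
    """gray: H x W array; larger values generally mean a sharper, better-focused crop."""
    lap = (-4.0 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

crop = np.random.rand(64, 64)   # stand-in for the target region cut out of the image
print(laplacian_variance(crop))
```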
In some embodiments, after step 203, the shooting device may further obtain its own parameter information, such as the UID, name, and IP address of the shooting device, obtain the time at which the image was captured, and then synthesize this parameter information, the effect parameters of the target, and the image into a display image. Fig. 5 is an exemplary diagram of a display image provided in an embodiment of the present application. In the display image, the parameters at the top are the name of the shooting device, its IP address, the time the image was captured, and the resolution of the target. The shooting device may provide a corresponding interface so that the client or the server can acquire the display image through the interface and display it.
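As one way to picture the composition of the display image, the sketch below overlays device information and effect parameters onto the captured frame using Pillow; the field names, colors, and layout are illustrative assumptions rather than the exact overlay the device produces.

```python
# Overlay device information and effect parameters onto the frame (Pillow), illustration only.
from PIL import Image, ImageDraw

def compose_display_image(frame, device_info, effect_params):
    out = frame.copy()
    draw = ImageDraw.Draw(out)
    fields = {**device_info, **effect_params}
    for i, (key, value) in enumerate(fields.items()):
        draw.text((10, 10 + 18 * i), f"{key}: {value}", fill=(255, 255, 0))   # one line per field
    return out

frame = Image.new("RGB", (640, 360), color=(30, 30, 30))          # stand-in captured frame
display = compose_display_image(frame,
                                {"name": "Camera 1", "ip": "x.x.x.x"},
                                {"resolution": "25px x 25px", "pitch": 12.5})
display.save("display_image.png")
```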
It can be understood that the parameter information of the shooting device and the effect parameters of the target acquired by the shooting device can be displayed. The user can obtain this information from the relevant interface of the shooting device through the terminal device, and the terminal device can display the information on a display screen. As shown in fig. 4, the name of the shooting device is Camera 1, the IP address is x.x.x.x, the capture time shown is 11:11 in January 2020, and the information of target 1 is: position parameter [x:1000 y:1000], resolution [25px : 25px]; the information of target 2 is: position [x:100 y:100], resolution [100px : 100px]. In practical applications, the shooting device may also detect other information so that the user can view it on the display screen of the terminal device.
204. And if the effect parameter of the target is not in the preset range, determining an adjusting mode according to a comparison result of the effect parameter of the target and the preset range.
In the embodiment of the application, the effect parameter of the target is not within the preset range, which indicates that the target does not meet the requirement of shooting, and an installer needs to further adjust the shooting equipment to obtain a better shooting effect. In the embodiment of the application, after the shooting device determines that the effect parameter of the target is not within the preset range, the adjustment mode can be determined according to the comparison result of the effect parameter of the target and the preset range, so as to automatically adjust or prompt an installer to adjust. The process of specifically determining the adjustment mode is as follows:
Firstly, the effect parameters of the target comprise the relative position of the target with respect to the shooting device, the relative position comprises angle parameters of the target relative to the shooting device, and the angle parameters comprise a pitch angle and/or a yaw angle and/or a roll angle. If the pitch angle among the angle parameters is not within the preset range, the shooting device determines one of the adjustment modes as rotating the shooting device upward or downward; if the yaw angle is not within the preset range, the shooting device determines one of the adjustment modes as rotating the shooting device leftward or rightward; and if the roll angle is not within the preset range, the shooting device determines one of the adjustment modes as rotating the shooting device clockwise or counterclockwise. For example, the target at the lower left of the image in fig. 4 has its head lowered, so the shooting device detects that the pitch angle of the target is not within the preset range, and one of the adjustment modes is to rotate the shooting device up or down. Further, since the target's head is lowered, the adjustment is to rotate the shooting device upward. In practical applications, the adjustment mode may be expressed in different words, for example "please raise the camera to decrease the pitch angle" in fig. 4; the embodiment of the present application does not limit the specific wording.
Secondly, the effect parameters of the target comprise the relative position of the target with respect to the shooting device, the relative position comprises a position parameter of the target in the image, and the position parameter is the coordinates of the target in the image. If the abscissa parameter among the position parameters is not within the preset range, one of the adjustment modes is determined as translating the shooting device transversely; and if the ordinate parameter is not within the preset range, one of the adjustment modes is determined as translating the shooting device longitudinally. For example, in fig. 4 the target at the lower left of the image sits too far to the left; therefore, when the shooting device detects that the abscissa parameter of the target is not within the preset range, it determines that one of the adjustment modes is to translate the shooting device horizontally. Specifically, the adjustment is "please pan the camera to the left so that the target is centered in the image".
Thirdly, the effect parameters include the resolution of the target. If the resolution of the target is smaller than a first preset threshold, one of the adjustment modes is determined as reducing the distance between the shooting device and its subject or increasing the focal length of the shooting device; if the resolution of the target is larger than the first preset threshold, one of the adjustment modes is determined as increasing the distance between the shooting device and its subject or reducing the focal length of the shooting device. It can be understood that when the shooting device has a zoom function, the adjustment may be performed by adjusting the focal length; when the shooting device has a fixed focal length, the adjustment is made to the distance. For example, the resolution of the target in fig. 4 is smaller than the first preset threshold, which indicates that the rectangular area corresponding to the target is not large enough, so the shooting distance should be reduced or the focal length of the shooting device increased.
Fourthly, the effect parameters of the target comprise a definition parameter of the target, and if the definition parameter of the target is not within the preset range, one of the adjustment modes is determined as adjusting the focal length (focus) of the shooting device. Illustratively, the sharpness of the target image in fig. 4 is sufficient, and no adjustment is necessary.
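Complementing the angle sketch given earlier, the following illustration covers the position, resolution, and definition cases above; the ranges, the threshold, and the wording of the hints are assumptions, not values from the patent.

```python
# Illustrative handling of the position, resolution and definition (sharpness) cases.
def position_and_quality_adjustments(center_x, center_y, resolution_px, sharpness,
                                     x_range=(800, 1120), y_range=(400, 680),
                                     resolution_threshold=112, min_sharpness=100.0):
    hints = []
    if not x_range[0] <= center_x <= x_range[1]:
        hints.append("translate the shooting device horizontally so the target is centered")
    if not y_range[0] <= center_y <= y_range[1]:
        hints.append("translate the shooting device vertically so the target is centered")
    if resolution_px < resolution_threshold:
        hints.append("reduce the shooting distance or increase the focal length")
    elif resolution_px > resolution_threshold:
        hints.append("increase the shooting distance or reduce the focal length")
    if sharpness < min_sharpness:
        hints.append("adjust the focus of the shooting device")
    return hints

print(position_and_quality_adjustments(center_x=144, center_y=364,
                                       resolution_px=25, sharpness=250.0))
```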
In some embodiments, after step 204, the shooting device may adjust itself and/or present the adjustment mode according to the adjustment mode obtained in step 204. The user may click the "automatic adjustment" button on the operation interface shown in fig. 4 to trigger the process in which the shooting device adjusts itself according to the adjustment mode obtained in step 204. When the shooting device receives the trigger signal associated with the "automatic adjustment" button, it may adjust automatically according to the adjustment mode obtained in step 204. It can be understood that an ordinary camera can adjust its focal length, while a pan-tilt camera can both adjust its focal length and rotate. In practical applications, the shooting device may determine the available adjustment functions according to its device configuration so as to match the adjustment mode obtained in step 204; the device configuration of the shooting device is not limited in the embodiment of the present application. This one-key adjustment function further reduces the operation cost for the user and improves the usability of the system.
In some embodiments, the shooting device may display the adjustment mode obtained in step 204, as shown in fig. 4. The embodiment of the application converts the adjustment mode into plain language that prompts the user how to adjust, so that the installer can perform manual adjustment more easily. This way of presentation gives the user a better reference and makes the system simpler to use.
Fig. 6 is a diagram illustrating an effect after adjustment according to an embodiment of the present application. When the shooting equipment is automatically adjusted or an installer adjusts the shooting equipment according to the adjustment suggestions, the shooting equipment can shoot a more appropriate target image to obtain a better imaging effect, so that the face recognition or the license plate recognition is more accurate.
Fig. 7 is a schematic flow chart according to an embodiment of the present application. In this embodiment, when the user triggers a snapshot (for example, by clicking the "manual snapshot" button as in fig. 3), the shooting device acquires the image frame closest to the snapshot time from the live video stream, preprocesses it, and converts it into image information that the algorithm can process. The shooting device then extracts feature information from the image information through the algorithm model, fits the feature information to the original data model, and produces detection results, for example whether the image contains a target and the relative position of any detected target, which are finally integrated into the output result of the algorithm. After the algorithm produces the detection result, the service processing module in the shooting device converts the detection result into an adjustment suggestion and feeds it back to the user. The user can make a manual adjustment according to the suggestion or instruct the shooting device to adjust automatically.
Fig. 8 is a signaling diagram of an embodiment of the present application. The Client may be a web client built into the camera. The user can access the web client through the terminal device, and the web client can communicate with the camera, obtain information from the camera, and send instructions to it. The client may display a video preview interface (as in fig. 3), operation switches (such as the "manual snapshot" button), and the result presentation (as in fig. 4). A request sent by the client to the core management module (the ImgEvalChain module) mainly notifies the camera server to evaluate or adjust the current image, and the request mainly contains basic information such as the evaluation type (for example, face detection or license plate detection in the embodiment corresponding to fig. 3) and the evaluation mode (for example, the "manual snapshot" button and the "turn on continuous tuning" button in the embodiment corresponding to fig. 3 correspond to two modes respectively).
YuvService represents a live video stream data processing module of a camera server, and image data used in the judgment process is obtained from the live video stream data processing module; the module is responsible for image processing and video encoding and decoding, and converts images into data which can be processed by an algorithm.
The ImgEvalChain is the core management module of the camera server. When a request sent by the Client is received, the ImgEvalChain module acquires image data from the YuvService module, processes the acquired image data, and sends it to the AlgProcess module for processing. The processing result of the AlgProcess module and the image data from the YuvService are packaged into a request and sent to the ImgEncode module; after processing the request, the ImgEncode module returns a response that mainly contains the diagnostic picture it produced. Finally, the ImgEvalChain determines the adjustment mode from the algorithm result through natural-language conversion (namely step 204 in the embodiment corresponding to fig. 2), generates an optimization proposal, and returns it together with the diagnostic picture to the Client. After the Client obtains the diagnosis result, it can send an AutoAdjust request to instruct the camera to adjust automatically.
The AlgProcess module is the technical core module of the invention and also runs on the camera side. It is mainly responsible for processing the data in the image, performing target detection and target scoring (including but not limited to pixel size, target position, target angle, and definition), and providing the most basic algorithm results.
The ImgEncode module is responsible for the visualization of the diagnosis result on the camera server side. It integrates the basic information of the device, the snapshot image, and the algorithm output result (i.e. the effect parameters of the target) into one picture (for example, the picture shown in fig. 5); through this picture the client presents the result intuitively, so that the user receives clear feedback.
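Putting the modules above together, the following rough sketch mirrors the request flow from the Client through ImgEvalChain to YuvService, AlgProcess, and ImgEncode. The interfaces and return values are paraphrased from the description as an assumption; the real modules run inside the camera rather than as Python objects.

```python
# Rough sketch of the Client -> ImgEvalChain -> YuvService / AlgProcess / ImgEncode flow.
class YuvService:
    def latest_frame(self):
        return "yuv-frame"                            # frame from the live stream, decoded

class AlgProcess:
    def analyze(self, frame):
        return {"targets": 1, "pitch": 22.0}          # detection and scoring result

class ImgEncode:
    def render(self, frame, result, device_info):
        return "diagnostic-picture"                   # device info + image + result in one picture

class ImgEvalChain:
    def __init__(self):
        self.yuv, self.alg, self.enc = YuvService(), AlgProcess(), ImgEncode()

    def handle_request(self, device_info):
        frame = self.yuv.latest_frame()
        result = self.alg.analyze(frame)
        picture = self.enc.render(frame, result, device_info)
        suggestion = "please raise the camera to decrease the pitch angle"   # natural-language step
        return {"picture": picture, "suggestion": suggestion}

print(ImgEvalChain().handle_request({"name": "Camera 1"}))
```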
To sum up, the embodiment of the application uses an algorithm to perform snapshot analysis on targets in the scene and produces analysis results, so that the erection angle and height can be evaluated quantitatively, reducing subjective human judgement and saving time. Moreover, the embodiment of the application can perform snapshot analysis on both faces and license plates, which greatly expands the application scenarios of the present application.
Fig. 9 is a schematic diagram of a detection device of a shooting device according to an embodiment of the present application. The apparatus 900 includes:
an obtaining module 901, configured to execute step 201 in each embodiment corresponding to fig. 2;
a processing module 902, configured to execute step 202, step 203, and step 204 in each embodiment corresponding to fig. 2.
Fig. 10 is a schematic diagram of an image capturing apparatus (e.g., a video camera) according to an embodiment of the present application. The image pickup apparatus 1000 may be the shooting device in the architecture corresponding to fig. 1 described above. The image pickup apparatus 1000 includes a lens 1001, a sensor 1002, and a processor 1003, wherein the lens 1001 is used for receiving light, and the sensor 1002 is used for performing photoelectric conversion on the light received by the lens to generate an image; the processor 1003 is configured to implement the method of the method embodiment shown in fig. 2.
The lens 1001 serves to form an optical image of the observed object on the sensor of the camera, which is known as optical imaging. The lens 1001 combines optical parts of various shapes (reflecting mirrors, transmission lenses, prisms) and different media (plastic, glass, or crystal) in a certain way, so that after the light is transmitted or reflected by these optical parts its direction of propagation is changed as required and it is received by the receiving device, thereby completing the optical imaging of the object. Generally, each lens 1001 is formed by combining several groups of lens elements with different curvatures at different spacings. Indexes such as the spacing, the curvature of the lens elements, and the light transmission coefficient determine the focal length of the lens. The main parameter indexes of the lens 1001 include: effective focal length, aperture, maximum image plane, field of view, distortion, relative illumination, and so on; these index values together determine the overall performance of the lens 1001.
The sensor 1002 (also called an image sensor) is a device that converts an optical image into an electronic signal and is widely used in digital cameras and other electronic optical devices. Common sensors 1002 include the charge-coupled device (CCD) and the complementary metal oxide semiconductor (CMOS) sensor. Both CCDs and CMOS sensors have a large number (e.g., tens of millions) of photodiodes, each referred to as a photosite, and each photosite corresponds to a pixel. During exposure, the photodiode receives light and converts the light signal into an electrical signal containing brightness (or brightness and color), from which the image is reconstructed. The Bayer array is a common image sensor technology applicable to both CCD and CMOS: Bayer filters make different pixels sensitive to only one of the red, green, and blue primary colors, the pixels are interleaved, and the original image is then recovered by demosaicing. A sensor using a Bayer array is also called a Bayer sensor. In addition to Bayer sensors, there are sensor technologies such as X3 (developed by Foveon), which uses three layers of photosensitive elements, each recording one of the RGB color channels, so that full color information can be captured at a single pixel location.
The processor 1003 (also referred to as an image processor) may be a set of multiple chips or a single chip, for example a system on chip (SoC). The processor 1003 may include an image signal processor (ISP) for converting the image generated by the sensor into a three-channel format (e.g., YUV), improving image quality, and detecting whether a target object is in the image; it may also be used to encode the image.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be substantially implemented or contributed to by the prior art, or all or part of the technical solution may be embodied in a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the present disclosure as defined by the appended claims.

Claims (19)

1. A detection method of a photographing apparatus, comprising:
acquiring an image shot by shooting equipment;
detecting a target in the image according to the image;
determining an effect parameter of the target, the effect parameter including a relative position of the target with respect to the shooting equipment;
and if the effect parameter of the target is not in the preset range, determining an adjusting mode of the relative position of the shooting equipment according to a comparison result of the effect parameter of the target and the preset range.
2. The method of claim 1, wherein the relative position of the target with respect to the shooting equipment comprises angle parameters of the target with respect to the shooting equipment, the angle parameters comprising a pitch angle and/or a yaw angle and/or a roll angle.
3. The method according to claim 2, wherein the determining of the adjustment mode of the relative position of the photographing apparatus according to the comparison result of the effect parameter of the target and the preset range comprises:
if the pitch angle in the angle parameters is not within the preset range, determining one of the adjusting modes as rotating the shooting equipment upwards or downwards;
if the yaw angle in the angle parameter is not within the preset range, determining one of the adjusting modes as rotating the shooting equipment leftwards or rightwards;
and if the roll angle in the angle parameters is not in the preset range, determining one of the adjusting modes as clockwise or anticlockwise rotating the shooting equipment.
4. The method according to any one of claims 1 to 3, wherein the relative position of the target with respect to the shooting equipment comprises a position parameter of the target in the image, the position parameter being a coordinate of the target in the image.
5. The method according to claim 4, wherein the determining of the adjustment mode of the relative position of the photographing apparatus according to the comparison result of the effect parameter of the target and the preset range comprises:
if the abscissa parameter in the position parameters is not within the preset range, determining one of the adjustment modes as transversely translating the shooting equipment;
and if the ordinate parameter in the position parameters is not within the preset range, determining one of the adjustment modes as longitudinally translating the shooting equipment.
6. The method of claim 1, wherein the effect parameter comprises a resolution of the target, and the determining of the adjustment mode of the relative position of the shooting equipment according to the comparison result of the effect parameter of the target and the preset range comprises:
if the resolution of the target in the image is smaller than a first preset threshold, determining one of the adjustment modes as shortening the distance between the shooting equipment and the subject photographed by the shooting equipment or lengthening the focal length of the shooting equipment;
if the resolution of the target in the image is larger than the first preset threshold, determining one of the adjustment modes as increasing the distance between the shooting equipment and the subject photographed by the shooting equipment or shortening the focal length of the shooting equipment.
7. The method according to any one of claims 1 to 6, wherein the effect parameter comprises a sharpness parameter of the target, and the determining of the adjustment mode of the relative position of the shooting equipment according to the comparison result of the effect parameter of the target and a preset range comprises:
and if the sharpness parameter of the target is not within the preset range, determining one of the adjusting modes as adjusting the focal length of the shooting equipment.
8. The method according to any one of claims 1 to 7, wherein after determining the adjustment mode of the relative position of the shooting device according to the comparison result of the effect parameter of the target and the preset range, the method further comprises:
and adjusting the shooting equipment according to the adjustment mode and/or displaying the adjustment mode.
9. The method according to any one of claims 1 to 8, wherein after determining the effect parameter of the target, before determining the adjustment mode of the relative position of the shooting device according to the comparison result of the effect parameter of the target and the preset range, the method further comprises:
acquiring parameter information of the shooting equipment;
and synthesizing the parameter information of the shooting equipment, the effect parameters of the target and the image into a display image and displaying the display image.
10. A detection apparatus of a photographing device, comprising:
the acquisition module is used for acquiring an image;
the processing module is used for detecting a target in the image according to the image;
the processing module is further configured to determine an effect parameter of the target, where the effect parameter includes a relative position of the target with respect to the photographing device;
and if the effect parameter of the target is not within a preset range, the processing module is further configured to determine an adjustment mode of the relative position of the shooting device according to a comparison result of the effect parameter of the target and the preset range.
11. The apparatus of claim 10, wherein the relative position of the target with respect to the photographing device comprises angle parameters of the target with respect to the photographing device, the angle parameters comprising a pitch angle and/or a yaw angle and/or a roll angle.
12. The apparatus of claim 11, wherein the processing module is further configured to:
if the pitch angle in the angle parameters is not within the preset range, determining one of the adjusting modes as rotating the shooting equipment upwards or downwards;
if the yaw angle in the angle parameter is not within the preset range, determining one of the adjusting modes as rotating the shooting equipment leftwards or rightwards;
and if the roll angle in the angle parameters is not in the preset range, determining one of the adjusting modes as clockwise or anticlockwise rotating the shooting equipment.
13. The apparatus according to any one of claims 10 to 12, wherein the relative position of the target with respect to the photographing device comprises a position parameter of the target in the image, and the position parameter is a coordinate of the center of the target in the image.
14. The apparatus of claim 13, wherein the processing module is further configured to:
if the abscissa parameter in the position parameters is not within the preset range, determining one of the adjustment modes as transversely translating the shooting equipment;
and if the ordinate parameter in the position parameters is not within the preset range, determining one of the adjustment modes as longitudinally translating the shooting equipment.
15. The apparatus of any of claims 10 to 14, wherein the effect parameter comprises a resolution of the target, and wherein the processing module is further configured to:
if the resolution of the target in the image is smaller than a first preset threshold, determining one of the adjustment modes as shortening the distance between the shooting equipment and the subject photographed by the shooting equipment or lengthening the focal length of the shooting equipment;
if the resolution of the target in the image is larger than the first preset threshold, determining one of the adjustment modes as increasing the distance between the shooting equipment and the subject photographed by the shooting equipment or shortening the focal length of the shooting equipment.
16. The apparatus of any one of claims 10 to 15, wherein the effect parameter comprises a sharpness parameter of the target, and wherein the processing module is further configured to:
and if the sharpness parameter of the target is not within the preset range, determining one of the adjusting modes as adjusting the focal length of the shooting equipment.
17. The apparatus of any one of claims 10 to 16, wherein the processing module is further configured to:
and adjusting the shooting equipment according to the adjustment mode and/or displaying the adjustment mode.
18. The apparatus of any one of claims 10 to 17,
the acquisition module is further used for acquiring parameter information of the shooting equipment;
the processing module is further configured to synthesize the parameter information of the shooting equipment, the effect parameters of the target, and the image into a display image and to display the display image.
19. An image pickup apparatus comprising a lens, a sensor, and a processor, wherein:
the lens is used for receiving light rays;
the sensor is used for performing photoelectric conversion on light received by the lens to generate an image;
the processor is configured to implement the method of any one of claims 1 to 9.
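For illustration only, and not as part of the claims, the following is a minimal sketch of the comparison-and-suggestion logic recited in claims 1 to 7. All preset ranges, thresholds, and field names are hypothetical example values chosen for the sketch, not values taken from the application.

```python
# Illustrative sketch of the effect-parameter comparison described in claims 1-7.
# Every range, threshold, and field name below is a hypothetical example value.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class EffectParams:
    pitch_deg: float                # angle parameters of the target relative to the shooting equipment
    yaw_deg: float
    roll_deg: float
    center_xy: Tuple[float, float]  # position parameter: target center, normalized to [0, 1]
    resolution_px: int              # target size in the image, e.g. bounding-box height in pixels
    sharpness: float                # sharpness score of the target region

def suggest_adjustments(p: EffectParams) -> List[str]:
    s: List[str] = []
    # Angle parameters outside their preset ranges -> rotate the equipment.
    if not -10.0 <= p.pitch_deg <= 10.0:
        s.append("rotate the shooting equipment upwards or downwards")
    if not -10.0 <= p.yaw_deg <= 10.0:
        s.append("rotate the shooting equipment to the left or to the right")
    if not -5.0 <= p.roll_deg <= 5.0:
        s.append("rotate the shooting equipment clockwise or counterclockwise")
    # Position parameters outside their preset ranges -> translate the equipment.
    cx, cy = p.center_xy
    if not 0.3 <= cx <= 0.7:
        s.append("translate the shooting equipment laterally")
    if not 0.3 <= cy <= 0.7:
        s.append("translate the shooting equipment longitudinally")
    # Resolution below/above its thresholds -> change distance or focal length.
    if p.resolution_px < 64:
        s.append("move closer to the subject or lengthen the focal length")
    elif p.resolution_px > 512:
        s.append("move away from the subject or shorten the focal length")
    # Sharpness outside its preset range -> adjust the focal length (refocus).
    if p.sharpness < 100.0:
        s.append("adjust the focal length of the shooting equipment")
    return s

# Example: a target detected too close to the top of the frame and too blurry.
print(suggest_adjustments(EffectParams(2.0, 1.0, 0.5, (0.5, 0.2), 128, 60.0)))
```

Which specific direction to report (for example, upwards rather than downwards) would follow from the sign of the measured angle; the claims leave that mapping to the implementation, so the sketch only reports the axis to adjust.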
CN202110016897.7A 2020-09-22 2021-01-05 Detection method of shooting equipment and related device Pending CN114257732A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/093015 WO2022062421A1 (en) 2020-09-22 2021-05-11 Method for detecting photographing device and related apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011003383X 2020-09-22
CN202011003383 2020-09-22

Publications (1)

Publication Number Publication Date
CN114257732A true CN114257732A (en) 2022-03-29

Family

ID=80790849

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110016897.7A Pending CN114257732A (en) 2020-09-22 2021-01-05 Detection method of shooting equipment and related device

Country Status (2)

Country Link
CN (1) CN114257732A (en)
WO (1) WO2022062421A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013201793A (en) * 2013-07-11 2013-10-03 Nikon Corp Imaging apparatus
CN104735355B (en) * 2015-03-13 2018-01-19 广东欧珀移动通信有限公司 The image capture method and device of a kind of intelligent terminal
CN104917959A (en) * 2015-05-19 2015-09-16 广东欧珀移动通信有限公司 Photographing method and terminal
CN107465869A (en) * 2017-07-27 2017-12-12 努比亚技术有限公司 A kind of focus adjustment method and terminal
CN110719406B (en) * 2019-10-15 2022-06-14 腾讯科技(深圳)有限公司 Shooting processing method, shooting equipment and computer equipment

Also Published As

Publication number Publication date
WO2022062421A1 (en) 2022-03-31

Similar Documents

Publication Publication Date Title
US7705908B2 (en) Imaging method and system for determining camera operating parameter
CN108419023B (en) Method for generating high dynamic range image and related equipment
US8416303B2 (en) Imaging apparatus and imaging method
EP1431912B1 (en) Method and system for determining an area of importance in an archival image
US8558913B2 (en) Capture condition selection from brightness and motion
US8937677B2 (en) Digital photographing apparatus, method of controlling the same, and computer-readable medium
KR100900485B1 (en) Improved image sensing means for digital camera and digital camera adopting the same
KR101822661B1 (en) Vision recognition apparatus and method
US20090109310A1 (en) Imaging device, imaging method, display control device, display control method, and program
KR20150109177A (en) Photographing apparatus, method for controlling the same, and computer-readable recording medium
KR20150078275A (en) Digital Photographing Apparatus And Method For Capturing a Moving Subject
KR101053983B1 (en) Auto focus camera system and control method thereof
CN110771142B (en) Imaging device, method for controlling imaging device, and program for controlling imaging device
CN110930340B (en) Image processing method and device
JP2008017198A (en) Device and method for calculating light source color, and photographing device
CN114257732A (en) Detection method of shooting equipment and related device
JP2023059952A (en) Image processing device, imaging device, image processing method, image processing program, and recording medium
CN113497877A (en) Image pickup apparatus, control method, and storage medium
WO2023245391A1 (en) Preview method and apparatus for camera
JP2004266669A (en) Monitoring camera and image pickup method
JP7571067B2 (en) Image processing device, image processing method, and computer program
JP2008211756A (en) Imaging apparatus
JP5387048B2 (en) Television camera system, television camera control method, and program
CN116338895A (en) Spectral imaging-based device and imaging method
JP2016200742A (en) Imaging apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination