
CN110672549B - Imaging method and device - Google Patents

Imaging method and device

Info

Publication number
CN110672549B
CN110672549B
Authority
CN
China
Prior art keywords
imaging
measured
measured object
acquiring
dimensional
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910954538.9A
Other languages
Chinese (zh)
Other versions
CN110672549A (en)
Inventor
邓仕发
潘奕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Zhongtou Huaxun Terahertz Technology Co ltd
Shenzhen Institute of Terahertz Technology and Innovation
Original Assignee
Shenzhen Zhongtou Huaxun Terahertz Technology Co ltd
Shenzhen Institute of Terahertz Technology and Innovation
Application filed by Shenzhen Zhongtou Huaxun Terahertz Technology Co ltd, Shenzhen Institute of Terahertz Technology and Innovation filed Critical Shenzhen Zhongtou Huaxun Terahertz Technology Co ltd
Priority to CN201910954538.9A
Publication of CN110672549A
Application granted
Publication of CN110672549B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N 21/31 Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry
    • G01N 21/35 Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light
    • G01N 21/3581 Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light using far infrared light; using Terahertz radiation
    • G01N 21/3586 Investigating relative effect of material at wavelengths characteristic of specific elements or molecules, e.g. atomic absorption spectrometry using infrared light using far infrared light; using Terahertz radiation by Terahertz time domain spectroscopy [THz-TDS]

Landscapes

  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The application is applicable to the technical field of terahertz wave imaging and provides an imaging method and device. The method comprises the following steps: acquiring position information of a measured object at a first imaging position, and obtaining a three-dimensional position model of the measured object according to the position information; adjusting the first imaging position according to the three-dimensional position model of the measured object to obtain a second imaging position; and acquiring a terahertz wave signal reflected by the measured object at the second imaging position and performing three-dimensional image reconstruction based on the terahertz wave signal to obtain a three-dimensional image of the measured object. Because the position information of the measured object is acquired before the position of the imaging device is adjusted and the corresponding terahertz wave signal is acquired to generate the three-dimensional image, and because the first imaging position and the second imaging position are set according to a preset motion track of the imaging device, the position information and the terahertz wave signals of the measured object can be acquired in all directions, thereby realizing omnidirectional three-dimensional imaging of the measured object.

Description

Imaging method and device
Technical Field
The application belongs to the technical field of terahertz wave imaging, and particularly relates to an imaging method and device.
Background
Terahertz waves are electromagnetic waves in the band between infrared and microwave; the range of 0.1 to 10 THz, often referred to as the "terahertz gap", is generally taken as the terahertz band. Waves in this band combine the directivity of visible light with the penetrating capability of microwaves, have short wavelengths, produce no ionizing radiation and carry rich spectral information. If three-dimensional (3D) information about the measured object could be obtained on the basis of planar terahertz wave imaging, a much better three-dimensional image of the measured object could be produced. Therefore, how to perform omnidirectional three-dimensional imaging of a measured object using terahertz waves is a problem that urgently needs to be solved.
Disclosure of Invention
The embodiments of the application provide an imaging method and device, which can solve the problem of performing omnidirectional three-dimensional imaging of a measured object using terahertz waves.
In a first aspect, an embodiment of the present application provides an imaging method, which is applied to an imaging apparatus, and the imaging method includes:
acquiring position information of a measured object at a first imaging position, and acquiring a three-dimensional position model of the measured object according to the position information; the first imaging position is set according to a preset motion track of the imaging device;
adjusting the first imaging position according to the three-dimensional position model of the measured object to obtain a second imaging position;
and acquiring a terahertz wave signal reflected by the measured object at a second imaging position, and performing three-dimensional image reconstruction based on the terahertz wave signal to obtain a three-dimensional image of the measured object.
It should be understood that the preset motion track is a motion track capable of scanning the measured object in all directions.
It should be understood that the preset motion trajectory is provided with a plurality of first imaging positions, and the position information of the object to be measured can be determined omnidirectionally by acquiring the position information of the object to be measured through the plurality of first imaging positions.
It should be understood that the number of the second imaging positions is the same as that of the first imaging positions, and each first imaging position is adjusted according to the three-dimensional position model, so that the second imaging position corresponding to each first imaging position is obtained.
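To make the relationship between these steps concrete, the sketch below outlines the control flow in Python. It is a minimal illustration only: the `device` object and every method called on it (`move_probe_to`, `measure_distance_to_surface`, and so on) are hypothetical placeholders assumed for the example, not an interface defined by this application.

```python
# Illustrative sketch only: `device` and all methods called on it are
# hypothetical placeholders, not an API defined by this application.

def image_measured_object(device, first_imaging_positions):
    # Step 1: acquire position information at each preset first imaging position
    # and build the three-dimensional position model of the measured object.
    distances = {}
    for pos in first_imaging_positions:
        device.move_probe_to(pos)
        distances[pos] = device.measure_distance_to_surface()
    position_model = device.build_position_model(distances)

    # Step 2: adjust each first imaging position according to the model
    # to obtain the corresponding second imaging position.
    second_imaging_positions = [
        device.adjust_position(pos, position_model) for pos in first_imaging_positions
    ]

    # Step 3: acquire the reflected terahertz wave signal at each second imaging
    # position and reconstruct the three-dimensional image of the measured object.
    signals = {}
    for pos in second_imaging_positions:
        device.move_probe_to(pos)
        signals[pos] = device.acquire_terahertz_signal()
    return device.reconstruct_3d_image(signals, position_model)
```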
In a possible implementation manner of the first aspect, before obtaining the position information of the measured object at the first imaging position and obtaining the three-dimensional position model of the measured object according to the position information, the method further includes:
and adjusting the imaging device to be in an initial posture.
Furthermore, the imaging device comprises a mechanical arm, an imaging probe, a horizontal turntable and a support frame, wherein the horizontal turntable is laid flat, one end of the mechanical arm is connected with the horizontal turntable, and the other end of the mechanical arm is connected with the imaging probe;
the adjusting the imaging device to be in an initial posture comprises:
controlling the horizontal rotary table to move to a first initial position;
and controlling the mechanical arm to move and adjusting the imaging probe to enable the vertical distance from the imaging probe to the identification dot of the support frame to be a preset value.
Further, the acquiring the position information of the object to be measured at the first imaging position and acquiring the three-dimensional position model of the object to be measured according to the position information includes:
controlling the horizontal rotary table to move to a first imaging station along a first scanning track;
controlling the mechanical arm to drive the imaging probe to move to each first imaging position along a second scanning track;
controlling the imaging probe to obtain the distance from the surface of the measured object to the imaging probe at each first imaging position;
and drawing the three-dimensional position model according to the distance from the surface of the measured object to the imaging probe.
Further, the adjusting the first imaging position according to the three-dimensional position model of the measured object to obtain a second imaging position includes:
adjusting the distance between the imaging probe and the supporting table according to the distance between the surface of the measured object and the imaging probe and the focal length of the imaging probe;
and determining the surface inclination of the measured object according to the three-dimensional position model, and adjusting the imaging probe to a second imaging position vertical to the surface of the measured object.
Further, after the acquiring the position information of the object to be measured at the first imaging position and acquiring the three-dimensional position model of the object to be measured according to the position information, the method further includes:
and acquiring a visible light image of the measured object at the first imaging position.
Further, after the obtaining of the terahertz wave signal reflected by the object to be measured at the second imaging position and the three-dimensional image reconstruction based on the terahertz wave signal are performed to obtain the three-dimensional image of the object to be measured, the method further includes:
judging whether the terahertz wave signal is blocked;
if the terahertz wave signal is blocked, identifying a second imaging position where the terahertz wave signal is blocked;
acquiring a corresponding visible light image according to the identified second imaging position;
and correcting the three-dimensional image of the measured object according to the visible light image.
In a second aspect, an embodiment of the present application provides an imaging apparatus, including:
the position acquisition module is used for acquiring position information of a measured object at a first imaging position and acquiring a three-dimensional position model of the measured object according to the position information; the first imaging position is set according to a preset motion track of the imaging device;
the adjusting module is used for adjusting the first imaging position according to the three-dimensional position model of the measured object to obtain a second imaging position;
and the reconstruction module is used for acquiring the terahertz wave signal reflected by the measured object at a second imaging position, and performing three-dimensional image reconstruction based on the terahertz wave signal to obtain a three-dimensional image of the measured object.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the imaging method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the steps of the imaging method according to the first aspect.
In a fifth aspect, the present application provides a computer program product, which when run on a terminal device, causes the terminal device to execute the imaging method according to any one of the first aspect.
It is to be understood that, for the beneficial effects of the second aspect to the fifth aspect, reference may be made to the relevant description in the first aspect, and details are not described herein again.
Compared with the prior art, the embodiment of the application has the advantages that: the position information of the object to be measured is acquired firstly, then the position of the imaging device is adjusted, and then the corresponding terahertz wave signal is acquired to generate a three-dimensional image of the object to be measured, and the first imaging position and the second imaging position are set according to the preset motion track of the imaging device, so that the position information and the terahertz wave signal of the object to be measured can be acquired in all directions, and then the all-directional three-dimensional imaging of the object to be measured is realized.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flow chart illustrating an implementation of an imaging method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an imaging apparatus to which an imaging method provided in an embodiment of the present application is applied;
fig. 3 is a schematic structural diagram of an imaging probe of an imaging apparatus to which an imaging method provided by an embodiment of the present application is applied;
FIG. 4 is an exemplary schematic diagram of a three-dimensional position model of an imaging method provided by an embodiment of the present application;
fig. 5 is a schematic diagram of a preset motion trajectory in an imaging method according to an embodiment of the present application;
FIG. 6 is a schematic flow chart illustrating an implementation of an imaging method according to another embodiment of the present application;
fig. 7 is a schematic diagram of an imaging apparatus in an initial pose in an imaging method according to another embodiment of the present application;
fig. 8 is a diagram illustrating an example in which a terahertz wave signal is blocked in an imaging method according to another embodiment of the present application;
fig. 9 is a flowchart illustrating an implementation of step S102 of an imaging method according to another embodiment of the present application;
fig. 10 is a diagram illustrating an application scenario of step S102 of an imaging method according to another embodiment of the present application;
fig. 11 is a schematic structural diagram of an imaging apparatus provided in an embodiment of the present application;
fig. 12 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The imaging method provided by the embodiment of the application can be applied to terminal devices such as a mobile phone, a tablet personal computer, a wearable device, a vehicle-mounted device, an Augmented Reality (AR)/Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), and the like, and the embodiment of the application does not limit the specific type of the terminal device at all.
For example, the terminal device may be a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a Personal Digital Assistant (PDA) device, a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, an Internet-of-Vehicles terminal, a computer, a laptop, a handheld communication device, a handheld computing device, a satellite wireless device, a wireless modem card, a Set Top Box (STB), Customer Premises Equipment (CPE), and/or other devices for communicating over a wireless system or a next-generation communication system, such as a mobile terminal in a 5G network or in a future evolved Public Land Mobile Network (PLMN), etc.
Referring to fig. 1, an embodiment of the present application provides an imaging method applied to an imaging apparatus 100, the imaging method including:
step S101: acquiring position information of a measured object at a first imaging position, and acquiring a three-dimensional position model of the measured object according to the position information; the first imaging position is set according to a preset motion track of the imaging device.
Specifically, referring to fig. 2, fig. 2 shows a schematic structural diagram of an imaging device for executing the imaging method provided in this embodiment. As shown in fig. 2, in the present embodiment, the above-described imaging apparatus 100 includes: an imaging probe 30, a horizontal turntable 10, a mechanical arm 20, and a support frame (not numbered in the figures). The imaging probe 30 is configured to emit a terahertz wave to the measured object 40 and receive the terahertz wave reflected from the measured object 40; it is also configured to take an image of the measured object 40 and to measure the distance between the imaging probe and the surface of the measured object. The horizontal turntable 10 is laid flat and rotates around the circumference of the measured object 40 along the first scanning track. Illustratively, the first scanning track is a circle, and the projection of the measured object 40 onto the plane defined by the first scanning track is located at the center of that track.
One end of the mechanical arm 20 is connected with the horizontal turntable 10, and the other end of the mechanical arm 20 is connected with the imaging probe 30; the mechanical arm 20 drives the imaging probe 30 to move along a second scanning track 50 and rotate around the circumference of the measured object 40. Similarly, the second scanning track 50 is circular, and the projection of the measured object 40 onto the plane defined by the second scanning track 50 is located at the center of that track. It is understood that a plurality of second scanning tracks 50 are provided, and that the imaging probe 30 moves along each second scanning track 50 separately and sequentially while the horizontal turntable 10 moves along the first scanning track, thereby performing omnidirectional spherical stereoscopic imaging of the measured object 40. The centers of the second scanning tracks 50 coincide, and the second scanning tracks 50 spatially combine into a scanning ball 51. The plane defined by each second scanning track 50 is perpendicular to the plane defined by the first scanning track.
A plurality of first imaging stations are arranged at intervals on the first scanning track, and each first imaging station corresponds to one second scanning track 50; at each first imaging station the imaging probe 30 rotates around the measured object 40 along the corresponding second scanning track 50. It will be appreciated that increasing the number of first imaging stations increases the imaging accuracy of the imaging probe 30. The position of the imaging probe 30 in the horizontal plane is set by the horizontal turntable 10 moving along the first scanning track, and the mechanical arm 20 drives the imaging probe 30 to move along each second scanning track 50 in turn, so as to scan the measured object 40 in all directions. A plurality of first imaging positions are arranged at intervals on each second scanning track 50, and the imaging probe 30 performs scanning imaging of the measured object 40 at these first imaging positions. It will likewise be appreciated that increasing the number of first imaging positions improves the imaging accuracy of the imaging probe 30.
It can be understood that the preset motion tracks are the first scanning track and the second scanning track.
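As an illustration of this geometry, the sketch below generates first imaging positions on such a scanning sphere from a number of first imaging stations (turntable azimuths) and a number of probe positions per second scanning track; the uniform angular spacing and the parameter names are assumptions made for the example, not values specified by this application.

```python
import numpy as np

def first_imaging_positions(radius_h, n_stations, n_positions_per_track):
    """Candidate first imaging positions on the scanning sphere of radius H.
    One half-circle (meridian) per first imaging station is enough to cover
    the sphere once the horizontal turntable sweeps a full revolution."""
    positions = []
    for i in range(n_stations):                            # turntable azimuth
        phi = 2.0 * np.pi * i / n_stations
        for j in range(n_positions_per_track):             # angle along the arm's track
            theta = np.pi * j / (n_positions_per_track - 1)
            positions.append((
                radius_h * np.sin(theta) * np.cos(phi),
                radius_h * np.sin(theta) * np.sin(phi),
                radius_h * np.cos(theta),
            ))
    return np.array(positions)   # shape: (n_stations * n_positions_per_track, 3)
```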
Specifically, the distance between the imaging probe and the surface of the measured object is measured by controlling the imaging probe at the first imaging position, and then the position information of the measured object is determined.
In the present embodiment, as shown in fig. 3, the above-mentioned imaging probe 30 includes a transmission probe 31 that transmits the terahertz wave to the object to be measured 40, a reception probe 32 that receives the terahertz wave reflected by the object to be measured 40, an imager 33 (which may include a visible light imager and a terahertz wave imager) for imaging the object to be measured, and a distance meter 34 connected to the imager 33. The distance measuring device 34 is used for detecting the shooting distance of the imager 33 to the measured object 40. Optionally, the frequency range of the terahertz waves emitted from the emission probe 31 to the object to be measured 40 is 0.1 to 10 THz. The imaging probe 30 also includes a cable consisting of optical fibers, signal lines, power lines, etc., to enable connection of the imaging probe 30 to an external power source and server. Optionally, the distance measuring device 34 is a laser distance measuring device.
Optionally, the imager 33 photographs within the imaging zone 35, and the laser rangefinder measures the distance between the imager and the measured object 40. The transmitting probe 31 and the receiving probe 32 share a focusing point 36 on the surface of the measured object 40; the distance between the imager 33 and the focusing point 36 is set to D, which is also the focal length of the imager 33. During scanning, real-time measurements are taken by the laser rangefinder and, when the monitored value deviates from the set value, the mechanical arm 20 is adjusted so that the distance D from the imager 33 to the focusing point 36 remains constant.
Specifically, the distance between the imager 33 and the surface of the measured object is obtained by the distance meter 34 at the first imaging position, and since the distance between the imaging probe at the first imaging position and the center of the preset moving track is fixed, the position information of the measured object at the first imaging position can be obtained according to the distance between the imager 33 and the surface of the measured object.
Specifically, a three-dimensional position model of the measured object can be constructed from the position information obtained at each first imaging position, as shown in fig. 4, where each point corresponds to one first imaging position. The distance W (i.e., the position information) from the surface of the measured object at that first imaging position to the center of the preset trajectory is the difference between the distance H from the imaging probe to the center of the preset trajectory and the distance T from the imaging probe to the surface of the measured object, that is, W = H − T. From the W value of each first imaging position, the three-dimensional position model shown in fig. 4 can be generated.
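The relation W = H − T can be turned into a point model directly; the sketch below assumes the probe positions are given as coordinates relative to the center of the preset track (so each has norm H) and that the same constant H applies to every first imaging position — both assumptions made only for illustration.

```python
import numpy as np

def position_model(probe_positions, distances_t, h):
    """Three-dimensional position model as in fig. 4: for each first imaging
    position, W = H - T is the distance from the measured surface to the
    center of the preset track, and the surface point is placed at distance W
    from the center along the center-to-probe direction."""
    probe_positions = np.asarray(probe_positions, dtype=float)      # (N, 3), |p| = H
    distances_t = np.asarray(distances_t, dtype=float)              # (N,) probe-to-surface
    w = h - distances_t                                             # W = H - T
    directions = probe_positions / np.linalg.norm(probe_positions, axis=1, keepdims=True)
    return directions * w[:, None]                                  # (N, 3) surface points
```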
In one embodiment, the step S101 includes the following steps:
controlling the horizontal rotary table to move to a first imaging station along a first scanning track;
controlling the mechanical arm to drive the imaging probe to move to each first imaging position along a second scanning track;
controlling the imaging probe to obtain the distance from the surface of the measured object to the imaging probe at each first imaging position;
and drawing the three-dimensional position model according to the distance from the surface of the measured object to the imaging probe.
Specifically, referring to fig. 5, each second scanning track 50 of the mechanical arm 20 and the first scanning track of the horizontal turntable 10 are preset. The mechanical arm 20 drives the imaging probe 30 to complete the scan of one second scanning track 50, passing through each first imaging position on it; the horizontal turntable 10 then advances to the next first imaging station, the imaging probe 30 scans again, and this process is repeated until the whole scan is completed. During scanning, the center of the motion track of the imaging probe 30 is kept unchanged, so that the second scanning tracks 50 together form a spherical scanning ball 51. The three-dimensional position model of the measured object is then constructed from the distance between the surface of the measured object and the imaging probe obtained at each first imaging position.
Step S102: and adjusting the first imaging position according to the three-dimensional position model of the measured object to obtain a second imaging position.
Specifically, after the three-dimensional position model of the measured object is determined, the first imaging position is adjusted according to the three-dimensional position model, and the imaging probe is controlled to be adjusted to the second imaging position.
Specifically, after the position information of all the first imaging positions has been acquired, the imaging device is controlled to return to the initial posture, and the measured object is then scanned again with the terahertz waves; the imaging positions of this second pass are the second imaging positions. Each first imaging position is adjusted so that the imaging probe is perpendicular to the surface of the measured object and the distance between the imaging probe and the surface of the measured object equals the focal length D. By adjusting the first imaging positions and scanning the measured object again at the second imaging positions, the surface characteristic information of the measured object can be obtained.
Step S103: and acquiring a terahertz wave signal reflected by the measured object at a second imaging position, and performing three-dimensional image reconstruction based on the terahertz wave signal to obtain a three-dimensional image of the measured object.
It can be understood that the terahertz wave received from the object to be measured 40 includes characteristic information related to the object to be measured 40, and the imaging probe transmits the fed-back terahertz wave signal to a server in communication connection therewith and processes the signal by the server, so as to perform stereoscopic imaging on the object to be measured 40.
Specifically, three-dimensional image reconstruction is performed on the terahertz wave signals fed back at the second imaging positions by methods such as terahertz computed tomography, terahertz digital holography, terahertz diffraction tomography and the like, so as to obtain the three-dimensional image corresponding to the measured object.
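The reconstruction methods named above are substantial algorithms in their own right; as a deliberately reduced stand-in, the sketch below merely attaches the peak reflected amplitude measured at each second imaging position to the corresponding point of the position model, giving a reflectance-textured point cloud. This is not terahertz computed tomography or holography — only an illustration of how the reflected signals and the position model can be combined.

```python
import numpy as np

def reflectance_point_cloud(surface_points, terahertz_signals):
    """Simplified illustration, not one of the tomographic methods named above:
    map the peak reflected amplitude at each second imaging position onto the
    matching surface point of the three-dimensional position model."""
    surface_points = np.asarray(surface_points, dtype=float)            # (N, 3)
    amplitudes = np.array([np.max(np.abs(np.asarray(sig))) for sig in terahertz_signals])
    return np.column_stack([surface_points, amplitudes])                # (N, 4): x, y, z, amplitude
```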
In one embodiment, after the two scans are completed, the imaging apparatus is controlled to return to the initial posture again, and the imaging process is ended.
As shown in fig. 6, in another embodiment, the imaging method further includes, before step S101:
step S104: and adjusting the imaging device to be in an initial posture.
Specifically, the imaging device is first controlled to return to the initial posture, in which the vertical distance between the imaging probe of the imaging device and the identification dot on the support frame is a preset value; the imaging device is then controlled to obtain the position information of the measured object at each first imaging position. In the present embodiment, the preset value is equal to the distance D from the imager to the focusing point.
In one embodiment, the step S104 includes the following steps:
controlling the horizontal rotary table to move to a first initial position;
and controlling the mechanical arm to move and adjusting the imaging probe to enable the vertical distance from the imaging probe to the identification dot of the support frame to be a preset value.
Specifically, the imager of the imaging probe comprises a camera. After the horizontal turntable has been controlled to move to the first initial position, the camera acquires images while a control program moves the mechanical arm towards the identification dot on the support frame. Because the identification dot is located on the support frame, a calibration program then uses the camera and the laser rangefinder to adjust the imaging probe 30 via the mechanical arm so that the vertical distance from the imaging probe 30 to the identification dot is a preset value equal to D. Since the identification dot is a standard circle of fixed shape and size, the real-time image of the dot acquired by the camera can be compared with the position of the standard dot, the mechanical arm 20 adjusted accordingly, and the scan thereby initialized accurately. The first initial position is a point on the first scanning track and may be set as required; it is not limited here.
Referring to fig. 7, the solid circle 72 is the real-time image of the identification dot collected by the camera, and the dashed circle 71 is the standard identification dot drawn at the center of the display screen. When the movement of the mechanical arm 20 is adjusted until the solid circle and the dashed circle coincide, the imaging device is in the initial posture.
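One way to realize the comparison between the live image of the identification dot and the standard circle is a Hough circle detection; the OpenCV-based sketch below returns the pixel offset of the detected dot from the standard dot at the image center. The detection parameters, and the idea of feeding the offset back to the mechanical arm, are assumptions made for the example.

```python
import cv2

def marker_dot_offset(frame_gray, expected_radius_px):
    """Detect the identification dot in a grayscale camera frame and return its
    offset (dx, dy, dr) from the standard dot drawn at the image center,
    or None if no dot is found."""
    height, width = frame_gray.shape
    circles = cv2.HoughCircles(
        frame_gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=width,
        param1=100, param2=30,
        minRadius=int(0.5 * expected_radius_px),
        maxRadius=int(1.5 * expected_radius_px),
    )
    if circles is None:
        return None
    cx, cy, r = circles[0][0]            # strongest detected circle
    return cx - width / 2.0, cy - height / 2.0, r - expected_radius_px
```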
In one embodiment, after step S101, the imaging method further includes:
and acquiring a visible light image of the measured object at the first imaging position.
Specifically, a visible light image of the measured object is acquired by a camera of the imaging probe.
In one embodiment, the camera is a CCD camera; it photographs the measured object 40 and transmits the optical picture to a server for processing. The three-dimensional image reconstructed from the terahertz wave signal is then corrected in combination with the visible light image, so that a more accurate three-dimensional image of the measured object is obtained.
Specifically, while the position information of the measured object is acquired at each first imaging position, a visible light image of the measured object is acquired at the same time, and the visible light images acquired at the first imaging positions are stitched together to obtain a complete visible light image of the measured object.
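The application does not prescribe a particular stitching algorithm; assuming the per-position photographs overlap enough, a minimal sketch using OpenCV's high-level stitcher could look like this.

```python
import cv2

def stitch_visible_images(images):
    """Stitch the visible light images taken at the first imaging positions
    into one composite image; `images` is a list of BGR arrays with overlap."""
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
    status, composite = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return composite
```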
In an embodiment, after the step S103, the method further includes the following steps:
judging whether the terahertz wave signal is blocked;
if the terahertz wave signal is blocked, identifying a second imaging position where the terahertz wave signal is blocked;
acquiring a corresponding visible light image according to the identified second imaging position;
and correcting the three-dimensional image of the measured object according to the visible light image.
In a specific application, the transmitting probe and the receiving probe are arranged mirror-symmetrically about the imager, and the included angle between the axis of the transmitting probe and the axis of the receiving probe is set to a preset value, denoted θ. As shown in fig. 8, the angle between the terahertz wave transmitting probe 7 and the terahertz wave receiving probe 8 is θ. When a deep groove or similar structure appears on the surface of the sample, normal transmission of the terahertz light path is affected: the propagation path of the terahertz information is blocked, so the receiving probe cannot receive the terahertz wave, and the imaging position of the imaging probe has to be adjusted until the receiving probe can receive the terahertz wave signal. Because of the blockage, however, the terahertz wave signal at this imaging position cannot fully reflect the surface characteristics of the measured object. The second imaging position is therefore marked, the visible light image of that position is found through the corresponding first imaging position, and the reconstructed three-dimensional image is corrected in combination with that visible light image to obtain a more accurate three-dimensional image of the measured object.
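The decision logic described above can be sketched as follows. The power threshold, the dictionary layout and the `correct_region` hook on the model are all assumptions introduced for the example; they are not part of this application.

```python
def correct_blocked_positions(terahertz_signals, visible_images, model, power_threshold):
    """terahertz_signals and visible_images are dicts keyed by second imaging
    position (each visible image having been taken at the corresponding first
    imaging position). A position whose received terahertz power stays below
    the threshold is treated as blocked and its part of the reconstructed
    image is corrected from the visible light image."""
    blocked = [pos for pos, sig in terahertz_signals.items()
               if max(abs(s) for s in sig) < power_threshold]
    for pos in blocked:                      # marked second imaging positions
        image = visible_images[pos]          # visible light image for that position
        model.correct_region(pos, image)     # hypothetical correction hook
    return blocked
```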
Referring to fig. 9, fig. 9 is a flowchart illustrating an implementation of S102 in an imaging method according to another embodiment of the present application. The difference between the present embodiment and the previous embodiment is that the imaging method provided in the present embodiment includes the following steps, which are detailed as follows:
step S201: and adjusting the distance between the imaging probe and the supporting table according to the distance from the surface of the measured object to the imaging probe and the focal length of the imaging probe.
Step S202: and determining the surface inclination of the measured object according to the three-dimensional position model, and adjusting the imaging probe to a second imaging position vertical to the surface of the measured object.
Specifically, as shown in fig. 10, 19, 20 and 21 are three successive measurement points on the surface of the measured object during the spherical scan, and 3 is the imaging probe on the mechanical arm. The dotted line is the motion track of the imaging probe during the scan of step S101; it points towards the sphere center 18, and the distance H from the sphere center is fixed. The solid line is the probe position after adjustment with respect to the measurement point 20 on the surface of the measured object.
Taking the terahertz wave signal at measurement point 20 as an example: first, using the previous measurement point 19 and the next measurement point 21, the direction perpendicular to the line joining points 19 and 21 is found in the three-dimensional position model built during the scan of step S101, and the angle of the mechanical arm is adjusted so that the probe is perpendicular to that line. After the angle adjustment is completed, the focal length is adjusted.
Since the focal length of the terahertz probe is D and the distance from the probe to the sample surface measured during the scan of step S101 is T, the adjustment distance ΔD of the mechanical arm is:
ΔD = T − D
If ΔD is positive, the mechanical arm moves towards measurement point 20 by a distance |ΔD|; if ΔD is negative, it moves away from measurement point 20 by a distance |ΔD|. The position after the final adjustment is completed is shown by the solid line 3 in fig. 10.
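Both adjustments can be written down directly from the quantities above; in the sketch below the previous and next measurement points are assumed to be given as 3D coordinates, and the sign convention for ΔD follows the description (positive means move towards the measurement point).

```python
import numpy as np

def probe_adjustment(prev_point, next_point, probe_to_surface_t, focal_length_d):
    """Return (chord, delta_d): the unit vector joining measurement points 19
    and 21, to which the probe axis must be made perpendicular, and the signed
    focal-length correction ΔD = T - D (positive: move the arm towards the
    measurement point by |ΔD|; negative: move away by |ΔD|)."""
    chord = np.asarray(next_point, dtype=float) - np.asarray(prev_point, dtype=float)
    chord /= np.linalg.norm(chord)                   # direction between points 19 and 21
    delta_d = probe_to_surface_t - focal_length_d    # ΔD = T - D
    return chord, delta_d
```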
According to the imaging method provided by the embodiment of the application, the position information of the object to be measured is firstly acquired, then the position of the imaging device is adjusted, and then the corresponding terahertz wave signal is acquired to generate the three-dimensional image of the object to be measured, and the first imaging position and the second imaging position are set according to the preset motion track of the imaging device, so that the position information and the terahertz wave signal of the object to be measured can be acquired in all directions, and further the all-around three-dimensional imaging of the object to be measured is realized.
Fig. 11 shows a block diagram of the structure of the imaging apparatus provided in the embodiment of the present application, corresponding to the imaging method described in the above embodiment, and only the part related to the embodiment of the present application is shown for convenience of explanation.
Referring to fig. 11, the imaging apparatus includes a position acquisition module 101, an adjustment module 102, and an imaging module 103.
The position acquisition module 101 is configured to acquire position information of a measured object at a first imaging position, and acquire a three-dimensional position model of the measured object according to the position information; the first imaging position is set according to a preset motion track of the imaging device.
The adjusting module 102 is configured to adjust the first imaging position according to the three-dimensional position model of the measured object, so as to obtain a second imaging position.
The reconstruction module 103 is configured to acquire a terahertz wave signal reflected by the object to be measured at a second imaging position, and perform three-dimensional image reconstruction based on the terahertz wave signal to obtain a three-dimensional image of the object to be measured.
In one embodiment, the imaging device further comprises an initialization module.
The initialization module is used for adjusting the imaging device to be in an initial posture.
In one embodiment, the initialization module includes a first initialization unit and a second initialization unit.
The first initialization unit is used for controlling the horizontal rotary table to move to a first initial position.
The second initialization unit is used for controlling the mechanical arm to move and adjusting the imaging probe, so that the vertical distance from the imaging probe to the identification dot of the support frame is a preset value.
In one embodiment, the position obtaining module 101 includes a first control unit, a second control unit, a third control unit, and a rendering unit.
The first control unit is used for controlling the horizontal rotary table to move to the first imaging station along the first scanning track.
The second control unit is used for controlling the mechanical arm to drive the imaging probe to move to each first imaging position along a second scanning track.
The third control unit is used for controlling the imaging probe to acquire the distance from the surface of the measured object to the imaging probe at each first imaging position.
The drawing unit is used for drawing the three-dimensional position model according to the distance from the surface of the measured object to the imaging probe.
In one embodiment, the adjusting module 102 includes a focus adjusting unit and an angle adjusting unit.
The focal length adjusting unit is used for adjusting the distance between the imaging probe and the supporting table according to the distance between the surface of the object to be measured and the imaging probe and the focal length of the imaging probe.
The angle adjusting unit is used for determining the surface inclination of the measured object according to the three-dimensional position model and adjusting the imaging probe to a second imaging position perpendicular to the surface of the measured object.
In one embodiment, the imaging device further comprises a visible light image acquisition module.
The visible light image acquisition module is used for acquiring a visible light image of the measured object at the first imaging position.
In one embodiment, the imaging device further includes a judging module, an identifying module, a correspondence obtaining module, and a correcting module.
The judging module is used for judging whether the terahertz wave signal is blocked.
The identification module is used for identifying a second imaging position where the terahertz wave signal is blocked if the terahertz wave signal is blocked.
The corresponding acquisition module is used for acquiring a corresponding visible light image according to the identified second imaging position.
The correction module is used for correcting the three-dimensional image of the measured object according to the visible light image.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
Therefore, the imaging apparatus provided by this embodiment likewise first acquires the position information of the measured object, then adjusts the position of the imaging device, and then acquires the corresponding terahertz wave signal to generate the three-dimensional image of the measured object; because the first imaging position and the second imaging position are set according to the preset motion track of the imaging device, the position information and the terahertz wave signals of the measured object can be acquired in all directions, realizing omnidirectional three-dimensional imaging of the measured object.
Fig. 12 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 12, the terminal device 12 of this embodiment includes: at least one processor 120 (only one shown in fig. 12), a memory 121, and a computer program 122 stored in the memory 121 and executable on the at least one processor 120, the processor 120 implementing the steps in any of the various imaging method embodiments described above when executing the computer program 122.
The terminal device 12 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 120, a memory 121. Those skilled in the art will appreciate that fig. 12 is merely an example of terminal device 12 and does not constitute a limitation on terminal device 12, and may include more or less components than those shown, or some components in combination, or different components, such as input output devices, network access devices, etc.
The processor 120 may be a Central Processing Unit (CPU); it may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 121 may be an internal storage unit of the terminal device 12 in some embodiments, for example, a hard disk or a memory of the terminal device 12. The memory 121 may also be an external storage device of the terminal device 12 in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 12. Further, the memory 121 may also include both an internal storage unit and an external storage device of the terminal device 12. The memory 121 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer programs. The memory 121 may also be used to temporarily store data that has been output or is to be output.
Illustratively, the computer program 122 may be divided into one or more units, which are stored in the memory 121 and executed by the processor 120 to accomplish the present application. The one or more units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 122 in the terminal device 12. For example, the computer program 122 may be divided into a position acquisition module, an adjustment module and a reconstruction module, and each unit has the following specific functions:
the position acquisition module is used for acquiring position information of a measured object at a first imaging position and acquiring a three-dimensional position model of the measured object according to the position information; the first imaging position is set according to a preset motion track of the imaging device;
the adjusting module is used for adjusting the first imaging position according to the three-dimensional position model of the measured object to obtain a second imaging position;
and the reconstruction module is used for acquiring the terahertz wave signal reflected by the measured object at a second imaging position, and performing three-dimensional image reconstruction based on the terahertz wave signal to obtain a three-dimensional image of the measured object.
An embodiment of the present application further provides a network device, where the network device includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application provide a computer program product, which when running on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk or an optical disk. In certain jurisdictions, computer-readable media may not include electrical carrier signals or telecommunications signals, in accordance with legislation and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An imaging method applied to an imaging apparatus, the imaging method comprising:
acquiring position information of a measured object at a first imaging position, and acquiring a three-dimensional position model of the measured object according to the position information; the first imaging position is set according to a preset motion track of the imaging device;
adjusting the first imaging position according to the three-dimensional position model of the measured object to obtain a second imaging position;
and acquiring a terahertz wave signal reflected by the measured object at a second imaging position, and performing three-dimensional image reconstruction based on the terahertz wave signal to obtain a three-dimensional image of the measured object.
2. The imaging method according to claim 1, wherein before acquiring the position information of the object to be measured at the first imaging position and acquiring the three-dimensional position model of the object to be measured based on the position information, the method further comprises:
and adjusting the imaging device to be in an initial posture.
3. The imaging method of claim 2, wherein the imaging device comprises a mechanical arm, an imaging probe, a horizontal turntable arranged in a tiled manner, and a support frame, wherein one end of the mechanical arm is connected with the horizontal turntable, and the other end of the mechanical arm is connected with the imaging probe;
the adjusting the imaging device to be in an initial posture comprises:
controlling the horizontal turntable to move to a first initial position;
and controlling the mechanical arm to move and adjusting the imaging probe so that the vertical distance from the imaging probe to the identification point of the support frame equals a preset value.
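As a hedged illustration of the initialization recited in claim 3, the control loop below drives a turntable to its first initial position and corrects the probe-to-marker distance to a preset value. The Turntable and Arm classes, the numeric values, and the correction rule are invented for this sketch and are not taken from the patent.

# Hypothetical controllers standing in for the turntable and mechanical arm.
class Turntable:
    def __init__(self):
        self.position = 0.0
    def move_to(self, position):
        self.position = position

class Arm:
    def __init__(self, probe_to_marker):
        # Vertical distance from the imaging probe to the support frame's marker point.
        self.probe_to_marker = probe_to_marker
    def raise_probe(self, delta):
        self.probe_to_marker += delta

def set_initial_posture(turntable, arm, first_initial_position=0.0,
                        preset_distance=0.25, tolerance=1e-3):
    turntable.move_to(first_initial_position)                       # step 1 of claim 3
    while abs(arm.probe_to_marker - preset_distance) > tolerance:
        arm.raise_probe(preset_distance - arm.probe_to_marker)      # step 2 of claim 3
    return turntable.position, arm.probe_to_marker

# Turntable ends at its first initial position; probe ends approximately at the preset distance.
print(set_initial_posture(Turntable(), Arm(probe_to_marker=0.40)))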
4. The imaging method according to claim 3, wherein the acquiring of the position information of the measured object at the first imaging position and the acquiring of the three-dimensional position model of the measured object according to the position information comprise:
controlling the horizontal turntable to move to a first imaging station along a first scanning track;
controlling the mechanical arm to drive the imaging probe to move to each first imaging position along a second scanning track;
controlling the imaging probe to obtain the distance from the surface of the measured object to the imaging probe at each first imaging position;
and constructing the three-dimensional position model according to the distances from the surface of the measured object to the imaging probe.
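A rough sketch of how the two nested scanning tracks in claim 4 could yield a point cloud acting as the three-dimensional position model; the track shapes, the range-reading callback, and all numbers are assumptions made for illustration only.

# Illustrative scan loop; not the patent's scanning or modelling procedure.
import numpy as np

def scan_position_model(stations, probe_grid, range_reading):
    """Return an (N, 3) array of surface points, one per visited probe position."""
    points = []
    for station in stations:                      # first scanning track (turntable)
        for x, y, z_probe in probe_grid:          # second scanning track (arm/probe)
            distance = range_reading(station, x, y)      # probe-to-surface distance
            points.append((x, y, z_probe - distance))    # surface point below the probe
    return np.asarray(points)

# Synthetic example: a 3 x 3 probe grid visited at two turntable stations.
grid = [(x, y, 0.50) for x in (0.0, 0.1, 0.2) for y in (0.0, 0.1, 0.2)]
fake_range = lambda station, x, y: 0.20 + 0.05 * x + 0.01 * station  # pretend readings
model = scan_position_model(stations=[0, 1], probe_grid=grid, range_reading=fake_range)
print(model.shape)  # (18, 3): one surface point per (station, probe position) pair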
5. The imaging method according to claim 4, wherein the adjusting of the first imaging position according to the three-dimensional position model of the measured object to obtain the second imaging position comprises:
adjusting the distance between the imaging probe and the supporting table according to the distance from the surface of the measured object to the imaging probe and the focal length of the imaging probe;
and determining the surface inclination of the measured object according to the three-dimensional position model, and adjusting the imaging probe to a second imaging position perpendicular to the surface of the measured object.
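One plausible way to realize the perpendicular repositioning in claim 5 is to fit a plane to the locally measured surface points and step back along its normal by the probe's focal length; the function below is a sketch under that assumption, not the patent's method.

# Illustrative normal-and-focus computation; names and method are assumed.
import numpy as np

def perpendicular_probe_pose(surface_points, focal_length):
    pts = np.asarray(surface_points, dtype=float)
    centroid = pts.mean(axis=0)
    # Least-squares plane normal = right singular vector of the centred points
    # that belongs to the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    if normal[2] < 0:          # orient the normal away from the measured object
        normal = -normal
    # Second imaging position: one focal length away along the surface normal.
    return centroid + focal_length * normal, normal

tilted_patch = [[0.0, 0.0, 0.30], [0.1, 0.0, 0.32],
                [0.0, 0.1, 0.30], [0.1, 0.1, 0.32]]
position, normal = perpendicular_probe_pose(tilted_patch, focal_length=0.15)
print(np.round(position, 3), np.round(normal, 3))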
6. The imaging method according to claim 1, further comprising, after the acquiring of the position information of the measured object at the first imaging position and the acquiring of the three-dimensional position model of the measured object according to the position information:
and acquiring a visible light image of the measured object at the first imaging position.
7. The imaging method according to claim 6, wherein after the acquiring of the terahertz wave signal reflected by the measured object at the second imaging position and the performing of the three-dimensional image reconstruction based on the terahertz wave signal to obtain the three-dimensional image of the measured object, the method further comprises:
determining whether the terahertz wave signal is blocked;
if the terahertz wave signal is blocked, identifying a second imaging position where the terahertz wave signal is blocked;
acquiring a corresponding visible light image according to the identified second imaging position;
and correcting the three-dimensional image of the measured object according to the visible light image.
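The occlusion handling in claims 6 and 7 can be pictured with the sketch below: positions whose terahertz return is judged too weak are treated as blocked and patched from the visible-light data recorded earlier. The threshold, data layout, and patching rule are assumptions of this illustration only.

# Hypothetical correction step; not the patent's actual fusion algorithm.
import numpy as np

def correct_blocked_positions(thz_image, visible_images, blocked_threshold=0.05):
    """thz_image: one row of reconstructed values per second imaging position.
    visible_images: dict mapping a position index to visible-light data."""
    corrected = thz_image.copy()
    for idx, row in enumerate(thz_image):
        if row.max() < blocked_threshold:            # signal judged to be blocked
            if idx in visible_images:                # matching visible-light image exists
                corrected[idx] = visible_images[idx] # patch the 3D image at this position
    return corrected

thz = np.array([[0.80, 0.70],      # position 0: strong return
                [0.01, 0.02],      # position 1: blocked
                [0.90, 0.60]])     # position 2: strong return
visible = {1: np.array([0.50, 0.55])}   # visible-light data for position 1
print(correct_blocked_positions(thz, visible))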
8. An imaging apparatus, comprising:
the position acquisition module is used for acquiring position information of a measured object at a first imaging position and acquiring a three-dimensional position model of the measured object according to the position information; the first imaging position is set according to a preset motion track of the imaging device;
the adjusting module is used for adjusting the first imaging position according to the three-dimensional position model of the measured object to obtain a second imaging position;
and the reconstruction module is used for acquiring the terahertz wave signal reflected by the measured object at the second imaging position, and performing three-dimensional image reconstruction based on the terahertz wave signal to obtain a three-dimensional image of the measured object.
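To make the module split in claim 8 concrete, here is a hedged object-oriented sketch; the class names, the trivial method bodies, and the fixed offset are stand-ins invented for illustration, not the patent's implementation.

# Hypothetical decomposition mirroring the three claimed modules.
class PositionAcquisitionModule:
    def acquire(self, first_positions):
        # Trivial "three-dimensional position model": echo the positions back.
        return list(first_positions)

class AdjustmentModule:
    def adjust(self, first_positions, model):
        # Stand-in adjustment: lower every probe position by a fixed offset.
        return [(x, y, z - 0.05) for (x, y, z) in first_positions]

class ReconstructionModule:
    def reconstruct(self, second_positions):
        # Pretend reconstruction: pair each position with a dummy amplitude.
        return [(pos, 1.0) for pos in second_positions]

class ImagingApparatus:
    def __init__(self):
        self.position_module = PositionAcquisitionModule()
        self.adjustment_module = AdjustmentModule()
        self.reconstruction_module = ReconstructionModule()

    def run(self, first_positions):
        model = self.position_module.acquire(first_positions)
        second_positions = self.adjustment_module.adjust(first_positions, model)
        return self.reconstruction_module.reconstruct(second_positions)

print(ImagingApparatus().run([(0.0, 0.0, 0.5), (0.1, 0.0, 0.5)]))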
9. A terminal device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when executed by a processor, carries out the method according to any one of claims 1 to 7.
CN201910954538.9A 2019-10-09 2019-10-09 Imaging method and device Active CN110672549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910954538.9A CN110672549B (en) 2019-10-09 2019-10-09 Imaging method and device

Publications (2)

Publication Number Publication Date
CN110672549A CN110672549A (en) 2020-01-10
CN110672549B (en) 2022-08-09

Family

ID=69081157

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910954538.9A Active CN110672549B (en) 2019-10-09 2019-10-09 Imaging method and device

Country Status (1)

Country Link
CN (1) CN110672549B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020133704B4 (en) * 2020-12-16 2022-07-07 CiTEX Holding GmbH THz measuring device and THz measuring method for measuring a corrugated pipe
CN114062302A (en) * 2021-09-27 2022-02-18 国网河北省电力有限公司雄安新区供电公司 Distribution network autonomous inspection method for terahertz imaging detection
CN114252395A (en) * 2021-12-15 2022-03-29 中国科学院深圳先进技术研究院 Terahertz measuring device and system
CN114777676B (en) * 2022-05-11 2023-07-04 青岛盛瀚色谱技术有限公司 Self-adaptive terahertz three-dimensional tomography device and method
CN115079167A (en) * 2022-05-23 2022-09-20 电子科技大学 Terahertz continuous wave three-dimensional tomography device and method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130146770A1 (en) * 2011-12-08 2013-06-13 Electronics And Telecommunications Research Institute Terahertz continuous wave system and method of obtaining three-dimensional image thereof
CN109276229B (en) * 2018-08-15 2022-04-15 华中科技大学苏州脑空间信息研究院 Rapid focusing system and method for photoacoustic microscopic imaging
CN109300823B (en) * 2018-11-28 2021-02-09 德淮半导体有限公司 Ultrasonic scanning system and method for ultrasonic scanning of wafer

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105158196A (en) * 2015-05-12 2015-12-16 上海理工大学 Terahertz wave 3D image acquisition method and system
CN107689072A (en) * 2016-06-12 2018-02-13 中慧医学成像有限公司 A kind of 3-D view imaging method and system
CN107510466A (en) * 2016-06-15 2017-12-26 中慧医学成像有限公司 A kind of three-D imaging method and system
CN107631995A (en) * 2016-07-18 2018-01-26 华中科技大学 A kind of three-dimensional terahertz tomographic imaging system and scanning and image rebuilding method
CN109142267A (en) * 2018-09-07 2019-01-04 北京华航无线电测量研究所 A kind of real-time terahertz imaging device and method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Fast Three-Dimensional Image Reconstruction of a Standoff Screening System in the Terahertz Regime; Gao Jingkun et al.; IEEE; 2018-01-31; Vol. 8, No. 1; pp. 38-51 *
Recent Developments in Terahertz Computer-Aided Tomography; Li Yunda et al.; Laser & Infrared; 2012-12-31; Vol. 42, No. 1; pp. 1372-1375 *

Similar Documents

Publication Publication Date Title
CN110672549B (en) Imaging method and device
CN110689581B (en) Structured light module calibration method, electronic device and computer readable storage medium
US20170365068A1 (en) Combining light-field data with active depth data for depth map generation
CN108174180A (en) A kind of display device, display system and 3 D displaying method
CN108007344B (en) Method, storage medium and measuring system for visually representing scan data
CN113330487A (en) Parameter calibration method and device
EP3988895B1 (en) Compensation of three-dimensional measuring instrument having an autofocus camera
US20240087167A1 (en) Compensation of three-dimensional measuring instrument having an autofocus camera
US11175568B2 (en) Information processing apparatus, information processing method, and program as well as in interchangeable lens
US20240371020A1 (en) Systems and methods of measuring an object in a scene of a captured image
JP2010176325A (en) Device and method for generating optional viewpoint image
CN108550171A (en) The line-scan digital camera scaling method containing Eight Diagrams coding information based on Cross ration invariability
CN109084679B (en) A kind of 3D measurement and acquisition device based on spatial light modulator
CN106374224A (en) Electromagnetic wave imaging system and antenna array signal correction method
Ifthekhar et al. Radiometric and geometric camera model for optical camera communications
US20240244330A1 (en) Systems and methods for capturing and generating panoramic three-dimensional models and images
CN211292584U (en) Imaging device
CN109682312B (en) Method and device for measuring length based on camera
Langmann Wide area 2D/3D imaging: development, analysis and applications
US20170371035A1 (en) Protection and guidance gear or equipment with identity code and ip address
CN112630750B (en) Sensor calibration method and sensor calibration device
CN117213373A (en) Three-dimensional point cloud acquisition method
WO2022183906A1 (en) Imaging method and apparatus, device, and storage medium
CN111210471B (en) Positioning method, device and system
CN113340433A (en) Temperature measuring method, temperature measuring device, storage medium, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 2020-09-15

Address after: Room 430, Building 37, Chen Tian Bao Industrial District, Tian Yi Road, Xixiang Street, Bao'an District, Shenzhen, Guangdong Province, 518000

Applicant after: Shenzhen Zhongtou Huaxun Terahertz Technology Co.,Ltd.

Applicant after: SHENZHEN INSTITUTE OF TERAHERTZ TECHNOLOGY AND INNOVATION Co.,Ltd.

Address before: East side of 2nd Floor, Building 37, Chen Tian Bao Industrial District, Tian Yi Road, Xixiang Street, Bao'an District, Shenzhen, Guangdong Province, 518000

Applicant before: Shenzhen Terahertz System Equipment Co.,Ltd.

Applicant before: SHENZHEN INSTITUTE OF TERAHERTZ TECHNOLOGY AND INNOVATION Co.,Ltd.

GR01 Patent grant