CN110390698B - Gray scale sensor module, data processing method thereof and robot - Google Patents
- Publication number: CN110390698B (application number CN201910542122.6A)
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and not a legal conclusion; Google has not performed a legal analysis)
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/90—Determination of colour characteristics
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30108—Industrial image inspection
- G06T2207/30144—Printing quality
Abstract
Embodiments of the invention, applicable to the field of grayscale sensors, provide a grayscale sensor module, a data processing method thereof, and a tracking robot. The method acquires a real-time gray value of a detection object while the object is being detected, judges the depth (shade) of the real-time gray value from its difference value against a first and a second preset gray value, and outputs an identification result according to that depth. It can effectively identify colors of different shades on the detection object, is unaffected by the object's material, printing process, or surface-height variation, suits a variety of usage scenarios, and can adapt to complex environments.
Description
Technical Field
The invention belongs to the technical field of gray level sensors, and particularly relates to a gray level sensor module, a data processing method thereof and a tracking robot.
Background
The grayscale sensor is an analog sensor that detects the shade of a detection object's color by exploiting the fact that objects of different colors reflect light to different degrees. It is commonly used on a tracking robot for ground grayscale detection, black/white line discrimination, color recognition, and the like.
However, existing grayscale sensors cannot effectively identify colors of different shades: the identification result is easily affected by the detection object's material, printing process, and surface-height variation, so the sensor suits only a single usage scenario and cannot adapt to complex environments.
Disclosure of Invention
In view of this, embodiments of the present invention provide a grayscale sensor module, a data processing method thereof, and a tracking robot, to solve the problems that existing grayscale sensors cannot effectively identify colors of different shades, that their identification results are easily affected by the detection object's material, printing process, and surface-height variation, that they suit only a single usage scenario, and that they cannot adapt to complex environments.
A first aspect of an embodiment of the present invention provides a data processing method, which is applied to a grayscale sensor module, and the data processing method includes:
in the process of detecting a detection object, acquiring a real-time gray value of the detection object;
acquiring a difference value between the real-time gray value and first and second preset gray values;
judging the depth of the real-time gray value according to the difference value;
and outputting the identification result of the real-time gray value according to the depth of the real-time gray value.
A second aspect of the embodiments of the present invention provides a grayscale sensor module, including an infrared grayscale sensor, a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the data processing method when executing the computer program, and the computer program includes:
the device comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a real-time gray value of a detection object in the process of detecting the detection object;
the second acquisition module is used for acquiring a difference value between the real-time gray value and a first preset gray value as well as a second preset gray value;
the judging module is used for judging the depth of the real-time gray value according to the difference value;
and the output module is used for outputting the identification result of the real-time gray value according to the depth of the real-time gray value.
According to a third aspect of the embodiments of the present invention, a tracking robot is provided, including the above grayscale sensor module.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the data processing method described above.
In the embodiment of the invention, the real-time gray value of the detection object is acquired while the object is being detected, the depth of the real-time gray value is judged from its difference value against the first and second preset gray values, and the identification result is output according to that depth. Colors of different shades on the detection object can thus be effectively identified, without influence from the object's material, printing process, or surface-height variation, so the method suits a variety of usage scenarios and can adapt to complex environments.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a data processing method according to an embodiment of the present invention;
FIG. 2 is a block diagram of a data processing system according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a grayscale sensor module according to a third embodiment of the invention.
Detailed Description
To make the technical solutions of the present invention better understood by those skilled in the art, the technical solutions in the embodiments are described clearly below with reference to the accompanying drawings. Obviously, the described embodiments are some, but not all, embodiments of the present invention. All other embodiments derived by a person of ordinary skill in the art from these embodiments without creative effort shall fall within the protection scope of the present invention.
The terms "comprises" and "comprising," and any variations thereof, in the description and claims of this invention and the above-described drawings are intended to cover non-exclusive inclusions. For example, a process, method, or system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus. Furthermore, the terms "first" and "second," etc. are used to distinguish between different objects and are not used to describe a particular order.
Example one
The embodiment provides a data processing method, which is applied to a gray sensor module of a tracking robot, wherein the gray sensor module may include a gray sensor, an analog-to-digital converter, a processor, a memory and the like, and the data processing method may be a software program method executed by the processor.
In a specific application, the grayscale sensor is specifically an infrared grayscale sensor, and the analog-to-digital converter and the memory can be internal functional modules of the processor.
As shown in fig. 1, the data processing method provided in this embodiment includes:
step S101, in the process of detecting a detection object, acquiring a real-time gray value of the detection object.
In a specific application, the detection object may be any physical object that reflects light so that a gray value can be detected, for example, a ground surface, or a map, drawing, or picture of any material and printing process. The real-time gray value is the gray value acquired by the grayscale sensor module in real time at whatever height it currently sits above the detection object. When detection starts, the height of the module above the initial detection position is a known initial height; during detection, the height above the current detection position changes with the surface texture, wrinkles, or terrain of the detection object. The initial height may be set according to actual needs, for example as close as possible to the surface of the detection object, the ideal value being 0.
In one embodiment, step S101 includes:
and in the process of detecting the detection object, acquiring the real-time gray value of the detection object through the infrared gray sensor.
In one embodiment, in the process of detecting a detection object, acquiring a real-time gray scale value of the detection object by the infrared gray scale sensor includes:
in the process of detecting a detection object, acquiring an analog signal of a gray value of the detection object through the infrared gray sensor;
converting the analog signal of the gray value of the detection object into a digital signal through the analog-to-digital converter to obtain a real-time gray value of the detection object;
and storing the real-time gray value through the memory.
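The acquire, convert, and store steps above can be sketched as follows. This is a minimal illustration rather than the patented implementation: the 10-bit ADC full scale, the linear scaling in `convert`, and the class and method names are all assumptions, since the patent does not specify a concrete interface.

```python
class GraySensorPipeline:
    """Illustrative acquire -> convert -> store flow for the module.

    The sensor/ADC interface is hypothetical; readings arrive here as
    raw ADC counts.
    """

    def __init__(self, adc_max=1023, gray_max=255):
        self.adc_max = adc_max    # assumed 10-bit ADC full scale
        self.gray_max = gray_max  # 8-bit gray scale
        self.stored = []          # stands in for the module's memory

    def convert(self, analog_reading):
        """Map a raw ADC reading to an 8-bit gray value (the A/D step)."""
        return round(analog_reading / self.adc_max * self.gray_max)

    def acquire(self, analog_reading):
        """Convert one reading and store it, returning the gray value."""
        gray = self.convert(analog_reading)
        self.stored.append(gray)  # "storing the real-time gray value"
        return gray


pipeline = GraySensorPipeline()
print(pipeline.acquire(1023))  # full-scale reading -> 255
print(pipeline.acquire(0))     # -> 0
```

The same structure applies whatever the ADC resolution; only `adc_max` changes.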
Step S102, acquiring a difference value between the real-time gray value and first and second preset gray values.
In a specific application, the first and second preset gray values are gray values of two specific objects of different colors, acquired in advance by the grayscale sensor module at a preset height. The preset height may be any height set according to actual needs, or a height as close as possible to the specific object, the ideal value being 0. The two specific objects may be the same or different objects; the two colors are any two colors whose gray values differ enough for the grayscale sensor module to distinguish them, such as black and white.
In one embodiment, before step S102, the method includes:
and acquiring gray values of two specific objects with different colors to obtain the first preset gray value and the second preset gray value.
In one embodiment, obtaining the gray values of the specific objects of two different colors to obtain the first preset gray value and the second preset gray value includes:
and acquiring gray values of the two objects with different colors at preset height positions away from the two objects with different colors by using an infrared gray sensor to obtain the first preset gray value and the second preset gray value.
In a specific application, the difference value is a smaller one of an absolute value of a difference between the real-time gray-scale value and the first preset gray-scale value and an absolute value of a difference between the real-time gray-scale value and the second preset gray-scale value.
In one embodiment, step S102 includes:
and acquiring gray values of two specific objects with different colors through an infrared gray sensor to obtain the first preset gray value and the second preset gray value.
And acquiring a difference value between the real-time gray value and the first and second preset gray values through an adaptive algorithm.
In one embodiment, the calculation formula for obtaining the difference value between the real-time gray value and the first preset gray value and the second preset gray value through the adaptive algorithm is as follows:
Difference=min(abs(readData-recordData1),abs(readData-recordData2));
wherein Difference denotes the difference value; the min(x, y) function returns the smaller of x and y; the abs() function returns the absolute value of its argument; readData denotes the real-time gray value; recordData1 denotes the first preset gray value; and recordData2 denotes the second preset gray value.
In a specific application, abs (readData-recordData1) represents an absolute value of readData-recordData1, abs (readData-recordData2) represents an absolute value of readData-recordData2, and Difference min (abs (readData-recordData1), abs (readData-recordData2)) represents the smaller of Difference abs (readData-recordData1) and abs (readData-recordData 2).
In one embodiment, step S102 includes:
and running a self-adaptive algorithm program stored in the memory through the processor to acquire a difference value between the real-time gray value and a first preset gray value and a second preset gray value.
Step S103, judging the depth of the real-time gray value according to the difference value.
In a specific application, the material, printing process, and surface wrinkles of the detection object all affect the identified gray value of a given color on the object, so gray values of the same color need to be distinguished according to the difference value. For example, the gray values of a black detection object printed in single-color black and one printed in four-color black can be distinguished by their difference values. Depth here refers to the magnitude of the gray value: a deep (dark) gray value means the gray value is large, and a shallow (light) one means it is small.
In one embodiment, step S103 includes:
comparing the difference value with a preset difference threshold value;
when the difference value is smaller than or equal to the preset difference threshold value, judging that the real-time gray value is a dark gray value;
and when the difference value is larger than the preset difference threshold value, judging that the real-time gray value is a light-color gray value.
In a specific application, the preset difference threshold may be set according to the characteristics of the grayscale sensor. For example, a preset difference threshold corresponding to the infrared gray sensor may be set to 16.
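The comparison in the steps above reduces to a one-line classifier. The sketch below uses the example threshold of 16 given for the infrared grayscale sensor; the function name and the string labels are illustrative choices, not from the patent.

```python
def judge_depth(diff_value, threshold=16):
    """Classify per the steps above: a difference value at or below the
    preset threshold means a dark (deep) gray value, otherwise light."""
    return "dark" if diff_value <= threshold else "light"


print(judge_depth(16))  # boundary case: <= threshold -> dark
print(judge_depth(17))  # -> light
```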
In one embodiment, when the real-time gray-scale value is a black gray-scale value, the dark gray-scale value is a monochrome black gray-scale value, and the light gray-scale value is a four-color black gray-scale value.
In a specific application, single-color black refers to pure black printed with C=0, M=0, Y=0, K=100 in CMYK mode, and four-color black refers to black printed by overlaying C=100, M=100, Y=100, K=100, where C denotes Cyan, M Magenta, Y Yellow, and K blacK; the gray value of single-color black is larger than that of four-color black.
Step S104, outputting the identification result of the real-time gray value according to the depth of the real-time gray value.
In a specific application, the identification result may be produced in any of three ways: (1) when the real-time gray value is deep, output a gray value larger than it, and when it is shallow, output a gray value smaller than it; (2) when it is deep, output the real-time gray value itself, and when it is shallow, output a smaller gray value; or (3) when it is deep, output a larger gray value, and when it is shallow, output the real-time gray value itself. Any scheme suffices as long as gray values of parts of the detection object that display the same color but differ in material or printing process can be distinguished from each other.
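As one illustration, the first of the three output schemes can be sketched as follows. The offset of 32 and the clamping to the 8-bit range [0, 255] are assumptions: the patent only requires that the outputs for dark and light readings of the same color become distinguishable.

```python
def output_result(real_time_gray, depth, offset=32):
    """First scheme: report a larger value for a dark reading and a
    smaller value for a light one, clamped to the 8-bit range.

    `offset` is an illustrative separation amount, not a patent value.
    """
    if depth == "dark":
        return min(real_time_gray + offset, 255)  # deep -> larger output
    return max(real_time_gray - offset, 0)        # shallow -> smaller output


print(output_result(100, "dark"))   # -> 132
print(output_result(100, "light"))  # -> 68
```

With this mapping, two readings of nominally the same color that classify as dark and light end up at least 2*offset apart, which is what lets the host distinguish them.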
In a specific application, when the grayscale sensor module is applied to a tracking robot, a host or a main processor of the tracking robot tracks a detection object according to a real-time grayscale value output by the grayscale sensor module.
In this embodiment, the real-time gray value of the detection object is acquired while the object is being detected, the depth of the real-time gray value is judged from its difference value against the first and second preset gray values, and the identification result is output according to that depth. Colors of different shades on the detection object can thus be effectively identified, without influence from the object's material, printing process, or surface-height variation; the method suits a variety of usage scenarios and can adapt to complex environments.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Example two
The present embodiment provides a data processing system for performing the method steps of the first embodiment, which may be a software program system stored in the memory of the gray sensor module for execution by the processor.
As shown in fig. 2, the data processing system 2 provided in the present embodiment includes:
a first obtaining module 201, configured to obtain a real-time gray value of a detection object in a process of detecting the detection object;
a second obtaining module 202, configured to obtain a difference value between the real-time gray value and the first preset gray value and the second preset gray value;
the judging module 203 is configured to judge the depth of the real-time gray value according to the difference value;
and the output module 204 is configured to output the identification result of the real-time gray value according to the depth of the real-time gray value.
In a specific application, each module in the data processing system may be a software program module stored in a memory of the grayscale sensor module and executed by a processor, or may be a hardware functional module in the processor, and each module may be implemented by an independent processor, or may be integrated together into one processor.
In one embodiment, the first obtaining module is specifically configured to:
and in the process of detecting the detection object, acquiring the real-time gray value of the detection object through the infrared gray sensor.
In one embodiment, the first obtaining module is specifically configured to:
in the process of detecting a detection object, acquiring an analog signal of a gray value of the detection object through the infrared gray sensor;
converting the analog signal of the gray value of the detection object into a digital signal through the analog-to-digital converter to obtain a real-time gray value of the detection object;
and storing the real-time gray value through the memory.
In one embodiment, the data processing system further comprises:
and the third acquisition module is used for acquiring the gray values of the specific objects with two different colors to obtain the first preset gray value and the second preset gray value.
In one embodiment, the third obtaining module is specifically configured to:
and acquiring gray values of the two objects with different colors at preset height positions away from the two objects with different colors to obtain the first preset gray value and the second preset gray value.
In one embodiment, the third obtaining module is specifically configured to:
and acquiring gray values of two specific objects with different colors through an infrared gray sensor to obtain the first preset gray value and the second preset gray value.
In one embodiment, the second obtaining module is specifically configured to:
and acquiring a difference value between the real-time gray value and a first preset gray value and a second preset gray value through a self-adaptive algorithm.
In one embodiment, the second obtaining module is specifically configured to:
and running a self-adaptive algorithm program stored in the memory through the processor to acquire a difference value between the real-time gray value and a first preset gray value and a second preset gray value.
In one embodiment, the determining module is specifically configured to:
comparing the difference value with a preset difference threshold value;
when the difference value is smaller than or equal to the preset difference threshold value, judging that the real-time gray value is a dark gray value;
and when the difference value is larger than the preset difference threshold value, judging that the real-time gray value is a light-color gray value.
In this embodiment, the real-time gray value of the detection object is acquired while the object is being detected, the depth of the real-time gray value is judged from its difference value against the first and second preset gray values, and the identification result is output according to that depth. Colors of different shades on the detection object can thus be effectively identified, without influence from the object's material, printing process, or surface-height variation; the system suits a variety of usage scenarios and can adapt to complex environments.
EXAMPLE III
As shown in fig. 3, the present embodiment provides a gray sensor module 3 including: an infrared gray-scale sensor 31, an analog-to-digital converter 32, a processor 33, a memory 34, and a computer program 35, such as a data processing program, stored in the memory 34 and executable on the processor 33. When executing the computer program 35, the processor 33 implements the steps in the data processing method embodiments above, such as steps S101 to S104 shown in fig. 1; alternatively, it implements the functions of the modules in the system embodiment, such as modules 201 to 204 shown in fig. 2.
As shown in fig. 3, the ir gray scale sensor 31 is communicatively coupled to the analog-to-digital converter 32, the analog-to-digital converter 32 is communicatively coupled to the processor 33, and the processor 33 is communicatively coupled to the memory 34. The analog-to-digital converter 32 and the memory 34 may also be an internal hardware structure or a software functional module of the processor 33, and in the present embodiment, the analog-to-digital converter 32 and the memory 34 are exemplarily shown to be a hardware structure independent of the processor 33.
Illustratively, the computer program 35 may be partitioned into one or more modules that are stored in the memory 34 and executed by the processor 33 to implement the present invention. The one or more modules may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 35 in the grayscale sensor module 3. For example, the computer program 35 may be divided into a first acquiring module, a second acquiring module, a judging module and an outputting module, and each module has the following specific functions:
the device comprises a first acquisition module, a second acquisition module and a third acquisition module, wherein the first acquisition module is used for acquiring a real-time gray value of a detection object in the process of detecting the detection object;
the second acquisition module is used for acquiring a difference value between the real-time gray value and a first preset gray value as well as a second preset gray value;
the judging module is used for judging the depth of the real-time gray value according to the difference value;
and the output module is used for outputting the identification result of the real-time gray value according to the depth of the real-time gray value.
The grayscale sensor module may include, but is not limited to, the infrared grayscale sensor 31, the processor 33, and the memory 34. It will be understood by those skilled in the art that fig. 3 is merely an example of the gray sensor module 3 and does not constitute a limitation of it; the module may include more or fewer components than shown, combine some components, or use different components. For example, the grayscale sensor module may further include an input-output device, a network access device, a bus, etc.
The Processor 33 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 34 may be an internal storage unit of the grayscale sensor module 3, such as a hard disk or a memory of the grayscale sensor module 3. The memory 34 may also be an external storage device of the grayscale sensor module 3, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), or the like provided on the grayscale sensor module 3. Further, the memory 34 may also include both an internal storage unit and an external storage device of the gray sensor module 3. The memory 34 is used to store the computer program and other programs and data required by the grayscale sensor module. The memory 34 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided by the present invention, it should be understood that the disclosed grayscale sensor module and method may be implemented in other ways. For example, the grayscale sensor module embodiment described above is merely illustrative: the division into modules or units is only a logical division, and other divisions are possible in an actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or in another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated module, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments may be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the protection scope of the present invention.
Claims (8)
1. A data processing method, applied to a grayscale sensor module of a tracking robot, the method comprising the following steps:
in the process of detecting a detection object, acquiring a real-time gray value of the detection object;
obtaining, through an adaptive algorithm, a difference value between the real-time gray value and each of a first preset gray value and a second preset gray value;
determining the depth (dark or light) of the real-time gray value according to the difference value, wherein the difference value is used to distinguish dark and light shades of the same color;
outputting the identification result of the real-time gray value according to the depth of the real-time gray value;
the formula for obtaining the difference value between the real-time gray value and the first and second preset gray values through the adaptive algorithm is:
Difference=min(abs(readData-recordData1),abs(readData-recordData2));
wherein Difference represents the difference value; the min(x, y) function returns the smaller of x and y; abs() is the absolute-value function; readData represents the real-time gray value; recordData1 represents the first preset gray value; and recordData2 represents the second preset gray value.
2. The data processing method of claim 1, wherein determining the depth of the real-time gray-level value according to the difference value comprises:
comparing the difference value with a preset difference threshold value;
when the difference value is smaller than or equal to the preset difference threshold value, determining that the real-time gray value is a dark gray value; and
when the difference value is larger than the preset difference threshold value, determining that the real-time gray value is a light gray value.
3. The data processing method of claim 2, wherein, when the real-time gray value is a black gray value, the dark gray value is a single-color black gray value and the light gray value is a four-color black gray value.
4. The data processing method of claim 1, wherein before acquiring the real-time gray-scale value of the detection object during the detection of the detection object, the method comprises:
acquiring gray values of two specific objects with different colors, to obtain the first preset gray value and the second preset gray value.
5. The data processing method of claim 4, wherein acquiring the gray values of the two specific objects with different colors to obtain the first preset gray value and the second preset gray value comprises:
acquiring the gray values of the two objects with different colors at a preset height from the two objects, to obtain the first preset gray value and the second preset gray value.
6. The data processing method of claim 1, wherein the grayscale sensor module includes an infrared grayscale sensor;
in the process of detecting a detection object, acquiring a real-time gray value of the detection object, including:
in the process of detecting the detection object, acquiring the real-time gray value of the detection object through the infrared grayscale sensor.
7. A grayscale sensor module for a tracking robot, comprising an infrared grayscale sensor, an analog-to-digital converter, a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the data processing method according to any one of claims 1 to 6, the computer program comprising:
a first acquisition module, configured to acquire a real-time gray value of a detection object in the process of detecting the detection object;
a second acquisition module, configured to obtain, through an adaptive algorithm, a difference value between the real-time gray value and each of the first preset gray value and the second preset gray value;
a judging module, configured to determine the depth of the real-time gray value according to the difference value, wherein the difference value is used to distinguish dark and light shades of the same color; and
an output module, configured to output the identification result of the real-time gray value according to the depth of the real-time gray value;
wherein the formula for obtaining the difference value between the real-time gray value and the first and second preset gray values through the adaptive algorithm is:
Difference=min(abs(readData-recordData1),abs(readData-recordData2));
wherein Difference represents the difference value; the min(x, y) function returns the smaller of x and y; abs() is the absolute-value function; readData represents the real-time gray value; recordData1 represents the first preset gray value; and recordData2 represents the second preset gray value.
8. A tracking robot comprising the grayscale sensor module of claim 7.
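The classification procedure recited in claims 1, 2, and 4 can be summarized as a short Python sketch. This is an illustration only, not the patented implementation: the function name `classify_gray` and the sample calibration numbers are hypothetical, while the variable names `readData`, `recordData1`, `recordData2`, and the difference threshold follow the claims.

```python
def classify_gray(read_data: int, record_data1: int, record_data2: int,
                  diff_threshold: int) -> str:
    """Classify a real-time gray value as 'dark' or 'light'.

    record_data1 and record_data2 are preset gray values recorded in
    advance from two reference objects of different colors (claim 4);
    diff_threshold is the preset difference threshold of claim 2.
    """
    # Claim 1: Difference = min(abs(readData - recordData1),
    #                           abs(readData - recordData2))
    difference = min(abs(read_data - record_data1),
                     abs(read_data - record_data2))
    # Claim 2: difference <= threshold -> dark gray value,
    #          difference >  threshold -> light gray value
    return "dark" if difference <= diff_threshold else "light"
```

For example, with hypothetical presets recordData1 = 30 (dark line) and recordData2 = 200 (light floor) and a threshold of 40, a reading near either preset (e.g. 35 or 190) is judged "dark"-or-"light"-consistent with that preset's shade class, while a reading far from both (e.g. 120) yields a large minimum difference and is judged "light".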
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910542122.6A CN110390698B (en) | 2019-06-21 | 2019-06-21 | Gray scale sensor module, data processing method thereof and robot |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910542122.6A CN110390698B (en) | 2019-06-21 | 2019-06-21 | Gray scale sensor module, data processing method thereof and robot |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110390698A CN110390698A (en) | 2019-10-29 |
CN110390698B true CN110390698B (en) | 2021-09-17 |
Family
ID=68285643
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910542122.6A Active CN110390698B (en) | 2019-06-21 | 2019-06-21 | Gray scale sensor module, data processing method thereof and robot |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110390698B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112697167A (en) * | 2020-11-23 | 2021-04-23 | 深圳市越疆科技有限公司 | Threshold adjusting method of infrared tracking sensor and electronic equipment |
CN113183608B (en) * | 2021-04-23 | 2022-04-22 | 广州诚鼎机器人有限公司 | Slurry supplementing equipment and elliptical printing machine |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102568027A (en) * | 2011-12-28 | 2012-07-11 | 浙江工业大学 | Pixelate virtual tree illumination influenced area obtaining method |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102890780B (en) * | 2011-07-19 | 2015-07-22 | 富士通株式会社 | Image processing device and image processing method |
CN104766089A (en) * | 2014-01-08 | 2015-07-08 | 富士通株式会社 | Method and device for detecting Zebra crossing in image and electronic equipment |
CN103760903A (en) * | 2014-01-20 | 2014-04-30 | 昆山鑫盛盟创科技有限公司 | Intelligent tracking conveying management system for warehouses |
CN104932507B (en) * | 2015-06-09 | 2017-11-07 | 北京联合大学 | A kind of night patrol machine people automatic tracking method |
CN105700531B (en) * | 2016-04-18 | 2019-03-01 | 南京工程学院 | Two layers of work sweeping robot of household based on customized map and its method of sweeping the floor |
US10032276B1 (en) * | 2016-08-29 | 2018-07-24 | PerceptIn, Inc. | Visual-inertial positional awareness for autonomous and non-autonomous device |
US10444761B2 (en) * | 2017-06-14 | 2019-10-15 | Trifo, Inc. | Monocular modes for autonomous platform guidance systems with auxiliary sensors |
CN107256667A (en) * | 2017-07-28 | 2017-10-17 | 火星人视野(北京)教育科技有限公司 | A kind of tracking cart for teaching demonstration |
CN208126200U (en) * | 2018-05-08 | 2018-11-20 | 深圳市优必选科技有限公司 | Intelligent terminal and grey scale tracking sensing module and grey scale sensor thereof |
- 2019-06-21 CN CN201910542122.6A patent/CN110390698B/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102568027A (en) * | 2011-12-28 | 2012-07-11 | 浙江工业大学 | Pixelate virtual tree illumination influenced area obtaining method |
Also Published As
Publication number | Publication date |
---|---|
CN110390698A (en) | 2019-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8055066B2 (en) | Apparatus, system, and method for skin tone detection in a CMOS image sensor | |
CN104052979B (en) | For device and the technology of image processing | |
Banić et al. | Improving the white patch method by subsampling | |
US20140247979A1 (en) | Method and device for generating high dynamic range images | |
JP5815878B2 (en) | Print defect detection | |
CN110390698B (en) | Gray scale sensor module, data processing method thereof and robot | |
CN109495731B (en) | Method for automatic white balance executed by image signal processor | |
WO2006134923A1 (en) | Image processing device, computer program product, and image processing method | |
CN106815587B (en) | Image processing method and device | |
CN113316711B (en) | Method and apparatus for estimating ambient light | |
US8995730B2 (en) | Image processing apparatus for analyzing and enhancing fingerprint images | |
KR101116682B1 (en) | Image forming apparatus and control method thereof | |
CN111626967A (en) | Image enhancement method, image enhancement device, computer device and readable storage medium | |
CN110070080A (en) | A kind of character detecting method and device, equipment and computer readable storage medium | |
CN108805838A (en) | A kind of image processing method, mobile terminal and computer readable storage medium | |
CN107343188A (en) | image processing method, device and terminal | |
CN116310524A (en) | Classification component information processing application system | |
CN107424134B (en) | Image processing method, image processing device, computer-readable storage medium and computer equipment | |
KR20150107581A (en) | Image processing apparatus and image processing method | |
US20150055858A1 (en) | Systems and methods for color recognition in computer vision systems | |
CN113658141A (en) | Transparent packaging bag sealing identification method and device, storage medium and electronic equipment | |
Tran et al. | An adaptive method for lane marking detection based on HSI color model | |
CN110570347B (en) | Color image graying method for lane line detection | |
Shams-Nateri et al. | Computer vision techniques for measuring and demonstrating color of textile | |
CN112699760A (en) | Face target area detection method, device and equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||