CN117474926B - Image detection method and device - Google Patents
- Publication number
- CN117474926B CN117474926B CN202311826441.2A CN202311826441A CN117474926B CN 117474926 B CN117474926 B CN 117474926B CN 202311826441 A CN202311826441 A CN 202311826441A CN 117474926 B CN117474926 B CN 117474926B
- Authority
- CN
- China
- Prior art keywords
- image
- information
- electronic device
- flow
- photographing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Abstract
The embodiment of the application provides an image detection method and device, relating to the technical field of terminals. The method comprises the following steps: in response to a photographing operation, the first electronic device caches first information, wherein the first information comprises an image acquired by a camera, metadata information of the image, and scene perception information at the time the image is photographed; the image and the metadata information are used for simulating the original image at the time of shooting, and the scene perception information is used for simulating the environment at the time of shooting; when the flow of processing the image errs, the first information is stored locally on the first electronic device. In this way, scene reproduction of the photographing operation of the first electronic device can be realized based on the first information stored locally on the first electronic device, and an abnormality occurring in the photographing operation can then be located during the scene reproduction.
Description
Technical Field
The present application relates to the field of terminal technologies, and in particular, to an image detection method and apparatus.
Background
With the development of electronic devices, some electronic devices include a camera and can support functions such as photographing. When such an electronic device obtains a photographed image, it performs some algorithm processing, such as loading effect parameters.
However, on some electronic devices the photographed image may exhibit an abnormality, and the cause of the abnormality cannot be determined.
Disclosure of Invention
The embodiment of the application provides an image detection method and device, which provide information for locating image abnormality problems.
In a first aspect, an embodiment of the present application provides an image detection method. The method comprises the following steps: in response to a photographing operation, caching first information, wherein the first information comprises an image acquired by a camera, metadata information of the image, and scene perception information at the time the image is photographed; the image and the metadata information are used for simulating the original image at the time of shooting, and the scene perception information is used for simulating the environment at the time of shooting; when the flow of processing the image errs, the first information is stored locally on the first electronic device. In this way, scene reproduction of the photographing operation of the first electronic device can be realized based on the first information stored locally on the first electronic device, and an abnormality occurring in the photographing operation can then be located during the scene reproduction.
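As a non-authoritative sketch of the first-aspect method (class and field names are illustrative, not taken from the patent), the cache-then-persist-on-error behavior can be outlined as:

```python
from dataclasses import dataclass


@dataclass
class FirstInformation:
    """Cached per-shot data: enough to replay the capture later."""
    raw_image: bytes        # image acquired by the camera
    metadata: dict          # exposure, ISO, timestamps, ...
    scene_perception: dict  # gyro, lux, temperature, distance, mode decision


class PhotoSession:
    def __init__(self):
        self._cache = None  # first information lives in a volatile cache

    def on_photograph(self, raw_image, metadata, scene_perception):
        # Step 1: in response to the photographing operation, cache first info.
        self._cache = FirstInformation(raw_image, metadata, scene_perception)

    def on_pipeline_result(self, error: bool, local_storage: list):
        # Step 2: only when the processing flow errs is the cached first
        # information persisted locally; normal shots never leave the cache.
        if error and self._cache is not None:
            local_storage.append(self._cache)
```

The key design point is that persistence is conditional: the cache is cheap and transient, and only anomalous captures generate a local record.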
In one possible implementation, the scene perception information includes one or more of the following: decision information for deciding the algorithm mode used for photographing, information collected by a sensor for anti-shake processing, information collected by a sensor for brightness sensing, information collected by a sensor for temperature sensing, or information collected by a sensor for distance sensing. It can be understood that this is information that may affect image quality in the shooting scene, and saving it facilitates restoring the scene in which the abnormal image was captured.
In one possible implementation, the first information includes decision information for deciding the mode of the algorithm used for photographing. It will be appreciated that such decision information may affect the quality of the image, and saving it facilitates restoring the scene in which the abnormal image was captured.
In one possible implementation, the flow of processing the image includes one or more of the following: a decompression flow, a noise reduction flow, a sharpness improvement flow, a highlight compression flow, a face region processing flow, a blurring flow, a demosaicing flow, a dead pixel removal flow, a white balance flow, an automatic exposure (AE) control flow, an automatic focusing (AF) control flow, a sharpening flow, a color correction flow, a color enhancement flow, or an image compression flow. It can be understood that these processing flows are the ones prone to error in image processing, so detecting whether the images output by these flows are abnormal facilitates accurately determining the cause of the abnormality later.
In one possible implementation, before storing the first information locally on the first electronic device, the method further includes: after a first flow is executed, performing abnormality detection on the processed image obtained from the first flow; when an abnormality is detected in the processed image, the first flow is indicated as having erred, where the first flow is any one of the flows of processing the image. In this way, the cause of the image abnormality can be determined more easily later.
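A minimal sketch of this per-flow detection, assuming the pipeline is a list of named stages and the quality check is a boolean predicate (all names here are illustrative):

```python
def run_pipeline(image, stages, iqa_check):
    """Run named processing flows in order; after each flow, an image-quality
    check flags the first flow whose output is anomalous (illustrative)."""
    for name, stage in stages:
        image = stage(image)
        if not iqa_check(image):
            return image, name  # this flow "erred": trigger the local dump
    return image, None


# Toy usage: represent an image as a list of pixel values.
stages = [
    ("noise_reduction", lambda img: [max(p - 1, 0) for p in img]),
    ("sharpness_improvement", lambda img: [p * 0 for p in img]),  # buggy stage
]
iqa = lambda img: any(p > 0 for p in img)  # "all-black" counts as anomalous
_, bad_flow = run_pipeline([5, 6, 7], stages, iqa)
# bad_flow == "sharpness_improvement"
```

Checking after every flow, rather than only at the end, is what lets the method name the specific flow that introduced the anomaly.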
In one possible implementation, the first flow includes one or more of the following: a noise reduction flow, a sharpness improvement flow, a highlight compression flow, or a face region processing flow. It can be understood that these processing flows err most frequently in practice, and setting the image quality assessment (IQA) module for these flows allows abnormal images to be identified accurately with only a small amount of detection.
In one possible implementation manner, a dump switch is arranged in the first electronic device; when the dump switch is in the on state, the first electronic device allows the data in the cache to be stored locally on the first electronic device, and when the dump switch is in the off state, the first electronic device does not allow the data in the cache to be stored locally. Storing the first information locally on the first electronic device includes: storing the first information locally on the first electronic device when the dump switch is in the on state. In this way, a user can control the dump switch: when the switch is on, data in the first electronic device's cache is stored locally, and when it is off, the data is not stored locally. This adapts to user requirements and better protects user privacy.
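The dump-switch gating described above can be sketched as follows (a hypothetical illustration; the patent does not specify an API):

```python
class DumpController:
    """Gate persisting cached capture data behind a user-controlled switch."""

    def __init__(self, dump_switch_on: bool = False):
        self.dump_switch_on = dump_switch_on
        self.local_store = []

    def try_dump(self, first_information) -> bool:
        # Data leaves the cache only while the dump switch is on, so a
        # privacy-conscious user can disable local persistence entirely.
        if self.dump_switch_on:
            self.local_store.append(first_information)
            return True
        return False
```

The switch is checked at dump time rather than at capture time, so toggling it never affects normal photographing, only whether diagnostic data is written out.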
In one possible implementation, after storing the first information locally on the first electronic device, the method further includes: obtaining a photographed image; and, when an operation for uploading the photographed image to the server is received, uploading both the first information and the photographed image to the server. In this way, the device used for scene reproduction can download the first information and the photographed image of the first electronic device through the server; when the user of the first electronic device wants the image abnormality resolved, the user can operate the first electronic device online, without establishing a communication connection between the first electronic device and the scene reproduction device. Furthermore, the server can collect abnormal images and related information from multiple users for big data analysis, which helps optimize the photographing software.
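A sketch of the piggyback upload (the payload shape and the stand-in `server` list are hypothetical, not from the patent):

```python
def on_user_upload(photo, first_information, server: list):
    """When the user uploads the abnormal photo, attach the cached first
    information so a reproduction device can fetch both from the server."""
    payload = {"photo": photo, "first_information": first_information}
    server.append(payload)  # stand-in for a real upload call
    return payload
```

Bundling the diagnostic data with a user-initiated upload is what removes the need for a direct connection between the handset and the reproduction device.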
In one possible implementation, after storing the first information locally on the first electronic device, the method further includes: simulating the photographing flows using the first information to obtain the processing result of each photographing flow; in the simulated photographing flow, a photographing strategy is determined based on the scene perception information in the first information, and the image acquired by the camera is replaced by an original image obtained from the image, the scene perception information, and the metadata information in the first information. It can be understood that the second electronic device may use the first information to determine the photographing strategy required to reproduce the abnormal image and to restore the scene information at the time the abnormal image was photographed. Thus, the cause of the image abnormality can be determined more accurately.
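The two substitutions described above — policy decided from saved scene perception, live camera frame replaced by a reconstructed original — can be sketched like this (function names, dictionary keys, and the trivial `reconstruct_raw` are illustrative assumptions):

```python
def reconstruct_raw(image, metadata, scene):
    # Placeholder: undo any on-device packing using the saved metadata.
    return list(image)


def simulate_photographing(first_information, pipeline_for_mode):
    """Replay a capture offline: the photographing strategy is decided from
    the saved scene-perception info, and the live camera frame is replaced
    by the original image rebuilt from the saved image + metadata."""
    scene = first_information["scene_perception"]
    mode = scene.get("algorithm_mode", "default")  # decision info -> policy
    image = reconstruct_raw(first_information["image"],
                            first_information["metadata"], scene)
    results = {}
    for name, stage in pipeline_for_mode(mode):
        image = stage(image)
        results[name] = image  # per-flow processing result, for inspection
    return results
```

Because every flow's intermediate output is recorded, the reproduction device can compare each stage against expectations instead of only inspecting the final image.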
In one possible implementation, simulating the photographing procedure using the first information is performed by the first electronic device or the second electronic device. It can be appreciated that the first electronic device may be an electronic device that captures an abnormal image, and the photographing procedure may be simulated by using the first electronic device. And under the condition that the first electronic equipment cannot simulate the photographing flow, the photographing flow can be simulated by the second electronic equipment. Therefore, the photographing process can be flexibly simulated by using the first electronic device or the second electronic device according to the actual available condition of the first electronic device or the second electronic device.
In one possible implementation, simulating the photographing flow using the first information is performed by the second electronic device, and before the photographing flow is simulated using the first information, the method includes: converting the first information into a form that matches the version of the second electronic device; and importing the converted first information into the second electronic device. It will be appreciated that the first information is obtained by the first electronic device, while simulating the photographing flow using the first information is performed by the second electronic device. The first electronic device and the second electronic device may be electronic devices of different models, so the first information needs to be converted into a form that the version of the second electronic device can match. In this way, the second electronic device can simulate the photographing flow using the converted first information and reproduce the image abnormality.
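As an illustrative sketch of the conversion-before-import step (the field names and the version table are hypothetical; the patent does not specify a format):

```python
def convert_for_device(first_information: dict, target_version: str) -> dict:
    """Map saved fields to the names the second device's software version
    expects before importing (the mapping table is illustrative)."""
    field_map = {
        "1.0": {"image": "raw",
                "metadata": "meta",
                "scene_perception": "scene"},
    }
    mapping = field_map.get(target_version)
    if mapping is None:
        raise ValueError(f"no conversion rule for version {target_version}")
    return {mapping[k]: v for k, v in first_information.items() if k in mapping}
```

A failed lookup raises rather than silently importing mismatched data, since simulating with wrongly mapped fields would make the reproduction meaningless.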
In one possible implementation manner, after simulating the photographing flows using the first information to obtain the processing result of each flow, the method further includes: modifying the flow of simulating photographing with the first information; and performing simulated photographing again with the first information and the modified simulated photographing flow, until the result of the simulated photographing is no longer abnormal. It can be understood that the second electronic device can simulate the photographing flow using the first information to locate the cause of the image abnormality, and on that basis can correct the abnormality to obtain a normal image. In this way, the efficiency of handling image abnormalities is improved.
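The modify-and-resimulate loop can be sketched as below (a hypothetical outline; how modifications are generated is not specified by the patent):

```python
def locate_and_fix(first_information, simulate, candidate_modifications,
                   is_normal):
    """Re-run the simulated photographing flow with successive modifications
    until the simulated result is no longer anomalous; the modification that
    fixes it points at the cause of the original anomaly (illustrative)."""
    for modification in candidate_modifications:
        result = simulate(first_information, modification)
        if is_normal(result):
            return modification, result
    return None, None
```

Usage with toy stand-ins, where "simulating" is just adding an offset and "normal" means reaching zero:

```python
simulate = lambda fi, mod: fi + mod
mod, result = locate_and_fix(10, simulate, [-5, -10], lambda r: r == 0)
# mod == -10, result == 0
```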
In a second aspect, an embodiment of the present application provides an image detection apparatus, where the image detection apparatus may be an electronic device, or may be a chip or a chip system in the electronic device. The image detection apparatus may include a display unit and a processing unit. When the image detection apparatus is an electronic device, the display unit may be a display screen. The display unit is configured to perform the step of displaying to enable the electronic device to implement an image detection method as described in the first aspect or any one of the possible implementations of the first aspect. When the image detection apparatus is an electronic device, the processing unit may be a processor. The image detection apparatus may further include a storage unit, which may be a memory. The storage unit is configured to store instructions, and the processing unit executes the instructions stored in the storage unit, so that the electronic device implements an image detection method described in the first aspect or any one of possible implementation manners of the first aspect. When the image detection means is a chip or a system of chips within an electronic device, the processing unit may be a processor. The processing unit executes the instructions stored by the storage unit to cause the electronic device to implement an image detection method as described in the first aspect or any one of the possible implementations of the first aspect. The memory unit may be a memory unit (e.g., a register, a cache, etc.) within the chip, or a memory unit (e.g., a read-only memory, a random access memory, etc.) within the electronic device that is external to the chip.
The processing unit is used for responding to photographing operation, and buffering first information, wherein the first information comprises an image acquired by a camera, metadata information of the image and scene perception information when the image is photographed; the image and metadata information are used for simulating an original image when the image is shot, and the scene perception information is used for simulating an environment when the image is shot; and when the process of processing the image is wrong, the first information is stored locally to the first electronic equipment.
In one possible implementation, the scene perception information includes one or more of the following: decision information for deciding an algorithm mode used for photographing, information collected by a sensor for anti-shake processing, information collected by a sensor for brightness sensing, information collected by a sensor for temperature sensing, or information collected by a sensor for distance sensing.
In one possible implementation, the first information includes decision information for deciding a mode of an algorithm used for photographing.
In one possible implementation, the flow of processing the image includes one or more of the following: a decompression flow, a noise reduction flow, a sharpness improvement flow, a highlight compression flow, a face region processing flow, a blurring flow, a demosaicing flow, a dead pixel removal flow, a white balance flow, an automatic exposure (AE) control flow, an automatic focusing (AF) control flow, a sharpening flow, a color correction flow, a color enhancement flow, or an image compression flow.
In one possible implementation, before storing the first information locally on the first electronic device, the method further includes: the processing unit is configured to perform abnormality detection on the processed image obtained after the first flow is executed; when an abnormality is detected in the processed image, the first flow is indicated as having erred, where the first flow is any one of the flows of processing the image.
In one possible implementation, the first flow includes one or more of the following: a noise reduction flow, a sharpness improvement flow, a highlight compression flow, or a face region processing flow.
In one possible implementation manner, a dump switch is arranged in the first electronic device; when the dump switch is in the on state, the first electronic device allows the data in the cache to be stored locally on the first electronic device, and when the dump switch is in the off state, the first electronic device does not allow the data in the cache to be stored locally. Storing the first information locally on the first electronic device includes: the processing unit is configured to store the first information locally on the first electronic device when the dump switch is in the on state.
In one possible implementation, after storing the first information locally on the first electronic device, the method further includes: the processing unit is used for obtaining a photographed image; for uploading both the first information and the photographed image to the server upon receiving an operation for uploading the photographed image to the server.
In one possible implementation, after storing the first information locally on the first electronic device, the method further includes: the processing unit is used for simulating photographing processes by using the first information to obtain processing results of the photographing processes; in the simulated photographing process, a photographing strategy is determined based on scene perception information in the first information, and an image acquired by a camera is replaced by an original image obtained based on the image in the first information, the scene perception information and metadata information.
In one possible implementation, simulating the photographing procedure using the first information is performed by the first electronic device or the second electronic device.
In one possible implementation, simulating the photographing flow using the first information is performed by the second electronic device, and before the photographing flow is simulated using the first information, the method further includes: the processing unit is configured to convert the first information into a form that matches the version of the second electronic device; and to import the converted first information into the second electronic device.
In one possible implementation manner, after simulating the photographing flows using the first information to obtain the processing result of each flow, the method further includes: the processing unit is configured to modify the flow of simulating photographing with the first information; and to perform simulated photographing again with the first information and the modified simulated photographing flow, until the result of the simulated photographing is no longer abnormal.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor and a memory, the memory being configured to store code instructions, the processor being configured to execute the code instructions to perform the method described in the first aspect or any one of the possible implementations of the first aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored therein a computer program or instructions which, when run on a computer, cause the computer to perform the method described in the first aspect or any one of the possible implementations of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method described in the first aspect or any one of the possible implementations of the first aspect.
In a sixth aspect, the present application provides a chip or chip system comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by wires, the at least one processor being adapted to execute a computer program or instructions to perform the method described in the first aspect or any one of the possible implementations of the first aspect. The communication interface in the chip can be an input/output interface, a pin, a circuit or the like.
In one possible implementation, the chip or chip system described above further includes at least one memory, where the at least one memory has instructions stored therein. The memory may be a memory unit within the chip, such as a register, a cache, etc., or may be a storage unit of the chip system located outside the chip (e.g., a read-only memory, a random access memory, etc.).
It should be understood that, the second aspect to the sixth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the advantages obtained by each aspect and the corresponding possible embodiments are similar, and are not repeated.
Drawings
Fig. 1 is a schematic hardware structure of an electronic device according to an embodiment of the present application;
fig. 2 is a schematic software structure of an electronic device according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of an image capturing data storage process according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of an image anomaly reproduction process according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of an image detection method according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
In order to facilitate the clear description of the technical solutions of the embodiments of the present application, the following simply describes some terms and techniques involved in the embodiments of the present application:
1. RAW image
It can be understood that a RAW image, also called an original image, is the unprocessed image acquired by the camera device.
2. YUV image
It can be understood that a YUV image is an image encoded with the YUV color model, in which Y represents luminance (i.e., the gray value), U represents chrominance, describing the color of the image, and V represents chroma, describing the saturation of the image.
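For reference, one common RGB-to-YUV conversion uses the BT.601 analog-form coefficients (these coefficients are standard background, not taken from the patent):

```python
def rgb_to_yuv(r: float, g: float, b: float):
    """BT.601 analog form: Y is luminance; U and V carry chrominance."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)  # blue-difference chroma
    v = 0.877 * (r - y)  # red-difference chroma
    return y, u, v
```

For a pure gray pixel (r == g == b), U and V are zero and Y equals the gray value, which matches the description of Y as the gray component.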
3. Image signal processor (ISP)
An ISP may be used to process images, where image processing may include image rasterization, black level compensation, lens correction, bad pixel correction, chromatic aberration correction, noise removal, automatic white balance, color correction, color space conversion, downsampling, image format conversion, and so forth. In some implementations, the processing of an image by an ISP may also be referred to as ISP image processing. For example, ISP image processing may convert a RAW image into a YUV image.
4. Upper computer
The upper computer refers to a computer or single-chip microcomputer that directly issues operation instructions and provides a user interaction interface to display feedback data to the user, such as a personal computer, a mobile phone, or a tablet.
For example, in the embodiment of the present application, the upper computer may be used to convert data for, or to control, electronic devices, and the like.
5. Other terms
In embodiments of the present application, the words "first," "second," and the like are used to distinguish between identical or similar items that have substantially the same function and effect. For example, the first chip and the second chip are distinguished merely to tell different chips apart, without limiting their order. It will be appreciated by those skilled in the art that the words "first," "second," and the like do not limit the quantity or the order of execution, and do not necessarily indicate that the items differ.
It should be noted that, in the embodiments of the present application, words such as "exemplary" or "such as" are used to mean serving as an example, instance, or illustration. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs. Rather, the use of words such as "exemplary" or "such as" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate: A alone, both A and B, and B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects are in an "or" relationship. "At least one of" the following items or similar expressions means any combination of these items, including any combination of a single item or plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be single or plural.
6. Electronic equipment
The electronic device of the embodiment of the application can comprise a handheld device with an image detection function, a vehicle-mounted device, and the like. For example, some electronic devices are: a mobile phone, a tablet, a palmtop computer, a notebook computer, a mobile internet device (MID), a wearable device, a virtual reality (VR) device, an augmented reality (AR) device, a wireless terminal in industrial control, a wireless terminal in self driving, a wireless terminal in remote medical surgery, a wireless terminal in a smart grid, a wireless terminal in transportation safety, a wireless terminal in a smart city, a wireless terminal in a smart home, a cellular phone, a cordless phone, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA), a handheld device with a wireless communication function, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, a wearable device, a terminal device in a 5G network, or a terminal device in a future evolved public land mobile network (PLMN), and the like, which is not limited by the application.
By way of example and not limitation, in embodiments of the application, the electronic device may also be a wearable device. A wearable device, also called a wearable smart device, is a general term for devices that apply wearable technology to the intelligent design of everyday wear, such as glasses, gloves, watches, clothing, and shoes. A wearable device is a portable device worn directly on the body or integrated into the user's clothing or accessories. A wearable device is not merely a hardware device; it can also realize powerful functions through software support, data interaction, and cloud interaction. In a broad sense, wearable smart devices include full-featured, large-sized devices that can realize complete or partial functions without relying on a smartphone, such as smart watches or smart glasses, as well as devices that focus on only a certain type of application function and need to be used with other devices such as smartphones, for example, various smart bracelets and smart jewelry for physical sign monitoring.
In addition, in the embodiment of the application, the electronic device may also be a terminal device in an internet of things (internet of things, IoT) system. IoT is an important component of the development of future information technology; its main technical characteristic is connecting things to a network through communication technology, thereby realizing man-machine interconnection and an intelligent network of interconnected things.
The electronic device in the embodiment of the application may also be referred to as: a terminal device, a user equipment (user equipment, UE), a mobile station (mobile station, MS), a mobile terminal (mobile terminal, MT), an access terminal, a subscriber unit, a subscriber station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or the like.
In an embodiment of the present application, the electronic device or each network device includes a hardware layer, an operating system layer running on top of the hardware layer, and an application layer running on top of the operating system layer. The hardware layer includes hardware such as a central processing unit (central processing unit, CPU), a memory management unit (memory management unit, MMU), and a memory (also referred to as a main memory). The operating system may be any one or more computer operating systems that implement business processing through processes (processes), such as a Linux operating system, a Unix operating system, an Android operating system, an iOS operating system, or a Windows operating system. The application layer comprises applications such as a browser, an address book, word processing software, and instant messaging software.
With the development of electronic devices, some electronic devices include a camera, and the electronic devices can support functions such as photographing. When the current electronic device obtains a photographed image, some algorithm processing, such as loading some effect parameters, is performed.
However, in some electronic devices, abnormalities in the captured image may occur. Such abnormalities reproduce only with low probability, are difficult to reproduce after time and space have changed, and locating them may take a significant amount of time. For example, a user riding a high-speed rail photographs a sunset and the image is abnormal; when the user later reports the abnormality to after-sales service for the electronic device, the scene in which the sunset was photographed can no longer be reproduced because time and space have changed, so after-sales service cannot determine why the sunset image was abnormal.
This is because, when the electronic device captures an image, the original image captured by the camera is cached, but the cached data of the electronic device may be lost; for example, old cached data may be replaced by new cached data, or the buffer may be released after the application exits and the data in the buffer destroyed. Therefore, when the user reports the image abnormality to after-sales service, the after-sales service may be unable to obtain data such as the original image acquired by the camera, so the image abnormality may not be reproduced.
In view of the above, the embodiment of the application provides an image detection method: after the electronic device starts photographing, the image collected by the camera, the metadata information of the image, and the scene perception information when the image was photographed can be cached, and the image output by the image processing flow can be detected. When it is detected that the image output by the image processing flow is abnormal, the electronic device can store the image acquired by the camera, the metadata information of the image, and the scene perception information when the image was shot locally on the electronic device. In this way, a subsequent electronic device can reproduce the scene of the photographing operation according to the locally stored data and can locate the abnormality in the photographing operation during scene reproduction.
In order to better understand the embodiments of the present application, the structure of the electronic device according to the embodiments of the present application is described below.
By way of example, fig. 1 shows a schematic diagram of an electronic device.
The electronic device may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It should be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device. In other embodiments of the application, the electronic device may include more or fewer components than illustrated, certain components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
In the embodiment of the present application, the electronic device may implement the photographing function through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The electronic apparatus can realize sensing of the photographing environment through the temperature sensor 180J, the ambient light sensor 180L, the gyro sensor 180B, the acceleration sensor 180E, the distance sensor 180F, and the like.
Illustratively, the first electronic device may perform anti-shake sensing when the camera 193 captures an image based on the gyro sensor 180B and the acceleration sensor 180E, may perform temperature sensing when the camera 193 captures an image based on the temperature sensor 180J, may perform ambient light sensing when the camera 193 captures an image based on the ambient light sensor 180L, and may perform distance sensing when the camera 193 captures an image based on the distance sensor 180F. And the above scene perception information, metadata information and images acquired by the camera may be stored locally in the internal memory 121, so as to facilitate the subsequent reproduction of the scene when the image is captured.
Fig. 2 is a software configuration block diagram of an electronic device according to an embodiment of the present application. The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into five layers, from top to bottom: an application layer, an application framework layer, the Android runtime (Android runtime) and system libraries, a hardware abstraction layer (hardware abstraction layer, HAL), and a kernel layer.
The application layer may also be referred to as the App layer and may include a series of application packages. As shown in fig. 2, the application layer may include video, conferencing, camera, and other applications. Applications may include system applications and third-party applications.
The application Framework layer may also be referred to as a Framework layer, which may provide an application programming interface (application programming interface, API) and programming Framework for applications of the application layer. The frame layer may include some predefined functions.
As shown in fig. 2, the Framework layer may include a window manager, a resource manager, a notification manager, a content provider, a view system, and the like.
Android runtime include core libraries and virtual machines. Android runtime is responsible for control and management of the android system.
The core library consists of two parts: one part is the functions that the java language needs to call, and the other part is the core library of Android.
The application layer and the Framework layer run in virtual machines. The virtual machine executes java files of the application layer and the Framework layer as binary files. The virtual machine is used for executing the functions of object life cycle management, stack management, thread management, security and exception management, garbage collection and the like.
The system library may also be referred to as Native layer, which may include a plurality of functional modules. For example: media libraries, function libraries, graphics processing libraries, etc.
The HAL layer is an abstraction layer between the kernel layer and the Android runtime. The HAL layer may be a package for hardware drivers that provides a unified interface for invocation by upper-layer applications. In an embodiment of the present application, the HAL layer may include an image front-end processing module (TFE), an image processing engine module, a perception engine module, a decision engine module, an image quality assessment (image quality assessment, IQA) module, and the like.
An image front end module (thin front end, TFE) may also be referred to as the thin front end. The TFE module can receive the RAW image output by the image sensor, and can also perform downsampling, cropping, and other processing on the RAW image before outputting it to the image processing engine module.
The image processing engine module may include: an offline processing engine (offline processing engine, OPE) module and an image processing engine (image processing engine, IPE) module. The IPE module is an online image processing engine. It will be appreciated that the OPE module is used in some electronic devices and the IPE module in others; whether the OPE module or the IPE module is used in a specific electronic device is not limited by the embodiment of the present application.
The OPE module or IPE module may perform ISP image processing on the RAW image and output a YUV image. The image processing may include lens correction, noise removal, color correction, and other effect processing.
The perception engine module is used for perceiving scene perception information and the like; the decision engine module is used for deciding which camera is used for photographing and the like; the IQA module is used for detecting whether the image output by the image processing flow is abnormal or not.
The kernel layer is a layer between hardware and software. The kernel layer may include display drivers, camera drivers, audio drivers, battery drivers, sensor drivers, and the like.
In the embodiment of the application, the electronic device can process the image at the HAL layer. For example, the electronic device can sense the environmental information around it through the perception engine module; the decision engine module can be used to decide which camera to use for photographing; ISP image processing, such as decompression, noise reduction, sharpness improvement, or highlight compression, can be performed on the RAW image in the IPE module, and a YUV image output; and the IQA module can detect the images output by the image processing flows in the IPE module, such as the noise reduction, sharpness improvement, or highlight compression flows.
It should be noted that, the embodiment of the present application is only illustrated by using an android system, and in other operating systems (such as a Windows system, an IOS system, etc.), the scheme of the present application can be implemented as long as the functions implemented by each functional module are similar to those implemented by the embodiment of the present application.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments.
The image detection method of the embodiment of the application can comprise two stages: the first stage is an image shooting data storage process; the second stage is the image anomaly reproduction process.
In the embodiment of the application, three electronic devices may be included: the system comprises a first electronic device, a second electronic device and an upper computer. The first electronic equipment is used for realizing the image shooting data storage process of the first stage; the second electronic equipment and the upper computer are used for realizing the process of reproducing the image abnormality in the second stage, the upper computer is used for triggering the second electronic equipment to photograph, and the second electronic equipment is used for reproducing the image abnormality.
The image detection method in the embodiment of the application is described by taking the example that the first electronic device, the second electronic device and the upper computer are mobile phones.
The image capture data storage process of the first phase is described below by way of a possible implementation.
Fig. 3 is a flowchart illustrating an image capturing data storage process according to an embodiment of the present application. The process of storing the image capturing data by the corresponding first electronic device specifically includes the following steps:
S301, photographing starts.
S302, scene perception.
In the embodiment of the application, after photographing on the first electronic device starts, the camera of the first electronic device can start to collect the image, and the perception engine module can perceive scene perception information of the scene being photographed. For example, the scene perception information may include decision information perceiving the algorithm mode used for the photographing decision, information acquired by sensors for anti-shake processing, information acquired by sensors for brightness perception, or information acquired by sensors for temperature perception.
S303, deciding a collection strategy and a processing strategy.
In the embodiment of the application, the acquisition strategy decided by the first electronic device can be used to decide, according to the scene perception information, which camera of the first electronic device is used to capture the image.
Illustratively, the cameras of the first electronic device may include a main camera, a telephoto camera, a wide-angle camera, or the like, each with a different focal length. For example, the focal length of the main camera may be 23mm, the focal length of the telephoto camera may be 69mm, and the focal length of the wide-angle camera may be 12mm.
In the embodiment of the application, the processing strategy of the first electronic device can be used for deciding which algorithm to use to process the image acquired by the camera according to the scene perception information.
For example, if the perception engine of the first electronic device senses that the ambient light in the scene being captured is dark, a night scene algorithm may be used to process the image acquired by the camera.
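The decision of S303 can be illustrated with a minimal sketch. The function name, thresholds, and camera labels below are illustrative assumptions, not the patent's actual implementation; the sketch only shows the idea of mapping scene perception readings (ambient light level, requested zoom) to an acquisition and processing strategy.

```python
def decide_strategy(lux: float, zoom_ratio: float) -> dict:
    """Hypothetical decision step: pick a camera by requested zoom and a
    processing algorithm by ambient light level (both thresholds assumed)."""
    if zoom_ratio >= 3.0:
        camera = "tele"   # long focal length for high zoom
    elif zoom_ratio < 1.0:
        camera = "wide"   # short focal length for a wide field of view
    else:
        camera = "main"
    # Dark scenes (assumed < 10 lux here) are routed to a night-scene algorithm.
    algorithm = "night" if lux < 10.0 else "default"
    return {"camera": camera, "algorithm": algorithm}
```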
S304, the camera acquires an image.
For example, the first electronic device may determine to take an image with the main camera according to the scene sensing information, and the image collected by the main camera may be 5 RAW images.
S305, caching the first information.
In the embodiment of the application, the first information may include the image acquired by the camera, the metadata information of the image, the decision information, and the scene perception information when the image was shot. The metadata information of the image is information for determining the quality of the image and may include an exposure value, a gain value, and the like. The decision information is the algorithm mode used by the first electronic device for the photographing decision; for example, the decision information may include information deciding on a high dynamic range (high dynamic range, HDR) mode, portrait mode, large aperture mode, wide-angle mode, or the like.
After the camera of the first electronic device collects the image, the first electronic device can cache the image collected by the camera, metadata information of the image and scene perception information when the image is shot, so that the subsequent second electronic device can conduct abnormal reproduction of the image according to the cached data, and therefore abnormal image is located.
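The first information cached in S305 can be sketched as a simple record. The field names below are illustrative assumptions derived from the contents listed above (RAW frames, metadata, decision information, scene perception information), not the patent's actual data layout.

```python
from dataclasses import dataclass

@dataclass
class FirstInfo:
    """One cached capture, as described for S305 (field names assumed)."""
    raw_frames: list        # RAW images collected by the camera, e.g. 5 frames
    metadata: dict          # image quality metadata: exposure value, gain value, ...
    decision: dict          # decided algorithm mode: HDR, portrait, large aperture, ...
    scene_perception: dict  # gyro/accelerometer, ambient light, temperature, distance

info = FirstInfo(
    raw_frames=[b"raw"] * 5,
    metadata={"exposure": 0.02, "gain": 1.6},
    decision={"mode": "HDR"},
    scene_perception={"lux": 120.0, "temperature_c": 25.0},
)
```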
In other embodiments, the first electronic device may further write the image acquired by the camera, metadata information of the image, and scene perception information when the image is captured, in a high-intensity compressed manner, to the captured image.
In the embodiment of the application, S306-S308 can be executed in a subsequent cycle, so that whether a plurality of processes are abnormal or not can be detected one by one.
S306, processing in flow N.
In the embodiment of the application, the first electronic device can process the image acquired by the camera, and the processing may comprise a plurality of flows. For example, the flows for processing an image may include: decompression, noise reduction, sharpness improvement, highlight compression, face region processing, blurring, demosaicing, dead pixel removal, white balance processing, auto exposure (automatic exposure, AE) control, auto focus (automatic focus, AF) control, sharpening, color correction, color enhancement, or image compression.
The process N may be any one of the above processes, for example, the process N may be a process of enhancing definition, and the first electronic device may perform an enhancing definition process on the RAW image and output a YUV image; or the process N may be a process flow of a face area, and the first electronic device may process the face area for the RAW image and output a YUV image.
Optionally, an IQA module may be configured for flow N when it is a flow that is functionally significant and prone to abnormality, for example the noise reduction, sharpness improvement, highlight compression, and face region processing flows; no IQA module is configured for flows that are functionally minor and unlikely to be abnormal, such as the decompression flow.
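The per-flow loop of S306-S308 can be sketched as a chain of flows where only selected flows carry an IQA check. The flow names, toy transforms, and check predicate below are illustrative assumptions, not the patent's implementation.

```python
def run_pipeline(image, flows, iqa_check):
    """Run each (name, fn, needs_iqa) flow in order; return the final result and
    the name of the first flow whose output fails the IQA check (or None)."""
    for name, fn, needs_iqa in flows:
        image = fn(image)
        if needs_iqa and not iqa_check(image):
            return image, name
    return image, None

# Toy flows: only the "heavy" flows are checked, mirroring the idea that
# decompression is low-risk while denoise/sharpen get an IQA module.
flows = [
    ("decompress", lambda x: x + 1, False),
    ("denoise",    lambda x: x + 1, True),
    ("sharpen",    lambda x: x * 10, True),
]
```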
S307, an image quality evaluation IQA module.
In the embodiment of the application, the IQA module of the first electronic device is used for detecting whether the image output by the image processing flow is abnormal or not.
In one possible implementation, the IQA module may detect whether the image processed by the process N is abnormal based on an image detection algorithm.
Illustratively, the IQA module of the first electronic device may detect image anomalies through at least one of the following algorithms. Histogram detection algorithm: judges whether an image is abnormal by detecting whether the brightness and color distribution of the image are uniform. Blur detection algorithm: judges whether an image is abnormal through factors such as the sharpness of the image, the sharpness of its edges, and the richness of its details. Structural similarity index (structural similarity index, SSIM) algorithm: can be used to measure the similarity of two images; the asynchronous detection module can cache a thumbnail, compare the image shot by the camera with the thumbnail, and judge whether the image is abnormal according to the difference.
For example, the first electronic device may detect the image processed by the flow in S306 through the histogram detection algorithm of the IQA module; for instance, it may detect whether the brightness and color distribution of the YUV image output by the sharpness improvement flow in S306 are uniform, so as to determine whether the sharpness improvement flow is abnormal: if the brightness and color distribution of the YUV image are uniform, the sharpness improvement flow is not abnormal, and if they are not uniform, it is abnormal. The first electronic device may also detect the image processed by the flow in S306 through the blur detection algorithm of the IQA module; for instance, it may detect the sharpness, edge sharpness, and richness of detail of the YUV image output by the face region processing flow in S306, so as to determine whether the face region processing flow is abnormal: if the sharpness, edge sharpness, and richness of detail of the YUV image are poor, the face region processing flow is abnormal, and if they are good, it is not abnormal.
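As one concrete illustration of the blur detection idea, the variance of a discrete Laplacian is a common sharpness proxy: low variance suggests weak edges and a possibly blurry (abnormal) frame. This is a generic sketch, not the patent's actual algorithm; the function names and threshold are assumptions.

```python
def laplacian_variance(gray):
    """Variance of the 4-neighbour Laplacian over a 2-D list of pixel values."""
    h, w = len(gray), len(gray[0])
    vals = [
        -4 * gray[y][x] + gray[y - 1][x] + gray[y + 1][x]
        + gray[y][x - 1] + gray[y][x + 1]
        for y in range(1, h - 1)
        for x in range(1, w - 1)
    ]
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)

def looks_blurry(gray, threshold=50.0):
    """Flag a frame as possibly blurry when edge response is weak (threshold assumed)."""
    return laplacian_variance(gray) < threshold
```

A flat (featureless) frame has zero Laplacian variance, while a high-contrast pattern scores very high, so the threshold separates the two regimes.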
Of course, the IQA module may also implement detection of abnormal scenes including, but not limited to, garbled images, black images, color cast, excessive noise, inconsistency, algorithm single-frame issues, and the like, which are not described herein.
In another possible manner, the IQA module of the first electronic device may subscribe to the abnormal data reported by the flow N, and if the abnormal data reported by any flow is obtained, the IQA module may also identify the image output by the flow as an abnormal image.
It can be understood that when the IQA module of the first electronic device detects the image output by the image processing flow, other processes in progress on the first electronic device need not be stopped; instead, an additional thread detects the flows applied to the image acquired by the camera. In this way, the processing of the electronic device's other processes is not affected.
S308, detecting whether the result is abnormal.
In one possible manner, if any one or more flows are abnormal, S309 is performed and the first information of the first electronic device is stored locally on the first electronic device.
S309, determining that the dump switch is open.
In the embodiment of the application, the dump switch of the first electronic device is used for storing the first information of the first electronic device to the local of the first electronic device.
Illustratively, the dump switch may include two states, on and off. The first electronic device allows the first information to be stored locally to the first electronic device when the dump switch is in an on state, and does not allow the first information to be stored locally to the first electronic device when the dump switch is in an off state.
The dump switch of the first electronic device is in a default open state before photographing starts, or a user can open the dump switch in a setting application of the first electronic device before photographing starts according to requirements.
S310, the first information is stored locally.
In the embodiment of the application, when the IQA module detects that the image output by the image processing flow is abnormal, the first electronic device can store the first information to the local of the first electronic device.
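Steps S308-S310 reduce to a simple gate: the first information is dumped locally only when the IQA result is abnormal and the dump switch is on. A minimal sketch follows; all names are assumptions.

```python
def maybe_dump(first_info, iqa_abnormal, dump_switch_on, local_storage):
    """Persist first_info locally (S310) only if an anomaly was detected (S308)
    and the dump switch is in the on state (S309). Returns True when stored."""
    if iqa_abnormal and dump_switch_on:
        local_storage.append(first_info)
        return True
    return False
```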
S311, shooting is finished.
S312, the user actively requests.
S313, photographing the image and the first information uploading server.
In the embodiment of the application, when the IQA module of the first electronic device detects that the image output by the image processing flow is abnormal, the first electronic device can store the first information to the local of the first electronic device, and then photographing is finished.
In this case, whether the first electronic device can upload the first information and the photographed image to the server may be determined by the active demand of the user. For example, the user's active needs may include the following two possible scenarios.
In one possible case, the first electronic device may prompt the user whether to upload the image in which the first electronic device detects the abnormality to the server, and if the user confirms the uploading, the first electronic device may upload the first information and the photographed image to the server.
In another possible case, the first electronic device may provide an android application package (android application package, APK) or a link for uploading the abnormal image. The user may select an image 1 and upload it to the server through the APK or the link; this may be considered an active requirement of the user, and the first electronic device may upload image 1 acquired by the camera, the metadata information of image 1, the scene perception information when image 1 was captured, and image 1 itself to the server.
S314, the first information is automatically aged.
In the embodiment of the application, after S301-S313 are completed, the first electronic device can automatically age the cached first information. For example, when the cache is full, the data stored first in the cache can be automatically cleaned.
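The aging behaviour described in S314 — when the cache is full, the data stored first is cleaned automatically — matches a bounded FIFO. A minimal standard-library sketch (class name assumed):

```python
from collections import deque

class AgingCache:
    """Bounded FIFO cache: on overflow the oldest entry is evicted, mirroring
    the automatic aging of cached first information described in S314."""
    def __init__(self, capacity):
        self.entries = deque(maxlen=capacity)  # deque drops the oldest on append

    def put(self, item):
        self.entries.append(item)

    def __len__(self):
        return len(self.entries)
```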
In the image shooting data storage process of the image detection method, after photographing on the first electronic device starts, the image acquired by the camera, the metadata information of the image, and the scene perception information when the image was shot can be cached, and the image output by the image processing flow can be detected. When the image output by the image processing flow is detected to be abnormal, the image acquired by the camera, the metadata information of the image, and the scene perception information when the image was shot are stored locally on the first electronic device. After photographing is finished, the first information and the photographed image can be uploaded to the server according to the user's requirement. In this way, when a subsequently reported image abnormality needs to be located, the second electronic device that locates the abnormality can acquire data such as the scene perception information at the time the abnormal image was captured and use that data to reproduce the abnormality.
The image anomaly reproduction process of the second stage is described below by way of possible implementation processes.
Illustratively, fig. 4 shows a flowchart of an image anomaly reproduction process of an embodiment of the present application. The process of carrying out image anomaly reproduction by the corresponding second electronic equipment and the upper computer comprises the following steps:
S401, abnormal reproduction starts.
And S402, uploading the first information to the second electronic equipment.
S403, the second electronic equipment is connected with the upper computer.
S404, the upper computer detects the mobile phone version to perform data conversion.
S405, the converted data is imported into the second electronic device.
In the embodiment of the application, after abnormal reproduction starts, the first information can be transmitted by the first electronic device to the second electronic device. The second electronic device is connected to the upper computer and can transmit the first information to it; by detecting the version information of the second electronic device, the upper computer can convert the first information into a form matching the version of the second electronic device and import the converted first information into the second electronic device.
It should be understood that the first electronic device and the second electronic device may be the same type of electronic device, or may be different types of electronic devices, which is not limited in the embodiment of the present application.
S406, the upper computer triggers the second electronic equipment to take a picture.
For example, the upper computer may send an instruction for triggering the photographing button to the second electronic device. It should be noted that the photographing triggered in S406 is not actual photographing; rather, it may be understood as triggering a process that uses the first information to simulate how the first electronic device obtained the abnormal image, so that the abnormal image can be obtained later.
S407, simulating scene perception, and determining a photographing strategy.
In the embodiment of the application, simulating scene perception and determining the photographing strategy can be understood as determining the photographing strategy based on the scene perception information in the first information, rather than on perception information from the current scene of the second electronic device, so that the scene in which the first electronic device photographed the abnormal image can be reproduced.
It should be appreciated that the second electronic device has no requirement on the scene of the image anomaly reproduction, and the second electronic device may restore the scene in the first information of the first electronic device across temporal and spatial constraints.
S408, replacing the image, scene perception information and metadata information acquired by the camera.
It can be understood that after receiving the instruction triggering photographing, the second electronic device starts the camera to collect an image; however, in the embodiment of the application, the real image collected by the camera is not processed. Instead, the image collected by the camera is replaced by the original image in the first information, and the metadata information of the image is replaced by the metadata information in the first information. In this way, the specific image with which the first electronic device captured the abnormal image can be reproduced.
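Conceptually, S407-S408 replay the capture with the cached inputs substituted for the live ones. The sketch below is an assumption about how such substitution could look; `pipeline` stands in for the device's processing flows and the dictionary keys are illustrative.

```python
def reproduce(first_info, pipeline):
    """Replay a capture: the cached RAW frames replace the live camera image,
    cached metadata replaces live metadata, and the cached scene perception
    information drives the photographing strategy (S407-S408)."""
    frames = first_info["raw_frames"]
    metadata = first_info["metadata"]
    scene = first_info["scene_perception"]
    return pipeline(frames, metadata, scene)
```

Because every input comes from the cache rather than the sensors, the replay is independent of where and when the second device runs, which is what allows reproduction across time and space constraints.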
In the embodiment of the application, S409-S412 can be executed in a subsequent cycle to realize the reproduction of a plurality of flow processing conditions.
S409, flow N processing.
In the embodiment of the present application, the process N may refer to the description of the process N in fig. 3, and is not described herein.
S410, whether the abnormality is recurring.
For example, if any flow detects an anomaly during the simulation, the anomaly may be considered to have recurred, so that the flow can subsequently be optimized based on the recurring anomaly to resolve it.
S411, flow modification.
In the embodiment of the application, if the problem of image abnormality is reproduced, the second electronic device can modify the flow in the simulated photographing flow.
The modification refers to modifying an image processing algorithm in an image processing flow, such as a mean filtering noise elimination algorithm, a laplace operator algorithm, an image enhancement algorithm, a face recognition algorithm, a target tracking algorithm, or a background blurring algorithm.
Illustratively, suppose the target tracking algorithm uses 4-channel fusion, but ghosting in the image under 4-channel fusion is too severe. The second electronic device can then modify the algorithm to change the 4 channels into a single channel, thereby fixing the image abnormality.
S412, determining whether the abnormality is handled.
In the embodiment of the present application, after the abnormal flow is modified in S411, the second electronic device may perform S405 to S410 again. If the abnormality does not recur, it indicates that the abnormality has been corrected; if the abnormality recurs, it has not been corrected, and the flow modification is continued.
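The S405-S412 cycle can be sketched as a modify-and-retry loop: simulate the photographing flow, check whether the anomaly recurs, modify the offending flow, and repeat until the anomaly no longer recurs. The callables `simulate` and `modify_flow`, and the attempt cap, are hypothetical stand-ins for the device's internals, not the disclosed interfaces.

```python
def reproduce_and_fix(simulate, modify_flow, max_attempts=5):
    """Sketch of the S405-S412 cycle.

    `simulate()` returns the name of the flow whose output is abnormal,
    or None when no anomaly recurs; `modify_flow(name)` adjusts that flow
    (S411). The loop stops once the anomaly no longer recurs.
    """
    for attempt in range(1, max_attempts + 1):
        abnormal_flow = simulate()        # S405-S410: simulated photographing
        if abnormal_flow is None:
            return attempt                # anomaly no longer recurs: handled
        modify_flow(abnormal_flow)        # S411: flow modification
    raise RuntimeError("anomaly still recurs after %d attempts" % max_attempts)
```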
S413, the abnormality analysis ends.
It should be noted that, the image detection method in the embodiment of the present application may be implemented in a HAL layer, or may be implemented in another software architecture layer of the electronic device, which is not limited by the embodiment of the present application.
It should be appreciated that, in some embodiments, the first electronic device and the upper computer may also perform image anomaly reproduction according to the first information; or, the second electronic device may perform abnormal image reproduction according to the first information; or, the first electronic device may perform image anomaly reproduction according to the first information. The specific implementation process is similar to the above, and will not be described in detail here.
In the image anomaly reproduction process of the image detection method of the embodiment of the application, the second electronic device can determine a photographing strategy according to the scene perception information in the first information. After replacing the image acquired by the camera with the original image obtained based on the image, scene perception information and metadata information in the first information, it can simulate the photographing flow of the first electronic device and run flow N on the original image to determine anomalies arising while the original image is processed. The second electronic device may also modify the abnormal flow to obtain a normal image. In this way, labor cost is saved to a large extent, and the efficiency of locating and handling image anomalies is improved.
Optionally, the image detection method of the embodiment of the present application may be applicable to photographing in multiple scenes, which may include: multi-camera photographing, zoom, portrait mode, large aperture, professional mode, capture during video recording, or front-camera photographing; the embodiment of the present application is not limited thereto.
The method according to the embodiment of the present application will be described in detail by way of specific examples. The following embodiments may be combined with each other or implemented independently, and the same or similar concepts or processes may not be described in detail in some embodiments.
Fig. 5 shows an image detection method of an embodiment of the present application. The method may be applied to a first electronic device, the method comprising:
S501, the first electronic equipment responds to photographing operation, and caches first information, wherein the first information comprises images acquired by a camera, metadata information of the images and scene perception information when the images are photographed; the image and metadata information are used for simulating an original image when the image is shot, and the scene perception information is used for simulating the environment when the image is shot.
In the embodiment of the application, before the first electronic device caches the first information, the first electronic device can perceive scene perception information in a scene where the image acquired by the camera is located based on the perception engine module, can determine which camera is used by the first electronic device to shoot the image according to the scene perception information, and can determine which image processing algorithm is used to process the image shot by the camera. The camera of the first electronic device may be understood as the camera of the first electronic device in step S303 corresponding to the above-mentioned fig. 3, and will not be described again. The image processing algorithm may be understood as an example of the image processing algorithm in step S303 corresponding to the above-mentioned fig. 3, and will not be described in detail.
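The decision described above, which camera and which image processing algorithm mode to use, can be sketched as a simple rule over the scene perception information. The thresholds, field names and mode names below are purely illustrative assumptions of this sketch, not values from the disclosure.

```python
def decide_photographing_strategy(scene: dict) -> dict:
    """Decide the camera and algorithm mode from scene perception information.

    Hypothetical rules: a dim scene selects a night-mode algorithm, a far
    subject selects the telephoto camera, and detected faces select the
    portrait processing path.
    """
    strategy = {"camera": "main", "algorithm_mode": "default"}
    if scene.get("lux", 100) < 10:          # brightness sensor: dim scene
        strategy["algorithm_mode"] = "night"
    if scene.get("distance_m", 1.0) > 30:   # distance sensor: far subject
        strategy["camera"] = "telephoto"
    if scene.get("faces", 0) > 0:           # face detection result
        strategy["algorithm_mode"] = "portrait"
    return strategy
```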
S502, when the flow of processing the image is wrong, first information is stored locally in the first electronic equipment.
In a possible implementation, the first electronic device may collect an image with the determined camera, and cache the image collected by the camera, the metadata information of the image, and the scene perception information when the image is captured. The first electronic device can detect, through the IQA module, the flows that process the image collected by the camera; if an abnormal flow is detected while the dump switch of the first electronic device is in an on state, the first electronic device can store the image collected by the camera, the metadata information of the image and the scene perception information when the image is shot locally on the first electronic device. The flows of processing the image acquired by the camera may be understood as the flow N corresponding to step S306 in fig. 3, which is not repeated. The IQA module may be understood as the IQA module in step S307 corresponding to the above-mentioned fig. 3, and will not be described again.
According to the image detection method provided by the embodiment of the application, after photographing on the first electronic device is started, the image collected by the camera, the metadata information of the image and the scene perception information when the image is photographed can be cached, and the image output by the image processing flow can be detected through the IQA module. When the image output by the image processing flow is detected to be abnormal, the image acquired by the camera, the metadata information of the image and the scene perception information when the image is shot are stored locally on the first electronic device. In this way, the second electronic device may subsequently reproduce the scene of the above-described photographing operation of the first electronic device according to the locally stored data, and may locate the abnormality in the photographing operation during scene reproduction.
In some embodiments, the first electronic device may further compress and write metadata information of the image and scene perception information when the image is captured into the captured image, and upload the captured image to the server according to a user requirement. Therefore, if the user later reports that the photographed image is abnormal, after-sales personnel can conveniently obtain the metadata information of the image and scene perception information when the image is photographed, so that abnormal reproduction of the image is performed.
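Writing the compressed metadata and scene perception information into the photographed image can be sketched as appending a compressed payload to the image bytes behind a marker. This container format is an assumption of the sketch (a real implementation might use EXIF or a vendor tag); the marker is assumed not to occur inside the image payload.

```python
import json
import zlib

MARKER = b"FIRSTINFO"  # hypothetical container marker, not from the disclosure

def embed_first_info(image_bytes: bytes, metadata: dict, scene: dict) -> bytes:
    """Compress the metadata and scene perception information and append
    them to the photographed image, so after-sales analysis can recover
    them later for image anomaly reproduction."""
    payload = zlib.compress(json.dumps({"meta": metadata, "scene": scene}).encode())
    return image_bytes + MARKER + payload

def extract_first_info(blob: bytes):
    """Recover (image_bytes, metadata, scene) from an embedded image."""
    image_bytes, _, payload = blob.rpartition(MARKER)
    info = json.loads(zlib.decompress(payload))
    return image_bytes, info["meta"], info["scene"]
```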
Optionally, on the basis of the embodiment corresponding to fig. 5, the scene perception information includes one or more of the following: decision information for deciding an algorithm mode used for photographing, information collected by a sensor for anti-shake processing, information collected by a sensor for brightness sensing, information collected by a sensor for temperature sensing, or information collected by a sensor for distance sensing.
In the embodiment of the application, the first electronic device can perceive, through the perception engine module, the scene perception information of the scene in which the image acquired by the camera is captured. The perception engine module may be understood as the perception engine module in step S302 corresponding to fig. 3, and will not be described again. It can be appreciated that saving the scene perception information is beneficial to restoring the scene of the abnormal image.
Optionally, on the basis of the embodiment corresponding to fig. 5, the first information includes decision information for deciding an algorithm mode used for photographing. It can be appreciated that the decision information is saved, which is beneficial to restoring the scene of the abnormal image.
Optionally, on the basis of the embodiment corresponding to fig. 5, the flow of processing the image includes one or more of the following: decompression flow, noise reduction flow, sharpness improvement flow, highlight compression flow, face region processing flow, blurring flow, demosaicing flow, dead point removal processing flow, white balance processing flow, automatic exposure (AE) control flow, automatic focusing (AF) control flow, sharpening processing flow, color correction flow, color enhancement flow or image compression flow. It will be appreciated that the specific implementation of the flows for processing the image and the manner of determining whether each flow is abnormal may be any implementation, and the embodiment of the present application is not specifically limited.
For example, the algorithms used in the flows of processing the image may include: a decompression algorithm, mean filtering noise elimination algorithm, Laplacian operator algorithm, image enhancement algorithm, face recognition algorithm, background blurring algorithm, demosaicing algorithm, dead point removal processing algorithm, white balance processing algorithm, AE control algorithm, AF control algorithm, sharpening processing algorithm, color correction algorithm, color enhancement algorithm or JPEG image compression algorithm.
It can be appreciated that the processing of the image acquired by the camera by the first electronic device is completed based on the above algorithms, and errors may occur while these algorithms run. Therefore, the image processing flows are also the flows in which errors easily arise during image processing, and detecting whether the image output by each flow is abnormal makes it possible to accurately determine the cause of the image abnormality.
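As an illustration of one such algorithm, a 3x3 mean filtering noise elimination step can be sketched as follows. This is a pure-Python sketch with edge pixels handled by clamping; a real implementation would be vectorized, and nothing here is taken from the disclosure beyond the algorithm's name.

```python
def mean_filter(image):
    """3x3 mean filter over a 2D list of pixel values: each output pixel is
    the integer average of its clamped 3x3 neighborhood, which suppresses
    impulse noise at the cost of some sharpness."""
    h, w = len(image), len(image[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Clamp neighbor coordinates at the image border.
            neighborhood = [
                image[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            ]
            out[y][x] = sum(neighborhood) // len(neighborhood)
    return out
```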
Optionally, before storing the first information locally on the first electronic device, on the basis of the embodiment corresponding to fig. 5, the method further includes: after the first flow is executed, abnormality detection is carried out on the processed image obtained after the first flow is executed; when an abnormality is detected in the processed image, it indicates that the first flow has an error, and the first flow is any one of the flows of processing the image.
In the embodiment of the present application, the first electronic device may be provided with an IQA module in the processing flow to perform abnormality detection on the processed image obtained after each image processing flow is performed; the abnormality detection performed by the IQA module may be understood as an example of the IQA module in step S307 corresponding to the above-mentioned fig. 3, which is not described here. This facilitates later determination of the cause of the image abnormality.
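The per-flow IQA check can be sketched as a pipeline that runs the image through each flow in order and reports the first monitored flow whose output fails quality assessment. The tuple layout and callables are illustrative assumptions, not the actual module interfaces of the disclosure.

```python
def run_pipeline(image, flows, iqa_check):
    """Run the image through each processing flow in order; after each
    monitored flow, the IQA module checks the processed image and reports
    the first flow whose output is abnormal (None if all pass).

    `flows` is a list of (name, fn, monitored) tuples; `iqa_check(img)`
    returns True when the image passes quality assessment.
    """
    for name, fn, monitored in flows:
        image = fn(image)
        if monitored and not iqa_check(image):
            return image, name   # this first flow has an error
    return image, None
```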
Optionally, on the basis of the embodiment corresponding to fig. 5, the first flow includes one or more of the following: noise reduction flow, sharpness improvement flow, highlight compression flow or face region processing flow.
In the embodiment of the present application, the flows of processing the image include the first flow, which may be understood as a flow in which an abnormality easily occurs.
For example, the first electronic device may set an IQA module for flows that are functionally complex and prone to anomalies, such as the noise reduction flow, the sharpness improvement flow, the highlight compression flow, and the face region processing flow. These flows have a high error frequency in practice, so setting the IQA module for them allows the abnormal image to be determined accurately with a small amount of detection.
Illustratively, the IQA module is not provided for flows that are functionally simple and not prone to anomalies, such as the decompression flow, which can reduce the operating load of the first electronic device.
Optionally, on the basis of the embodiment corresponding to fig. 5, a dump switch is disposed in the first electronic device, where the first electronic device allows the data in the cache to be stored locally on the first electronic device when the dump switch is in an on state, and does not allow the data in the cache to be stored locally when the dump switch is in an off state; storing the first information locally on the first electronic device includes: storing the first information locally on the first electronic device when the dump switch is in an on state.
In the embodiment of the present application, the dump switch may be understood as the dump switch in step S309 corresponding to the above-mentioned fig. 3, and will not be described again. Thus, a user can control the dump switch: when it is in an on state, data in the first electronic device's cache is stored locally; correspondingly, when it is in an off state, the cached data is not stored locally. This can meet user requirements and better protect user privacy.
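The dump switch gating can be sketched as a guard in front of the local store: cached first information is written to local storage only while the switch is on, and is otherwise discarded. The file name and JSON serialization are assumptions of this sketch.

```python
import json
import pathlib
import tempfile

def dump_first_info(first_info: dict, dump_switch_on: bool,
                    dump_dir: pathlib.Path):
    """Store the cached first information locally only when the dump switch
    is in an on state; when it is off, nothing is written, protecting the
    user's privacy. Returns the path written, or None."""
    if not dump_switch_on:
        return None
    path = dump_dir / "first_info.json"
    path.write_text(json.dumps(first_info))
    return path
```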
Optionally, after storing the first information locally in the first electronic device, on the basis of the embodiment corresponding to fig. 5, the method further includes: obtaining a photographed image; in the case where an operation for uploading the photographed image to the server is received, both the first information and the photographed image are uploaded to the server.
In the embodiment of the application, when a flow of processing the image acquired by the camera is abnormal, the first electronic device, after locally storing the first information via the dump switch, can also determine whether to upload the first information and the photographed image to the server according to the user's active demand. The user's active demand may be understood as the user active demand in step S312 in fig. 3, and is not described herein.
In this way, the electronic device for scene reproduction can download the first information and the photographed image of the first electronic device through the server, and when the user using the first electronic device wants to solve the problem of image abnormality, the user can operate on line by using the first electronic device, and communication connection between the first electronic device and the scene reproduction device is not required to be established. Furthermore, the server can collect abnormal images and related information of a plurality of users to analyze big data, which is beneficial to optimizing photographing software.
In some embodiments, after the first electronic device uploads the cached data to the server, the cached data may be automatically aged, and the automatic aging may be understood as the automatic aging of the first information in step S314 corresponding to the above-mentioned fig. 3, which is not described again.
Optionally, after storing the first information locally on the first electronic device, based on the embodiment corresponding to fig. 5, the method further includes: simulating photographing processes by using the first information to obtain processing results of the photographing processes; in the simulated photographing process, a photographing strategy is determined based on scene perception information in the first information, and an image acquired by a camera is replaced by an original image obtained based on the image in the first information, the scene perception information and metadata information.
In the embodiment of the present application, the second electronic device may determine, by using the first information, a photographing policy required for reproducing the abnormal image, where the photographing policy may be understood as the photographing policy in step S407 corresponding to the foregoing fig. 4, which is not described in detail.
In a possible implementation, after receiving an instruction for triggering photographing, the second electronic device may start the camera to collect an image, and the second electronic device may not process the real image collected by the camera, but replace the image collected by the camera with an original image in the first information, and replace metadata of the image with metadata in the first information. In this way, the specific image when the first electronic device captures the abnormal image can be reproduced.
Optionally, based on the embodiment corresponding to fig. 5, the simulating the photographing procedure with the first information is performed by the first electronic device or the second electronic device.
In the embodiment of the application, the first electronic device and the second electronic device can be electronic devices with different models. The first electronic device may be an electronic device that captures an abnormal image, and the photographing procedure may be simulated using the first electronic device. And under the condition that the first electronic equipment cannot simulate the photographing flow, the photographing flow can be simulated by the second electronic equipment. Therefore, the photographing process can be flexibly simulated by using the first electronic device or the second electronic device according to the actual available condition of the first electronic device or the second electronic device.
Optionally, on the basis of the embodiment corresponding to fig. 5, the simulating the photographing procedure by using the first information is performed by the second electronic device, and before the simulating the photographing procedure by using the first information, the method includes: converting the first information into a form which can be matched with the version of the second electronic equipment; and importing the first information after the conversion into the second electronic equipment.
In the embodiment of the application, the first information may be obtained by the first electronic device, and the simulation photographing procedure using the first information is performed by the second electronic device. The first electronic device and the second electronic device may be different models of electronic devices, and thus, it is necessary to convert the first information into a form that the version of the second electronic device can match. Therefore, the second electronic equipment can simulate the photographing flow by using the converted first information, and the image abnormality is reproduced.
In a possible implementation, take as an example that the image anomaly reproduction process is implemented by the upper computer controlling the second electronic device. The upper computer can perform data conversion on the first information and import the converted data into the second electronic device; the data interaction process between the upper computer and the second electronic device may be understood as the data interaction in steps S402 to S405 corresponding to the above-mentioned fig. 4, which is not repeated.
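The upper computer's conversion step can be sketched as rewrapping the first device's dump into a form the second device's software version can import. The field names and version tag below are hypothetical illustrations, not part of the original disclosure.

```python
def convert_first_info(first_info: dict, target_version: str) -> dict:
    """Sketch of the upper computer's data conversion: produce a copy of the
    first information tagged and keyed for the second electronic device's
    software version, ready to be imported."""
    converted = dict(first_info)
    converted["format_version"] = target_version
    # Example conversion: the hypothetical 2.x import format uses the key
    # "scene_info" where the first device dumped "scene_perception".
    if target_version.startswith("2.") and "scene_perception" in converted:
        converted["scene_info"] = converted.pop("scene_perception")
    return converted
```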
In some embodiments, the process of reproducing the image anomaly occurring at the first electronic device may be performed by the first electronic device; or, the process of reproducing the image abnormality of the first electronic device may be completed by controlling the first electronic device by the upper computer. The embodiment of the present application is not limited thereto.
Optionally, on the basis of the embodiment corresponding to fig. 5, after the photographing flow is simulated by using the first information to obtain the processing result of each photographing flow, the method further includes: modifying a flow of the simulated photographing flow using the first information; and performing simulated photographing again using the first information and the modified simulated photographing flow until the result of the simulated photographing is no longer abnormal.
In the embodiment of the application, the second electronic device can complete the reproduction of the image shot by the first electronic device and can also modify the abnormal flow until the result of the simulated shooting is not abnormal. The flow modification may be understood as the flow modification in the corresponding step S411 in fig. 4, and will not be described again. Therefore, labor cost is saved to a large extent, and efficiency of locating and processing image anomalies is improved.
In some embodiments, if the second electronic device fails to reproduce the image captured by the first electronic device, the second electronic device does not reproduce the image anomaly any more, and does not modify the anomaly procedure.
The names of the modules according to the embodiments of the present application may be defined as other names, so that the functions of each module may be achieved, and the names of the modules are not specifically limited.
It should be noted that, the user information (including but not limited to user equipment information, user personal information, etc.) and the data (including but not limited to data for analysis, stored data, presented data, etc.) related to the embodiments of the present application are information and data authorized by the user or fully authorized by each party, and the collection, use and processing of the related data need to comply with the related laws and regulations and standards of the related country and region, and provide a corresponding operation entry for the user to select authorization or rejection.
The image detection method according to the embodiment of the present application has been described above, and the device for performing the method according to the embodiment of the present application is described below. It will be appreciated by those skilled in the art that the methods and apparatus may be combined and cross-referenced, and that the related apparatus provided by the embodiments of the present application may perform the steps in the methods described above.
Fig. 6 is a schematic structural diagram of a chip according to an embodiment of the present application. Chip 610 includes one or more (including two) processors 601, communication lines 602, communication interfaces 603, and memory 604.
In some implementations, the memory 604 stores the following elements: executable modules or data structures, or a subset thereof, or an extended set thereof.
The method described in the above embodiments of the present application may be applied to the processor 601 or implemented by the processor 601. The processor 601 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 601 or by instructions in the form of software. The processor 601 may be a general purpose processor (e.g., a microprocessor or a conventional processor), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and the processor 601 may implement or perform the methods, steps, and logic block diagrams disclosed in the embodiments of the present application.
The steps of the method disclosed in connection with the embodiments of the present application may be embodied directly in the execution of a hardware decoding processor, or in the execution of a combination of hardware and software modules in a decoding processor. The software modules may be located in any well-known storage medium such as RAM, ROM, or electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 604, and the processor 601 reads the information in the memory 604 and performs the steps of the above method in combination with its hardware.
The processor 601, the memory 604 and the communication interface 603 may communicate with each other via a communication line 602.
The image detection method provided by the embodiment of the application can be applied to the electronic equipment with the communication function. The electronic device includes a terminal device, and specific device forms and the like of the terminal device may refer to the above related descriptions, which are not repeated herein.
The embodiment of the application provides an electronic device, including: a processor and a memory; the memory stores computer-executable instructions; the processor executes the computer-executable instructions stored in the memory to cause the electronic device to perform the method described above.
The embodiment of the application provides a chip. The chip comprises a processor for invoking a computer program in a memory to perform the technical solutions in the above embodiments. The principle and technical effects of the present application are similar to those of the above-described related embodiments, and will not be described in detail herein.
The embodiment of the application also provides a computer readable storage medium. The computer-readable storage medium stores a computer program. The computer program realizes the above method when being executed by a processor. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer readable media can include computer storage media and communication media and can include any medium that can transfer a computer program from one place to another. The storage media may be any target media that is accessible by a computer.
In one possible implementation, the computer readable medium may include RAM, ROM, a compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store the desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc, laser disc, optical disc, digital versatile disc (DVD), floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Embodiments of the present application provide a computer program product comprising a computer program which, when executed, causes a computer to perform the above-described method.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The foregoing detailed description of the invention has been presented for purposes of illustration and description, and it should be understood that the foregoing is by way of illustration and description only, and is not intended to limit the scope of the invention.
Claims (13)
1. An image detection method, comprising:
Responding to photographing operation, and caching first information, wherein the first information comprises an image acquired by a camera, metadata information of the image and scene perception information when the image is photographed; the image and the metadata information are used for simulating an original image when the image is shot, the scene perception information is used for simulating an environment when the image is shot, and the metadata information of the image is information for determining the quality of the image;
when the process of processing the image is wrong, storing the first information in the local of first electronic equipment, wherein the first information is used for carrying out abnormal reproduction of the image;
after the storing the first information locally on the first electronic device, the method further comprises:
simulating a photographing flow by using the first information to obtain a processing result of each photographing flow;
In the simulated photographing flow, a photographing strategy is determined based on scene perception information in the first information, and an image acquired by a camera is replaced by an original image obtained based on the image in the first information, the scene perception information and the metadata information;
after the photographing processes are simulated by using the first information and the processing results of the photographing processes are obtained, the method further comprises the steps of:
modifying the flow of the simulated photographing flow by using the first information;
And carrying out simulated photographing again by adopting the first information and the modified simulated photographing flow until the result of the simulated photographing is no longer abnormal.
2. The method of claim 1, wherein the scene-aware information includes one or more of: decision information for deciding an algorithm mode used for photographing, information collected by a sensor for anti-shake processing, information collected by a sensor for brightness sensing, information collected by a sensor for temperature sensing, or information collected by a sensor for distance sensing.
3. The method of claim 1, wherein the first information includes decision information for deciding an algorithm mode used for photographing.
4. A method according to any one of claims 1 to 3, wherein the process of processing the image comprises one or more of: decompression flow, noise reduction flow, sharpness improvement flow, highlight compression flow, face region processing flow, blurring flow, demosaicing flow, dead point removal processing flow, white balance processing flow, automatic exposure AE control flow, automatic focusing AF control flow, sharpening processing flow, color correction flow, color enhancement flow or image compression flow.
5. A method according to any one of claims 1 to 3, wherein before the storing the first information locally on the first electronic device, the method further comprises:
after a first flow is executed, performing anomaly detection on the processed image obtained after the first flow is executed;
and when an abnormality is detected in the processed image, indicating that the first flow is in error, wherein the first flow is any one of the flows for processing the image.
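Claim 5's per-flow check, run abnormality detection on the output of each processing flow and blame the flow that just executed, can be sketched as a simple pipeline wrapper. All names here (`run_pipeline_with_checks`, the flow list, the abnormality predicate) are invented for illustration.

```python
def run_pipeline_with_checks(image, flows, is_abnormal):
    """Run each named processing flow in order; after each one, check the
    intermediate image for an abnormality. If one is detected, report the
    flow that produced it (the erroneous "first flow")."""
    for name, flow in flows:
        image = flow(image)
        if is_abnormal(image):
            return name, image   # this flow is in error
    return None, image           # no abnormality detected
```

Checking after every stage (rather than only on the final JPEG) is what lets the method localize the error to a specific flow such as noise reduction or highlight compression.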
6. The method of claim 5, wherein the first flow comprises one or more of: noise reduction flow, sharpness improvement algorithm flow, highlight compression flow or face region processing flow.
7. A method according to any one of claims 1 to 3, wherein a dump switch is provided in the first electronic device, the first electronic device allowing the data in the cache to be stored locally on the first electronic device if the dump switch is in an on state, and not allowing the data in the cache to be stored locally on the first electronic device if the dump switch is in an off state;
the storing the first information locally on the first electronic device comprises:
storing the first information locally on the first electronic device when the dump switch is in the on state.
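The dump switch of claim 7 is essentially a gate in front of local storage. A minimal sketch, with an invented `Dumper` class standing in for the device's cache-to-storage path:

```python
class Dumper:
    """Hypothetical gate: cached data may be persisted locally only while
    the dump switch is on; otherwise the dump request is refused."""

    def __init__(self, switch_on=False):
        self.switch_on = switch_on
        self.local_store = []

    def try_dump(self, cached_data):
        if not self.switch_on:
            return False          # off state: local storage not allowed
        self.local_store.append(cached_data)
        return True
```

Keeping the switch off by default avoids the storage and privacy cost of dumping every shot; it is turned on only when a photographing problem needs to be reproduced.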
8. A method according to any one of claims 1 to 3, wherein after the storing the first information locally on the first electronic device, the method further comprises:
obtaining a photographed image;
and uploading the first information and the photographed image to a server when an operation for uploading the photographed image to the server is received.
9. A method according to any one of claims 1 to 3, wherein the simulating a photographing flow by using the first information is performed by the first electronic device or a second electronic device.
10. The method of claim 9, wherein the simulating the photographing flow by using the first information is performed by the second electronic device, and wherein before the simulating the photographing flow by using the first information, the method further comprises:
converting the first information into a form compatible with the version of the second electronic device;
and importing the converted first information into the second electronic device.
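The version conversion in claim 10 could be as simple as remapping the dump's field layout to what the second device's software version expects. The sketch below is one hypothetical way to do it; the key map and field names are entirely made up.

```python
# Assumed mapping from the first device's dump schema to the schema the
# second device's software version understands (illustrative only).
V2_KEY_MAP = {"scene_perception": "scene_info", "metadata": "meta"}

def convert_for_version(first_info, key_map=V2_KEY_MAP):
    """Rename top-level fields of the dumped first information so the
    converted dump matches the second device's version, then it can be
    imported there for offline replay of the photographing flow."""
    return {key_map.get(k, k): v for k, v in first_info.items()}
```

Fields with no entry in the map pass through unchanged, so a dump from a matching version converts to itself.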
11. An electronic device, comprising: a processor and a memory;
the memory stores computer-executable instructions;
the processor executing computer-executable instructions stored in the memory to cause the electronic device to perform the method of any one of claims 1-10.
12. A computer readable storage medium storing a computer program, which when executed by a processor implements the method according to any one of claims 1-10.
13. A system on a chip comprising at least one processor and a communication interface, the communication interface and the at least one processor being interconnected by a wire, the at least one processor being configured to execute a computer program or instructions to perform the method of any of claims 1-10.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311826441.2A CN117474926B (en) | 2023-12-28 | 2023-12-28 | Image detection method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117474926A CN117474926A (en) | 2024-01-30 |
CN117474926B true CN117474926B (en) | 2024-09-03 |
Family
ID=89629711
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311826441.2A Active CN117474926B (en) | 2023-12-28 | 2023-12-28 | Image detection method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117474926B (en) |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117135341A (en) * | 2023-01-19 | 2023-11-28 | 荣耀终端有限公司 | Image processing method and electronic equipment |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114845049A (en) * | 2022-04-07 | 2022-08-02 | 展讯通信(上海)有限公司 | Image simulation method, system, medium, and electronic device |
CN114640798B (en) * | 2022-05-09 | 2022-10-04 | 荣耀终端有限公司 | Image processing method, electronic device, and computer storage medium |
CN114721968B (en) * | 2022-06-02 | 2022-09-02 | 龙旗电子(惠州)有限公司 | Test method, test device and storage medium |
CN116051361B (en) * | 2022-06-30 | 2023-10-24 | 荣耀终端有限公司 | Image dimension data processing method and device |
CN116055712B (en) * | 2022-08-16 | 2024-04-05 | 荣耀终端有限公司 | Method, device, chip, electronic equipment and medium for determining film forming rate |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||