Detailed Description
Referring to fig. 1, a schematic structural diagram of an atmosphere lamp device provided in an embodiment of the present application, the atmosphere lamp device includes a controller 1, an atmosphere lamp 2, and an image acquisition interface, where the atmosphere lamp 2 is electrically connected with the controller 1 so as to receive cooperative control from a computer program running in the controller 1 and realize lamp-effect playing.
The controller 1 typically includes a control chip, communication components, and bus connectors; in some embodiments, the controller 1 may also be configured with a power adapter, a control panel, a display screen, etc. as desired.
The power adapter is mainly used for converting mains power into direct current so as to supply power to the whole atmosphere lamp device. The control chip may be implemented by various embedded chips, such as a Bluetooth SoC (System on Chip), a WiFi SoC, an MCU (Micro Controller Unit), a DSP (Digital Signal Processor), and the like, and generally includes a central processor and a memory, where the memory and the central processor are respectively used to store and execute program instructions to implement corresponding functions. Control chips of the above types may integrate the communication component, or the communication component may be additionally configured as required. The communication component may be used for communication with an external device, for example a terminal device such as a personal computer or a smartphone, so that after a user issues various configuration instructions through the terminal device, the control chip of the controller 1 receives the configuration instructions through the communication component, completes the basic configuration, and thereby controls the atmosphere lamp. In addition, the controller 1 can also acquire an interface image of the terminal device through the communication component, or acquire a real-time preview image captured by a camera. The bus connector is mainly used for connecting the atmosphere lamp 2 to a power supply and providing lamp-effect playing instructions over the bus, and is therefore provided with pins corresponding to the power bus and the signal bus; when the atmosphere lamp 2 needs to be connected to the controller 1, the atmosphere lamp is connected with the bus connector through its own corresponding connector.
The control panel typically provides one or more keys for performing on-off control of the controller 1, selecting among preset light-effect control modes, and so on. The display screen can be used for displaying various control information, cooperating with the keys of the control panel to support the man-machine interaction function. In some embodiments, the control panel and the display screen may be integrated into the same touch display screen.
Referring to fig. 2, the atmosphere lamp in fig. 2 is configured as a curtain lamp. The atmosphere lamp 2 includes a plurality of light-emitting lamp strips 21 connected to a bus, each light-emitting lamp strip 21 includes a plurality of serially connected lamp beads 210, and the lamp beads 210 of each light-emitting lamp strip 21 are generally equal in number and arranged at equal intervals. In use, the atmosphere lamp 2 serving as a curtain lamp is usually configured such that the light-emitting lamp strips 21 are unfolded according to the layout shown in fig. 2, so that all the lamp beads of all the light-emitting lamp strips 21 are arranged in an array to form a lamp bead matrix structure. When the lamp beads emit light cooperatively, the whole matrix structure forms a display frame 4; a certain pattern effect can be formed within the display frame 4 when playing the lamp effect, a static lamp effect being formed when a single pattern is displayed statically, and a dynamic lamp effect being formed when patterns are switched in time sequence.
Each light-emitting lamp strip 21 can be formed by connecting a plurality of lamp beads 210 in series, each lamp bead 210 being one light-emitting unit. The lamp beads 210 of the same light-emitting lamp strip 21 receive working current through the same group of cables connected to the bus, and in terms of electrical connection the lamp beads 210 of one strip may be connected in parallel. In one embodiment, the light-emitting lamp strips 21 of the same lamp bead matrix structure may be disposed at equal intervals along the bus direction, with the lamp beads 210 of the respective strips corresponding in number and position, so that when the light effect is viewed from a distance the whole display frame 4 plays a role similar to a screen and can form a pattern effect visually perceptible to the human eye.
Similarly, referring to fig. 3, the atmosphere lamp in fig. 3 is laid out around the display of the terminal device to form a frame lamp, which may be formed by a single light-emitting lamp strip or by multiple light-emitting lamp strips connected to the bus. The structure and communication mechanism of the light-emitting lamp strip adopted by the frame lamp, and of the lamp beads therein, are the same as those of the curtain lamp. In the frame lamp arrangement, all the lamp beads surround the display, and the display frame 4 formed on the basis of the lamp bead matrix structure can still be regarded as a whole: no lamp beads are arranged in the central part of the display frame 4, only along its four sides, so that when the lamp effect is played, a light atmosphere effect is scattered inside and outside the range of the display frame 4.
The controller 1 of the atmosphere lamp device realizes the working control of the whole device and is responsible for its internal and external communication. The controller 1 also drives the image acquisition interface, through which an environment reference image is acquired frame by frame; the environment reference image may be an interface image of the terminal device or a real image of a physical space. A lamp-effect playing instruction of the corresponding frame is then generated according to the environment reference image of each frame, and the curtain lamp is controlled through that instruction to play the lamp effect of the corresponding frame.
Each lamp bead 210 of each light-emitting lamp strip 21 of the atmosphere lamp 2 is also provided with a corresponding control chip, which may be of one of the types disclosed above or of a more economical type. Its main function is to extract the light-emitting color value corresponding to its lamp bead 210 from the lamp-effect playing instruction and to control the light-emitting element in the lamp bead 210 to emit light of the corresponding color. The light-emitting element may be an LED.
The image acquisition interface may be either a hardware interface or a software interface implemented in the controller 1. In the case of a hardware interface, a camera may be used: when the camera is aimed at a target picture, for example the desktop of a terminal device, or at a physical space environment, images are acquired at a certain frame rate and interface images can thereby be obtained. In the case of a software interface, the image acquisition interface may be an image acquisition program implemented on the controller 1 side by using a graphics infrastructure technology provided by the operating system of the terminal device, where the controller 1 is connected to the terminal device through a cable such as an HDMI or Type-C connection line, so that the interface image of the terminal device can be continuously obtained with the support of the graphics infrastructure technology; of course, if the controller 1 and the terminal device have pre-established a wireless screen-mirroring protocol, the controller 1 may also acquire the interface image of the terminal device by wireless communication. The graphics infrastructure technology varies with the type of operating system; as an example, the Windows operating system provides a corresponding technology, Microsoft DirectX Graphics Infrastructure (DXGI for short), which may implement this function.
Therefore, when the image acquisition interface is responsible for acquiring the environment reference image, the specific environment from which the image is acquired can be flexibly chosen by the user. For example, when the image acquisition interface is a camera, the user can aim the camera at the graphical user interface of a computer to capture a corresponding interface image as the target picture for light-effect playing, so that the atmosphere lamp 2 generates the corresponding light effect according to the interface image; the user can also aim the camera at a physical space environment such as an outdoor scene and capture a live-action image as the environment reference image, so that the atmosphere lamp 2 plays a corresponding lamp effect according to the live scene.
When the atmosphere lamp device is powered on, the control chip of the controller calls and executes the computer program from the memory, initializes the atmosphere lamp through the default initialization flow of the computer program, and completes the driving configuration of the atmosphere lamp and other hardware.
In one embodiment, when the atmosphere lamp is started, the controller may first send a self-check instruction to the atmosphere lamp, driving each lamp bead in each light-emitting lamp strip to return its position information within the strip. Each lamp bead is provided with a corresponding control chip for data communication with the control chip in the controller, so that the feature information of the lamp bead can be serially concatenated with that of the other lamp beads in sequence according to a serial communication protocol, thereby representing the position information of each lamp bead. The serial communication protocol executed between the controller and the lamp beads may be any of IIC (Inter-Integrated Circuit bus), SPI (Serial Peripheral Interface), or UART (Universal Asynchronous Receiver-Transmitter). After the controller obtains the result data returned by the self-check of each lamp bead from the bus, it parses the result data and can determine the position of each lamp bead in the display frame 4 presented by the whole atmosphere lamp according to the order of the feature information of each lamp bead in the result data. Each lamp bead thus serves as a light-emitting unit and can be understood as a basic pixel; when the controller subsequently constructs a lamp-effect playing instruction, it can set the corresponding light-emitting color value of each basic pixel as required according to the position information of each lamp bead.
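The order-based position determination described above can be sketched in outline. The following assumes, purely for illustration, that each bead's feature ID arrives on the bus in daisy-chain order and that all strips carry the same number of beads; the function name `parse_self_check` is hypothetical, not from the source:

```python
def parse_self_check(result_ids, beads_per_strip):
    """Map each bead's feature ID, received in serial daisy-chain
    order, to a (strip_index, bead_index) position.

    result_ids: feature IDs in the order they arrived on the bus.
    beads_per_strip: assumed uniform bead count per strip.
    """
    positions = {}
    for k, feature_id in enumerate(result_ids):
        strip = k // beads_per_strip   # which strip on the bus
        bead = k % beads_per_strip     # position within that strip
        positions[feature_id] = (strip, bead)
    return positions

# Example: two strips of three beads each
pos = parse_self_check(["a", "b", "c", "d", "e", "f"], beads_per_strip=3)
```

With this mapping in hand, a later lamp-effect instruction can address each bead as a basic pixel by its (strip, bead) coordinate.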
After initialization is completed, the controller can continuously acquire environment reference images as target pictures through the image acquisition interface, and perform color taking on the target picture to determine the light-emitting color value of each light-emitting unit in the display frame. To this end, the display frame of the atmosphere lamp can be divided into a plurality of unit frames and the target picture into a plurality of color-taking images, the color-taking images corresponding one to one with the unit frames; a corresponding color value is determined according to the dominant tone of each color-taking image, and the light-emitting color values of all light-emitting units in the unit frame corresponding to that color-taking image are then generated according to the color value.
Accordingly, in one embodiment, the atmosphere lamp device may preset frame configuration information corresponding to the atmosphere lamp, in which the granularity of subdivision of the plane of the whole display frame 4 is specified, so that the whole display frame 4 can be divided into the plurality of unit frames 40 under the constraint of the frame configuration information. Typically, as shown in fig. 2, each unit frame 40 obtained by the division may span multiple adjacent light-emitting lamp strips and cover multiple lamp beads on each of those strips. The granularity of dividing the display frame 4 may be set flexibly in the frame configuration information. For example, division of the unit frames 40 according to a nine-square grid, a sixteen-square grid, etc. may be specified, expressed as a total number of unit frames such as 9 or 16, so that the unit frames 40 are divided over the whole display frame 4 according to that value and the total number of lamp beads in the atmosphere lamp. Alternatively, the number of light-emitting lamp strips spanned by each unit frame (the column count) and the number of lamp beads spanned on each strip (the row count) may be specified, and the lamp beads 210 covered by each unit frame 40 determined according to those row and column counts. It is also possible to set only the total number of unit frames 40 to be generated in the frame configuration information.
The frame configuration information can be set through the man-machine interaction function realized on the controller, or set on a terminal device that has established a data communication connection with the controller and transmitted to the controller; the controller stores the frame configuration information in the memory of its control chip and calls it as required.
Considering that the light-emitting lamp strips of the atmosphere lamp can be flexibly added or removed as required, in one embodiment the controller may first detect the number of light-emitting lamp strips, more specifically the total number of strips of the whole atmosphere lamp, and thereby grasp the total number of basic pixels of the whole display frame. Suppose the frame configuration information sets the value 16, meaning the whole display frame is to be divided into a 4*4 matrix of unit frames: the controller calculates the column count of each unit frame according to the number of light-emitting lamp strips, calculates the row count according to the number of lamp beads per strip, determines the rows and columns occupied by each unit frame, and establishes mapping relationship data between each unit frame and the light-emitting lamp strips and lamp beads corresponding to its row and column counts, thereby realizing the division into unit frames and obtaining the related data of each unit frame. By dividing the display frame in this manner, the number of light-emitting lamp strips and lamp beads can be changed while the pixel density of each unit frame is flexibly adjusted: the controller adaptively sets the pixel density of the unit frames according to the increase or decrease of light-emitting lamp strips in the atmosphere lamp, the lamp-effect playing logic remains unchanged, and the lamp effect can be played normally even after light-emitting lamp strips are added or removed.
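The adaptive division just described can be sketched as follows. This is a minimal illustration assuming the total number of unit frames is a perfect square (e.g. 16 gives a 4*4 grid) and that strips and beads divide evenly among the grid cells; the function name `divide_unit_frames` is illustrative only:

```python
import math

def divide_unit_frames(num_strips, beads_per_strip, total_frames):
    """Divide the display frame into a square grid of unit frames.

    Returns, for each unit frame, the range of strips (columns)
    and the range of beads per strip (rows) that it covers.
    """
    side = math.isqrt(total_frames)        # e.g. 4 for 16 frames
    cols = num_strips // side              # strips per unit frame
    rows = beads_per_strip // side         # beads per unit frame
    frames = []
    for r in range(side):
        for c in range(side):
            frames.append({
                "strips": range(c * cols, (c + 1) * cols),
                "beads": range(r * rows, (r + 1) * rows),
            })
    return frames
```

Adding or removing strips only changes `num_strips`, so each unit frame's pixel density adapts while the playing logic stays fixed, matching the behaviour described above.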
In another embodiment, the number of light-emitting lamp strips covered by each unit frame and the number of lamp beads covered on each strip, for example 4*4 lamp beads forming one unit frame, can be specified in the frame configuration information. When light-emitting lamp strips of the atmosphere lamp are added or removed, the next division of the whole display frame is performed according to this established specification, so that the number of strips and lamp beads covered by each unit frame remains unchanged while the unit frames themselves increase or decrease along with the strips; the density of unit frames in the whole display frame thus rises or falls accordingly, and normal lamp-effect playing of the atmosphere lamp is ensured.
Since the atmosphere lamp is configured according to the number and layout of the assembled light-emitting lamp strips, and although figs. 2 and 3 show that the unit frames of the curtain lamp and the frame lamp can be set by the same division manner, the frame lamp can be treated linearly on the basis of a single light-emitting lamp strip, so its unit frames can be divided in a simpler manner. For example, where the controller knows the total number of lamp beads and the frame configuration information, the total number of lamp beads is simply divided equally and the unit frames in the two directions are determined rapidly. A simple implementation divides the whole display frame into four unit frames: upper left, upper right, lower left, and lower right; correspondingly, the environment reference image can later be divided into four area images, which is fast and efficient.
Of course, the frame configuration information may also be defined in other manners; for example, it may be set according to the rule by which the environment reference image serving as the target picture is partitioned. In any case, the rule for partitioning the display frame of the atmosphere lamp is generally kept consistent with the rule for partitioning the target picture, so as to ensure that each color-taking image determined after partitioning the target picture corresponds to one unit frame in the atmosphere lamp. This makes it convenient to project the color value corresponding to each color-taking image onto its corresponding unit frame, realizing projection of the target picture's overall light atmosphere onto the whole display frame of the atmosphere lamp.
According to the product architecture and working principle of the atmosphere lamp device above, the color taking method of the atmosphere lamp device can be realized as a computer program product. The computer program product is stored in the memory of the control chip in the controller of the atmosphere lamp device; the central processing unit in the control chip invokes it from the memory and runs it, and, according to the environment reference image acquired by the image acquisition interface, controls the atmosphere lamp to play the corresponding lamp effect.
Referring to fig. 4, in one embodiment, the color taking method of the atmosphere lamp device of the present application is mainly implemented on the controller side of the atmosphere lamp device and executed by the control chip of the controller, and includes:
step S5100, dividing the target picture into a plurality of color-taking areas around its four sides, and determining the area image corresponding to each color-taking area;
in the working process of the controller, environment reference images can be continuously obtained through the image acquisition interface at fixed time intervals; the environment reference image may be an interface image or a live-action image, the interface image being generated by the graphical user interface of the terminal device. The environment reference image is used to determine the light effect to be played by the atmosphere lamp, to generate the corresponding lamp-effect playing instruction, and to control the atmosphere lamp to play that effect. When one environment reference image has been used as a target picture to generate and play its corresponding light effect, the next environment reference image can be acquired as the new target picture to generate and play the light effect of the new frame, so that the light-effect sequence played by the whole atmosphere lamp stays essentially synchronized with the image stream acquired by the image acquisition interface, achieving the effect of extending the ambient light atmosphere of an external image into the atmosphere lamp.
In one embodiment, if the acquired environmental reference image is not bitmap data, it may be converted to bitmap data first for pixel-based operations.
In one embodiment, after the controller obtains an environment reference image as the target picture, if the target picture is oversized it can be compressed to a specified size in order to reduce the computational load of the control chip of the controller. For example, when the environment reference image is an interface image, the display resolution of conventional terminal devices is generally 1920×1080 or above, while the pixel density formed by the light-emitting units of the atmosphere lamp is far lower than the size of the interface image; the target picture can therefore be compressed to a desired size, for example a resolution of 320×180, and the compressed image is used whenever the target picture is subsequently invoked.
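The compression step can be sketched with a simple nearest-neighbour downscale over a bitmap held as rows of (r, g, b) tuples. This is a crude stand-in under stated assumptions; a real implementation might prefer area averaging, and the function name `downscale` is illustrative:

```python
def downscale(bitmap, dst_w, dst_h):
    """Nearest-neighbour downscale of a bitmap given as a list of
    rows of (r, g, b) tuples, standing in for the compression of
    the target picture (e.g. 1920x1080 down to 320x180)."""
    src_h = len(bitmap)
    src_w = len(bitmap[0])
    return [
        [bitmap[y * src_h // dst_h][x * src_w // dst_w]
         for x in range(dst_w)]
        for y in range(dst_h)
    ]
```

All subsequent steps (region division, black-band detection, color taking) would then operate on the smaller compressed bitmap.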
After the target picture is determined, it can be subjected to region division, and the division rule can be described in the frame configuration information so as to be invoked directly. The purpose of the division rule is to divide both the target picture and the display frame of the atmosphere lamp into a plurality of regions for color extraction, that is, into a plurality of color-taking areas; the frame configuration information can therefore define, through the division rule, how many color-taking areas lie on each of the four sides of the target picture, this definition usually being made with respect to opposite sides of the target picture. According to the various cases disclosed above, such as the nine-square grid and the sixteen-square grid, all four sides of the target picture are divided, so that a plurality of color-taking areas are obtained on each outer side of the target picture. The image content covered by each color-taking area is its corresponding area image.
In this application, when the target picture is divided into regions, as shown in fig. 5, attention is paid to the division of the outer sides of the target picture: a plurality of color-taking areas are required on each outer side, and each outer side is generally divided equally along its length. The central portion enclosed by the color-taking areas of the outer sides may be treated as a single central region, or divided into a plurality of central unit regions according to the equal-division specification of the sides. In either case, the region at the center of the target picture, typically the presentation region of the main image content, will not contain a black band, and its image content is not subject to interference from boundary black pixels, so the black-border content to be excluded from the target picture lies only at its edges. Therefore, when the target picture is divided, the division of its four sides is the focus while the central region is not; this targeted treatment reduces the amount of calculation when determining the black bands of the target picture and saves the computational overhead of the control chip of the controller.
In general, to facilitate subsequent multi-thread concurrent detection of each color-taking area for determining its black region, the target picture is divided by an equal-division rule, ensuring that the color-taking areas obtained on the four sides of the target picture share the same size specification, with identical length and width among them, so that a standardized thread can be used to determine the black region of each color-taking area.
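The equal division of the outer sides can be sketched as follows. For brevity only the top and bottom sides are generated; each color-taking area is expressed as a window (x0, y0, x1, y1) of corner pixel coordinates, and the band depth parameter is an assumption of this sketch, not a value from the source:

```python
def side_regions(width, height, n_per_side, band):
    """Split the top and bottom sides of a width x height target
    picture into n_per_side equally sized color-taking areas, each
    a window (x0, y0, x1, y1). `band` is the assumed depth of each
    area, in pixels, measured toward the picture centre."""
    step = width // n_per_side
    top = [(i * step, 0, (i + 1) * step - 1, band - 1)
           for i in range(n_per_side)]
    bottom = [(i * step, height - band, (i + 1) * step - 1, height - 1)
              for i in range(n_per_side)]
    return {"top": top, "bottom": bottom}
```

Because every window has the same size, one standardized detection routine (or thread) can be applied uniformly to each area, as the paragraph above requires.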
Step S5200, detecting, in each area image, the black region adjoining the outer side of the target picture, and determining the edge black band corresponding to each target outer side according to the black regions of the plurality of area images belonging to that same target outer side;
for each target outer side, its edge black band must be determined so that the pixels belonging to the edge black band can be excluded from the area images located on that outer side; the remaining image content is taken as the color-taking image that effectively represents the dominant tone.
An outer side for which an edge black band is to be determined is referred to as a target outer side, and which outer sides are targets can be set according to actual conditions. For example, when the environment reference image adopted as the target picture is an interface image of the terminal device, pictures such as videos or games may differ in aspect ratio from the graphical user interface of the terminal device; for instance, when 21:9 content is displayed in a graphical user interface corresponding to a 16:9 display, edge black bands appear at the top and bottom edges of the picture, as shown in fig. 5. In this case, the target outer sides may be determined as the top and bottom sides of the target picture. The target outer side may likewise be a single outer side, or all four outer sides, as determined by actual need. For each target outer side, the corresponding edge black band can be determined using exactly the same business logic in this step.
When determining the edge black band of a target outer side, the black region of each area image is determined first, and the corresponding edge black band is then determined based on the black regions of all the area images of that target outer side.
When detecting the black region of each area image, recognition efficiency can be improved by setting a recognition starting point. Specifically, starting from a corner point on the outermost edge of the area image, on the target outer side, each pixel can be examined to identify whether its color value represents black; taking the row of pixels of that outermost edge containing the corner point as a base line, consecutive fully black rows are then determined inward, so that the black region in the area image is found. It will be appreciated that this black region is the part of the area image closest to the corresponding target outer side.
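The inward row-by-row scan can be sketched as below for a top-side area image; a bottom-side image would simply be scanned in reverse row order. The threshold value is an assumption of this sketch (the source does not specify how "black" is decided):

```python
BLACK_THRESHOLD = 16  # assumed: channel values at or below this count as black

def is_black(pixel):
    """Assumed black test on an (r, g, b) pixel."""
    return all(c <= BLACK_THRESHOLD for c in pixel)

def black_depth_from_top(region):
    """Scan an area image (a list of rows of (r, g, b) pixels) from
    its outermost row inward, counting consecutive fully black rows.
    The count is the depth of the black region on that side."""
    depth = 0
    for row in region:
        if all(is_black(p) for p in row):
            depth += 1
        else:
            break
    return depth
```

The scan stops at the first row containing a non-black pixel, so only the pixels near the outer side are ever examined, which keeps the per-area cost low.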
Since the black region is detected with the area image as the unit, and the total number of pixels of a single area image is small, identifying the black region of one area image consumes very little computing power and little buffer space; the performance requirement on the control chip is therefore reduced, and the implementation cost of the atmosphere lamp device is saved.
In one embodiment, provided that the control chip of the controller supports multi-thread operation, the process of identifying the black region of a single area image is implemented in advance as the instruction set of a single thread. For each target outer side whose black regions are to be determined, a corresponding thread is created for each of its area images and the threads are run concurrently, so that the time for identifying the black regions of all area images is significantly shortened and the control chip determines the edge black band of each target outer side faster.
After the black region of each area image of a target outer side is determined, each black region may be represented in coordinate form, for example in the window form (x0, y0, x1, y1) determined relative to the coordinate system of the target picture, where (x0, y0) are the coordinates of the upper-left corner pixel and (x1, y1) those of the lower-right corner pixel. For ease of understanding, the direction along the target outer side is defined as the length direction of the black region, and the direction perpendicular to it as the width direction. Accordingly, for one target outer side, the edge black band is determined by taking as its width the width of the narrowest black region among the area images of that side, and as its length the length of the target outer side. The edge black band may likewise be expressed in window form by the coordinates of its upper-left and lower-right corner pixels.
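The minimum-width rule can be sketched as follows for a top side, given the black-region depths detected in each of that side's area images; other sides are analogous, and the `None` return for a missing band is a convention of this sketch only:

```python
def edge_black_band(black_depths, side_length):
    """Combine the black-region depths of one target outer side's
    area images into that side's edge black band, expressed as a
    window (x0, y0, x1, y1) for a top side.

    The band width is the minimum depth, since the common black
    band cannot be wider than its narrowest detected region; the
    band length is the full length of the side."""
    width = min(black_depths)
    if width == 0:
        return None  # no black band common to the whole side
    return (0, 0, side_length - 1, width - 1)
```

Taking the minimum guards against dark image content in one region being mistaken for a wider band than actually spans the side.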
In one embodiment, the edge black bands of the target outer sides determined from one target picture may be stored as historical edge black bands for use with the next target picture. When a condition is met, for example when the next target picture is detected not to match the historical edge black bands, new edge black bands are generated from that next target picture. Alternatively, on a timed-trigger basis, the historical edge black bands determined from one target picture are used for a period of time, and when the timer fires, new edge black bands are generated from the target picture being processed at that moment and stored as the historical edge black bands for subsequent use. Both approaches further save the system overhead of the control chip and improve operating efficiency.
Step S5300, determining, for each area image belonging to the same target outer side, the image content outside the corresponding edge black band as the color-taking image of that area image;
when the corresponding edge black band is determined for any target outer side, the edge black band can be used to cut away, in each area image on that outer side, the image content overlapping the edge black band; the image content not overlapping the edge black band becomes the effective image content and serves as the color-taking image corresponding to that area image.
In one embodiment, each area image may be cropped according to the edge black band, so that the variable representing the area image no longer stores the color values of the pixels belonging to the edge black band portion, reducing the data volume of the variable and further reducing memory usage.
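Such cropping can be sketched as follows. The row-major list-of-rows representation and the function name are illustrative assumptions; an actual implementation would operate on whatever pixel buffer the control chip uses.

```python
# Sketch: crop the edge-black-band rows off an area image so the variable
# stores only effective pixels. The image is a row-major list of RGB rows.

def crop_band(area_image, band_width, side="top"):
    """Drop the rows overlapped by a top-side edge black band."""
    if side == "top":
        return area_image[band_width:]
    raise NotImplementedError("only the top side is sketched here")

image = [[(0, 0, 0)] * 4,        # row inside the edge black band
         [(120, 40, 40)] * 4,    # effective content
         [(130, 50, 50)] * 4]
cropped = crop_band(image, band_width=1)
print(len(cropped))  # 2
```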
It will be appreciated that the edge black band of each target outer side is essentially a rectangular black region spanning the entire target outer side of the target picture. Even if a black pixel appears in a portion of the target picture outside this rectangle, it will not be treated as an invalid pixel; rather, it remains an effective pixel and constitutes one pixel of the image content of the corresponding color taking image. Therefore, once the edge black band is accurately determined, whether a black pixel in the target picture is an effective pixel of the color taking image can be reliably distinguished, ensuring that the image content of the color taking image correctly reflects the original picture.
Step S5400, setting, according to each color taking image, the light emission color value of each light emitting unit in the unit frame of the atmosphere lamp device corresponding to that color taking image.
After the color taking image corresponding to each area image on each target outer side has been determined in the above manner, the light emission color value of each light emitting unit in the unit frame of the atmosphere lamp corresponding to that color taking image can be determined based on the color taking image.
When the light emission color value of the lamp beads in each unit frame is determined, it is generated with reference to the dominant hue of the color taking image mapped to that unit frame, so that each unit frame can generally reflect the dominant hue of its corresponding color taking image. In this way, the light atmosphere of the whole environment reference image is projected onto the whole display frame of the atmosphere lamp, and the light effect played by the atmosphere lamp effectively extends the light atmosphere of the environment reference image.
It should be noted that, in the case where the atmosphere lamp is a frame lamp, since no lamp beads are disposed in the central portion of the display frame, the correspondence between the unit frames of the central portion and the area images need not be considered; alternatively, each area image of the central portion of the whole environment reference image may be merged into the nearest side area image, so that a mapping relationship is established between the side area images and their corresponding unit frames.
After the color value of the dominant hue of a color taking image is determined, the light emission color value of each light emitting unit, namely each lamp bead, covered by the unit frame corresponding to that color taking image can be further determined. As long as the light emission color value set for each lamp bead in the unit frame remains correlated with the color value of the dominant hue and is kept within a reasonable variation range relative to it, the lamp bead light effect projects the dominant hue of the area image and achieves the atmosphere extension effect.
In determining the dominant hue of an area image, the dominant color constituting the area image may be taken as its dominant hue. In one embodiment, a hue analysis may first be performed on the area image to determine a plurality of hue intervals; the total number of pixels in each hue interval is counted, the pixels of the hue interval with the largest total are taken as the main pixels reflecting the dominant hue, and the dominant hue is determined from the color values of these main pixels. In another embodiment, since highlights and dark portions of an image are generally not regarded as hues expressed by the image, the pixels corresponding to the dark portions and/or highlights of the area image may be filtered out in advance, and the dominant hue of the area image determined from the color values of the remaining pixels. Dominant hues determined in these ways identify a significant color within a complex hue distribution and better reflect the dominant color that the corresponding area image is meant to express, thereby more accurately reflecting its light effect. When determining the dominant hue from pixels, the average of the color values of all relevant target pixels may be taken, or extreme values within a preset range may first be removed from the color values of the target pixels before averaging; the final average serves as the color value of the dominant hue of the area image.
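The filter-then-average variant above can be sketched as follows. The threshold values and the fallback to the full pixel set are illustrative assumptions of this sketch, not values fixed by the application.

```python
# Sketch of one dominant-hue strategy: discard dark and highlight pixels,
# then average the color values of the remaining pixels.

def dominant_hue(pixels, dark=30, highlight=235):
    """Average RGB of pixels that are neither dark nor highlight."""
    kept = [p for p in pixels
            if not all(c <= dark for c in p)         # drop dark pixels
            and not all(c >= highlight for c in p)]  # drop highlights
    if not kept:
        kept = pixels  # fall back to the full pixel set
    n = len(kept)
    return tuple(sum(p[i] for p in kept) // n for i in range(3))

pixels = [(10, 10, 10), (250, 250, 250), (200, 40, 40), (180, 60, 60)]
print(dominant_hue(pixels))  # (190, 50, 50)
```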
After the light emission color values of the lamp beads of each unit frame in the whole display frame of the atmosphere lamp have been determined in the above manner, according to the default service logic, a light effect playing instruction corresponding to the environment reference image of the current frame is generated from the light emission color value and position information of each lamp bead and transmitted to each light emitting lamp strip of the atmosphere lamp. The control chip of each lamp bead in each lamp strip parses out the light emission color value corresponding to that lamp bead according to the serial communication protocol and controls the corresponding light emitting element to emit light of the corresponding color, thereby realizing the cooperative light effect playing of the current frame.
From the above embodiments, it is appreciated that the present application has various advantages over the prior art, including but not limited to:
First, the present application achieves accurate region-by-region color taking. A plurality of color taking areas is determined around the four sides of the target picture for each outer side to obtain corresponding area images; the black region in each area image is resolved, and the black regions of all area images on the same target outer side are then used jointly to determine the edge black band of that target outer side. The black region of each area image on the target outer side is eliminated according to the edge black band to obtain the color taking image of each area image, and the light emission color values of the light effect are determined from these color taking images. The control granularity for identifying the edge black band is thus at the level of individual area images, black edges in the target picture are accurately and effectively removed, and color taking is performed on black-free color taking images, improving the vividness represented by the determined light emission color values, raising the overall brightness of the light effect, and yielding better light effect quality.
Second, the present application can be implemented with lower computational effort. The hardware conditions of the atmosphere lamp device are fully considered in each black-edge elimination link of the color taking process: for example, when the edge black band is identified, the granularity is that of an area image, so black regions are identified on small images, and the edge black band of a target outer side is quickly determined by fusing all black regions on the same target outer side. Compared with determining the edge black band by traversing the whole image, the computational requirement is lower, the execution efficiency is higher, and the identification process is faster. Consequently, even with relatively limited computing power, the control chip of the atmosphere lamp device can still work efficiently and smoothly, realize color taking quickly, and ensure smooth playing of the light effect.
In addition, the present application can improve the image quality of the light effect. It achieves accurate region-by-region color taking, operates efficiently within the atmosphere lamp device, and further refines the color taking granularity of the light effect to correspond to each unit frame. As a result, the light effect played according to the present application has a fine and smooth picture with bright, clear colors, and the overall image quality is markedly improved.
On the basis of any embodiment of the method of the present application, referring to fig. 6, performing region segmentation around the four sides of the target picture to obtain a plurality of color taking areas on each outer side, and determining the corresponding area image of each color taking area, includes:
Step S5110, obtaining size information of a target picture, wherein the size information comprises a transverse size and a longitudinal size;
The size information of the target picture can be determined by counting its pixels in the length and width directions. When the controller obtains a target picture through the image acquisition interface, or compresses it to a preset size specification, it can obtain the corresponding transverse and longitudinal dimensions from the length and width directions of the image content of the target picture.
Step S5120, dividing the transverse dimension and the longitudinal dimension equally according to the preset number, and correspondingly determining the transverse unit dimension and the longitudinal unit dimension to form region definition information;
The numbers of partitions corresponding to the horizontal and vertical directions of the target picture can be preset in the picture configuration information; the transverse dimension and the longitudinal dimension are then divided by their respective partition numbers to determine the corresponding transverse unit size and longitudinal unit size. Together, these two sizes define a color taking frame of standardized specification and constitute the region definition information.
Step S5130, according to the transverse unit size and the longitudinal unit size in the region definition information, carrying out region segmentation around the four sides of the target picture to obtain corresponding color taking regions and segmented residual central regions;
After the color taking frames of standardized specification are obtained, any corner point of the target picture is taken as a starting point, and the window-form coordinates of each color taking area on each outer side of the target picture are calculated, thereby performing surrounding region segmentation on the four sides of the target picture to obtain the color taking areas and segment out the remaining central region. The central region is generally composed of one or more regions corresponding to the color taking frames and can be treated as a single region.
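Steps S5110 to S5130 can be sketched for one outer side as follows. The function name, partition counts, and top-side-only layout are illustrative assumptions; the application covers all four sides.

```python
# Sketch: divide the picture's horizontal and vertical dimensions by
# preset partition counts to get the unit sizes, then lay the window-form
# color taking areas along the top outer side.

def top_side_regions(width, height, nx, ny):
    ux, uy = width // nx, height // ny  # transverse / longitudinal unit sizes
    return [(i * ux, 0, (i + 1) * ux, uy) for i in range(nx)]

print(top_side_regions(320, 240, nx=4, ny=4))
# [(0, 0, 80, 60), (80, 0, 160, 60), (160, 0, 240, 60), (240, 0, 320, 60)]
```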
Step S5140, taking the image content of each color taking area as the corresponding area image for subsequent processing, and directly taking the image content of the central area as a color taking image for subsequent processing.
For each determined color taking area on each outer side, the image content in the determined color taking area is a corresponding area image, and for each area image, the subsequent processing is still performed according to the processes disclosed in the embodiments of the present application, for example, step S5200 to step S5400, so that the color taking image can be determined by removing the corresponding edge black band according to each area image, and then the luminous color value of the unit frame in the atmosphere lamp corresponding to the color taking image is determined.
For the image content of the central area, since the black edge processing is not required, flexible processing can be performed according to different states of the atmosphere lamp, for example, in one embodiment, the atmosphere lamp is a frame lamp, the image content of the central area may not be required to be processed, in another embodiment, the atmosphere lamp is a curtain lamp, the image content of the central area may be directly used as a color-taking image, and step S5400 is directly performed to determine the light-emitting color value of the corresponding unit picture.
According to the above embodiment, the region segmentation of the target picture can be completed rapidly according to the partition numbers set for its transverse and longitudinal directions, so that each area image is obtained quickly with an extremely low computational load and extremely high efficiency. Moreover, the identification processing of the image content of the central region of the target picture is omitted from the intermediate process, saving the computing power of the control chip and optimizing computing efficiency, an effect that is even more pronounced for embedded chips with limited computing power.
On the basis of any embodiment of the method of the present application, referring to fig. 7, detecting a black region belonging to a target outer side of the target picture in each region image, determining an edge black band corresponding to the target outer side according to the black regions of a plurality of region images belonging to the same target outer side, including:
Step S5210, determining two opposite outer sides in the target picture as target outer sides, where the two outer sides are a top side and a bottom side in a display frame of the target picture;
In this embodiment, the key point is that two opposite outer sides of the target picture, namely the top side and the bottom side, are set as the target outer sides that need black edge processing, so as to simplify the complexity of removing black edge interference from the target picture.
Step S5220, for each region image in the target outer side, performing pixel traversal from a pixel of an outer corner at the outermost side of the region image, and identifying a black pixel connected domain formed by extending and expanding the width inwards by taking the outermost side as a base line in the region image;
When the black region of each area image on a target outer side needs to be determined, the area image may be traversed starting from the pixel at its outermost corner point, and the black pixel connected domain identified that is formed by extending its width inward, away from the target outer side, taking as the base line the outermost pixel set of the area image, which includes the outermost corner point.
Specifically, during the traversal, it is determined for each pixel whether its color value belongs to the color values representing a black hue; if so, the pixel is regarded as an effective pixel of the black pixel connected domain, and otherwise it falls outside the black pixel connected domain, and so on until the whole black pixel connected domain is determined. As a member of an effective black pixel connected domain, an entire base line, that is, the whole pixel set of the area image along the length direction of the target outer side, may be required to consist of black pixels, and such base lines constitute the inward extent of the black pixel connected domain. As for determining whether a pixel is a black pixel, that is, whether its color value represents a black hue, the standard black color value RGB(0, 0, 0) may serve as the reference, with a certain tolerance range allowed; for example, when the color value of a pixel does not exceed RGB(10, 10, 10), it may be recognized as a black pixel and thus fall within the corresponding black pixel connected domain.
In one embodiment, the tolerance range may be further enlarged: when the number of non-black pixels in a base line does not exceed a set proportion of the total number of pixels in that base line, the base line is still considered to lie within the black pixel connected domain. For example, when the set proportion is 0.1 and the base line has 20 pixels, the base line is regarded as lying within the black pixel connected domain even if the color values of 2 of its pixels correspond to red.
By setting the corresponding tolerance ranges based on the pixel color values and/or based on the number of non-black pixels in the baseline, the tolerance of the edge black band to various possible color changes of the individual pixels can be further expanded, and the situation that the pixels which should belong to the edge black band are mistaken for effective pixels of the color taking image is avoided.
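The base-line traversal with both tolerance rules can be sketched as follows. The channel tolerance 10 and the non-black proportion 0.1 are the example values given above, not values mandated by the application, and the function names are illustrative.

```python
# Sketch: starting from the outermost base line of a top-side area image,
# count how many whole rows still qualify as black under the tolerances;
# that count is the width of the black pixel connected domain.

def black_region_width(rows, tol=10, proportion=0.1):
    width = 0
    for row in rows:  # rows[0] is the outermost base line
        # A pixel is non-black when any channel exceeds the tolerance.
        non_black = sum(1 for p in row if any(c > tol for c in p))
        if non_black > proportion * len(row):
            break  # this base line is no longer part of the black band
        width += 1
    return width

black = [(0, 0, 0)] * 10
rows = [black, black, [(200, 30, 30)] * 10]
print(black_region_width(rows))  # 2
```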
Step S5230, taking the minimum width among the black pixel connected domains of the area images belonging to the same target outer side as the width of the edge black band of that target outer side, taking the length of the target outer side as the length of the edge black band, and generating the coordinate information of the edge black band with respect to the target picture.
After each area image of each target outer side has determined its corresponding black pixel connected domain, that is, its black region, it is easy to understand that, among the black regions on the same target outer side, the black region with the smallest width generally represents the real edge black band. Accordingly, the minimum width among the black regions of the area images is taken as the width of the edge black band of the target outer side, the original length of the target outer side is taken as the length of the edge black band, and the coordinate information of the edge black band is determined from this width and length. Specifically, the upper left and lower right corner pixels of the edge black band relative to the coordinate system of the target picture can be expressed in window form, defining a rectangular window.
According to the above embodiment, when determining the edge black band of a target outer side, only the black pixel connected domain of each area image on that side needs to be determined, taking the area image as the unit, and the edge black band is obtained by combining the minimum width of these black pixel connected domains with the length of the target outer side. The computational load is extremely small, the computational efficiency extremely high, and full-image traversal of the target picture for identifying the edge black band is avoided. The method is therefore particularly suitable for implementation in an embedded chip, ensures that the edge black band can still be determined quickly in a control chip with limited computing power, and is fundamental to the efficient operation of the whole atmosphere lamp device.
On the basis of any embodiment of the method of the present application, determining, as a color-taking image corresponding to each region image belonging to the same target outer side, image content located outside the corresponding edge black band, including:
step S5310, for each region image of any outer side of the target, determining whether the pixel belongs to an edge black band according to the coordinate information of each pixel in the image content of the region image and the coordinate information of the edge black band of the outer side of the target;
Step S5320, all pixels in each area image which do not belong to the edge black band are configured as a color-taking image corresponding to the area image.
On the basis of determining the coordinate information of the edge black band on the outer side of each object, for example, the edge black band shown in the window form and shown in the foregoing, the speed of constructing the color-taking image corresponding to each area image can be improved based on the coordinate information.
Specifically, for each area image of a target outer side, the coordinate information of each pixel in the area image is compared with the coordinate information of the edge black band of that target outer side to determine whether the pixel belongs to the edge black band. For example, if a pixel of the area image is at (x2, y2) and the coordinate information of the corresponding edge black band is (x0, y0, x1, y1), then if x2 > x1 it can be determined that the pixel lies beyond the range of the edge black band, that is, it is an effective pixel of the color taking image. In this way the whole area image can be examined, all pixels of the image content of the color taking image quickly determined, and the corresponding color taking image constructed.
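The coordinate comparison can be sketched as a simple window-containment test. The function name is an assumption; here the band is a top-side strip, so pixels below it are effective, analogous to the x2 > x1 example above.

```python
# Sketch: a pixel is effective (part of the color taking image) when it
# lies outside the edge black band window (x0, y0, x1, y1).

def outside_band(px, py, band):
    x0, y0, x1, y1 = band
    return not (x0 <= px <= x1 and y0 <= py <= y1)

band = (0, 0, 320, 8)          # top-side edge black band, 8 pixels deep
print(outside_band(50, 4, band))   # False: inside the edge black band
print(outside_band(50, 20, band))  # True: effective pixel
```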
According to the above embodiment, whether the pixels in the area image belong to the range of the color capturing image can be quickly identified based on the coordinate information of the edge black band, so that all the pixels corresponding to the color capturing image can be quickly identified, the color capturing image can be quickly constructed, and the processing efficiency can be further improved.
On the basis of any embodiment of the method of the present application, referring to fig. 8, setting, according to each color taking image, the light emission color value of each light emitting unit in the unit frame of the atmosphere lamp device corresponding to that color taking image includes:
step S4100, distinguishing dark pixels and non-dark pixels in the color-taking image according to a preset dark threshold;
The controller may read a preset darkness threshold to distinguish the dark pixels from the non-dark pixels in each color taking image. For example, in a scene where pixel color values are represented in RGB, a solid black pixel has the value RGB(0, 0, 0); the darkness threshold may be set to a value adjacent to that of the solid black pixel, for example RGB(10, 10, 10), so that pixels of the color taking image below the darkness threshold are dark pixels and pixels above it are non-dark pixels, thereby implementing the distinction.
In one embodiment, the process of distinguishing individual pixels in a color-taking image may be implemented as follows, including the steps of:
firstly, traversing the luminous color values of all pixels of the color taking image, and marking the pixels with the luminous color values lower than a preset dark light threshold value as dark light pixels according to a preset proportion threshold value;
The color-taken image is bitmap data, and thus, each pixel thereof can be traversed row by row and column by column in accordance with its resolution size, and the light emission color value of each traversed pixel is compared with the darkness threshold value, thereby discriminating whether or not each pixel is a darkness pixel.
In order to avoid the situation in which, for a color taking image with a large dark-colored range, so many pixels are marked as dark pixels that the non-dark pixels finally obtained are too few to calibrate the dominant hue accurately, this embodiment adopts a preset proportion threshold to cap the total number of dark pixels, ensuring that the number of non-dark pixels does not fall below a certain level.
For example, the dark pixels may be limited to no more than a proportion threshold, for example 90%, of all pixels of the color taking image. In this case, the pixels with the lowest color values are preferentially marked as dark pixels, and while the total number of dark pixels is kept from exceeding 90%, a pixel whose color value is below the darkness threshold but higher than the color values of the other dark pixels may be recognized as a non-dark pixel. Once a pixel is determined to be a dark pixel, it is given a corresponding mark, so that the dark pixels of the color taking image are filtered out.
Then, the other pixels except the dark pixels in the color-taking image are marked as non-dark pixels: similarly, for other pixels in the color-taking image that do not belong to dark pixels, the other pixels can be marked as non-dark pixels directly.
In the above embodiment, the proportion threshold is set to ensure that the proportion of the dark light pixels of the filtered color-taking image does not exceed the predetermined proportion threshold, so as to ensure that enough non-dark light pixels are used for determining the dominant color of the color-taking image, and for some low-light image areas, the dominant color can still be accurately identified, so that the corresponding unit frames can obtain the effective luminous color values corresponding to the light effects.
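The dark-pixel marking with a proportion cap can be sketched as follows. The darkness threshold RGB(10, 10, 10) and the 90% cap are the example values above; the function name and the darkest-first ordering by channel sum are assumptions of this sketch.

```python
# Sketch: pixels below the darkness threshold are marked dark, but never
# more than the cap proportion of the image, darkest first.

def mark_dark(pixels, dark=(10, 10, 10), cap=0.9):
    limit = int(cap * len(pixels))
    # Candidates below the darkness threshold, darkest first.
    cand = sorted((p for p in pixels if all(c <= t for c, t in zip(p, dark))),
                  key=sum)
    dark_set = set(cand[:limit])
    return [p in dark_set for p in pixels]  # True where the pixel is dark

pixels = [(0, 0, 0), (5, 5, 5), (8, 8, 8), (200, 80, 80)]
print(mark_dark(pixels))  # [True, True, True, False]
```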
Step S4200, determining the light emission color value corresponding to the dominant hue of the color taking image according to the color values of its non-dark pixels;
the color value of the luminescence of the non-dark pixels stored in the color-capturing image can be used as a main basis for determining the dominant color of the color-capturing image, and thus, various embodiments can be evolved, for example:
in one embodiment, the luminance color values of all the non-dark pixels in the color-extracted image are averaged, and the average value is used as the luminance color value corresponding to the dominant color. The dominant hue determined by the corresponding color-taking image is thus the base hue that is averaged to reflect the aggregate appearance of the non-darkish pixels of the color-taking image.
In another embodiment, the light emission color values of all non-dark pixels and the light emission color values of all dark pixels of the color taking image are averaged separately, and the two averages are weighted according to preset weights to obtain the light emission color value corresponding to the dominant hue. This avoids the influence of excessive filtering of the dark regions on the dominant hue of the color taking image. When setting the weights of the two averages, the weight of the non-dark pixels can be amplified and that of the dark pixels reduced; for example, the weights of the average of the non-dark pixels and the average of the dark pixels are set to 0.9 : 0.1, so that the dominant hue of the color taking image is still determined mainly by the light emission color values of the non-dark pixels, while the contribution of the dark pixels is also taken into account, making the finally determined light emission color value of the dominant hue more balanced.
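The weighted variant can be sketched as follows, using the 0.9 : 0.1 example weights from above; the helper names are illustrative.

```python
# Sketch: average the non-dark and dark pixel color values separately,
# then blend them 0.9 : 0.1 so the dominant hue is still driven by the
# non-dark pixels.

def avg(pixels):
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def weighted_dominant(non_dark, dark, w=0.9):
    a, b = avg(non_dark), avg(dark)
    return tuple(round(w * x + (1 - w) * y) for x, y in zip(a, b))

non_dark = [(200, 100, 60), (180, 120, 80)]
dark = [(10, 10, 10)]
print(weighted_dominant(non_dark, dark))  # (172, 100, 64)
```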
Step S4300, generating, according to the light emission color value corresponding to the dominant hue, the light emission color value of each light emitting unit in the unit frame of the atmosphere lamp device corresponding to the color taking image.
When a color capturing image determines a light emitting color value corresponding to a corresponding dominant color, the light emitting color value of each lamp bead covered by the unit frame corresponding to the color capturing image can be generated accordingly, and the specific implementation manner can be flexibly set, for example:
In one embodiment, the light emission color value corresponding to the dominant hue is set as the light emission color value of each lamp bead covered by the unit frame corresponding to the color taking image. That is, the light emission color values of the lamp beads of that unit frame are set to coincide with the light emission color value of the dominant hue of the color taking image, and the lamp beads emit light with the same light emission color value. When the corresponding light effect is played in this manner, the light effect presented by the whole display frame of the atmosphere lamp corresponds to each color taking image with distinct colors.
In another embodiment, according to the light emission color value of the dominant hue adopted by another unit frame adjacent to the current unit frame corresponding to the color taking image, the light emission color value of each lamp bead covered by the current unit frame is gradually adjusted, so that a gradient of light emission color values is formed among the lamp beads of the current unit frame along the direction toward the other unit frame. That is, starting from the center of a unit frame, the lamp beads hold the light emission color value corresponding to the dominant hue; then, along the path toward the adjacent unit frame, the color value of that other unit frame is referenced, and on the basis of the dominant-hue color value, the light emission color value of each lamp bead in that direction is adjusted step by step according to a certain gradient. When the corresponding light effect is played, a color transition is thus produced from the dominant hue of one unit frame to the dominant hue of the adjacent one. Setting the light emission color values of the lamp beads covered by each unit frame in this manner gives the lamp bead light effect of each unit frame a soft, gradual transition, so that the light effect presented by the whole display frame of the atmosphere lamp is softer and finer overall.
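The gradual-change variant can be sketched as linear interpolation along the beads of the current unit frame; the function name and linear interpolation are assumptions, as the application only requires "a certain gradient".

```python
# Sketch: beads in the current unit frame interpolate from this frame's
# dominant color toward the adjacent frame's dominant color along the
# direction that faces it.

def gradient_beads(n, color, neighbor):
    out = []
    for i in range(n):
        # t runs from 0 at the frame center to 1 at the shared border.
        t = i / (n - 1) if n > 1 else 0.0
        out.append(tuple(round((1 - t) * c + t * d)
                         for c, d in zip(color, neighbor)))
    return out

print(gradient_beads(3, (200, 0, 0), (0, 0, 200)))
# [(200, 0, 0), (100, 0, 100), (0, 0, 200)]
```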
According to the above embodiment, the dark pixels and non-dark pixels of the color taking image are distinguished, the light emission color value of the dominant hue of the color taking image is determined mainly from the light emission color values of the non-dark pixels, and on this basis the light emission color value of each lamp bead in the unit frame corresponding to the color taking image is determined. The light emission color value of each lamp bead thus always reflects the dominant hue of the color taking image, realizing an effective projection of that dominant hue and ensuring that the light effect played by the atmosphere lamp effectively reproduces and extends the atmosphere of the environment captured by the camera.
On the basis of any embodiment of the method of the present application, referring to fig. 9, before the region segmentation is performed around the four sides of the target picture, the method includes:
step S3100, continuously acquiring interface images from an external terminal device according to a preset frame rate, and taking the currently acquired interface image as the target picture;
in this embodiment, the controller continuously collects interface images from the external terminal device through the image acquisition interface as environment reference images, usually at a preset frame rate, for example 30 frames per second. The control chip of the controller therefore actually processes an image stream formed by successive interface images, and generates a corresponding light effect for each frame in the stream.
Step S3200, calculating the average color value of the pixels of the target picture falling within each historical edge black band, according to the historical edge black bands predetermined for the outer sides of the target picture;
while playing the light effect, the controller can determine the edge black band once for one interface image taken as the target picture, and reuse it as the historical edge black band for the interface images that follow, so that those images need not execute all the steps related to determining the edge black band, saving system overhead.
However, the frame of the interface image may also change. This case needs to be identified quickly and with little computation, so that the edge black band used to determine the color-taking image can be adjusted in time, that is, so that it can be determined promptly whether the currently used historical edge black band is still suitable for the current and subsequent target pictures.
Accordingly, according to the historical edge black bands predetermined for the outer sides of the target picture, the average color value of the pixels of the current target picture falling within each historical edge black band can be calculated, to measure whether the region covered by the historical edge black band has changed.
Step S3300, when the average color value belongs to the color values representing a black phase, performing subsequent processing based on the historical edge black band to determine the light-emitting color values for the target picture;
when the average color value corresponding to the historical edge black band still belongs to the color values representing a black phase, the currently used historical edge black band is still applicable; even if the edge black band of the current target picture has actually widened, the effect is negligible. In this case, the steps for determining the edge black band need not be re-executed for the current target picture: only steps S5300 and S5400 need to be executed, determining the corresponding color-taking image according to the historical edge black band and generating the corresponding light-emitting color values from the color-taking image.
Step S3400, when the average color value does not belong to the color values representing a black phase, performing subsequent processing based on the target picture to determine the latest edge black band, determining the light-emitting color values for the target picture according to the latest edge black band, and substituting the latest edge black band for the historical edge black band used by subsequently acquired interface images.
When the average color value does not belong to the color values representing a black phase, the edge black band of the target picture may have changed, with some colored pixels falling within the area of the original historical edge black band. In this case, the full color-extraction process must be executed for the target picture, for example the whole flow from step S5100 to step S5400: the edge black band corresponding to the target picture is determined as the latest edge black band, the color-taking image is determined according to the latest edge black band, the corresponding light-emitting color values are generated from the color-taking image, and so on.
Similarly, to maintain the optimization of operating efficiency, the latest edge black band determined for the current target picture replaces the previously stored historical edge black band, so that it becomes the new historical edge black band serving subsequent interface images.
According to this embodiment, while continuously collecting interface images and playing light effects, the atmosphere lamp device can, without frequently recalculating the edge black band for every interface image, promptly identify edge black band changes caused by frame changes of the interface image and update the edge black band in time, so that the color-taking images adapt to the frame of the interface image and the light effects played according to the color-taking images restore the light atmosphere of the interface image with high quality.
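The reuse check of steps S3200 to S3400 can be sketched as follows. The frame layout (a list of rows of RGB tuples), the band being given as row indices, and the black threshold `BLACK_MAX` are assumptions for illustration only:

```python
BLACK_MAX = 16  # assumed threshold below which an average colour counts as black

def band_still_black(frame, band_rows):
    """Sketch: average the pixels of the current frame that fall inside
    the historical edge black band; if the mean stays near black, the
    band can be reused without recomputing it (steps S5300/S5400 only).
    Otherwise the full flow (S5100 to S5400) must run again."""
    vals = [ch for y in band_rows for px in frame[y] for ch in px]
    return sum(vals) / len(vals) <= BLACK_MAX
```

When this check returns false, colored pixels have intruded into the old band area, which is exactly the condition under which step S3400 recomputes and replaces the historical edge black band.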
On the basis of any embodiment of the method of the present application, before the region segmentation is performed around the target picture, the method includes:
step S2100, continuously acquiring interface images from an external terminal device according to a preset frame rate, and taking the currently acquired interface image as the target picture;
in this embodiment, the controller continuously collects interface images from the external terminal device through the image acquisition interface as environment reference images, usually at a preset frame rate, for example 30 frames per second. The control chip of the controller therefore actually processes an image stream formed by successive interface images, and generates a corresponding light effect for each frame in the stream.
Step S2200, when no timing arrival event is triggered, performing subsequent processing based on the historical edge black band predetermined for the outer sides of the target picture to determine the light-emitting color values for the target picture;
the controller is preset with a timer and a corresponding timing period; when the timing period elapses, a timing arrival event is triggered, which drives the regeneration of the edge black band for the target picture at that moment and its storage, by replacement, as the historical edge black band, so that within the next timing period the edge black band used for each subsequent interface image need not be recalculated.
Accordingly, when the timer has not triggered a timing arrival event, steps S5300 and S5400 may be performed on the current target picture based on the historical edge black band predetermined for the outer sides of the target picture, that is, the edge black band determined for a previous target picture, to determine its color-taking image and then generate the corresponding light-emitting color values from the color-taking image.
Step S2300, when the timing arrival event is triggered, performing subsequent processing based on the target picture to determine the latest edge black band, determining the light-emitting color values for the target picture according to the latest edge black band, and substituting the latest edge black band for the historical edge black band used by subsequently acquired interface images.
When the timing arrival event is triggered, the whole flow from step S5100 to step S5400 is executed for the target picture in response to the event, so as to determine the edge black band corresponding to the target picture as the latest edge black band, determine the color-taking image according to the latest edge black band, generate the corresponding light-emitting color values from the color-taking image, and so on.
Similarly, to maintain the optimization of operating efficiency, the latest edge black band determined for the current target picture replaces the previously stored historical edge black band, so that it becomes the new historical edge black band serving subsequent interface images.
Similarly, according to the above embodiments, while continuously collecting interface images and playing light effects, the atmosphere lamp device of the present application can, without frequently recalculating the edge black band for every interface image, update the edge black band at timed intervals and thereby adjust the color-taking images in response to edge black band changes caused by frame changes of the interface image, so that the light effects played according to the color-taking images restore the light atmosphere of the interface images with high quality.
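The timer-driven variant of steps S2100 to S2300 can be sketched as follows; the class name, the refresh period, and the `compute_band` callback standing in for the full detection flow (S5100/S5200) are hypothetical:

```python
import time

REFRESH_PERIOD = 5.0  # assumed timing period in seconds

class BandScheduler:
    """Sketch: recompute the edge black band only when the timing
    period has elapsed; otherwise reuse the stored historical band
    for every incoming frame."""
    def __init__(self, compute_band):
        self.compute_band = compute_band   # full black-band detection
        self.history = None                # historical edge black band
        self.last_refresh = float("-inf")

    def band_for(self, frame, now=None):
        now = time.monotonic() if now is None else now
        if self.history is None or now - self.last_refresh >= REFRESH_PERIOD:
            # timing arrival event: latest band replaces the history
            self.history = self.compute_band(frame)
            self.last_refresh = now
        return self.history
```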
Referring to fig. 10, another embodiment of the present application further provides a color extraction device for an atmosphere lamp device, which includes a region segmentation module 5100, a black band detection module 5200, an image determination module 5300, and a color setting module 5400. The region segmentation module 5100 is configured to perform region segmentation around the four sides of a target picture, obtaining a plurality of color-taking regions on each outer side, and to determine the region image corresponding to each color-taking region; the black band detection module 5200 is configured to detect, in each region image, the black area belonging to a target outer side of the target picture, and to determine the edge black band corresponding to that target outer side according to the black areas of the plurality of region images belonging to the same target outer side; the image determination module 5300 is configured to determine, for each region image belonging to the same target outer side, the image content located outside the corresponding edge black band as its color-taking image; the color setting module 5400 is configured to set, according to each color-taking image, the light-emitting color values of the light-emitting units in the unit frame of the atmosphere lamp device corresponding to that color-taking image.
On the basis of any embodiment of the apparatus of the present application, the region segmentation module 5100 includes: a size acquisition unit configured to acquire size information of the target picture, the size information including a lateral size and a longitudinal size; a region definition unit configured to divide the lateral size and the longitudinal size equally by a preset number, determining a lateral unit size and a longitudinal unit size to form region definition information; a segmentation execution unit configured to perform region segmentation around the four sides of the target picture according to the lateral unit size and the longitudinal unit size in the region definition information, obtaining the corresponding color-taking regions and a residual central region; and a classification scheduling unit configured to pass the image content of each color-taking region, as the corresponding region image, to subsequent processing, and to pass the image content of the central region directly, as a color-taking image, to subsequent processing.
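A minimal sketch of this segmentation, assuming a border one unit deep with the corners assigned to the top and bottom rows (a layout choice not fixed by the present application); each region is an `(x, y, w, h)` rectangle:

```python
def segment_regions(width, height, n=8):
    """Sketch: split the four sides of a width x height picture into
    colour-taking regions of one unit each; the remainder is the
    central region, taken directly as a colour-taking image."""
    uw, uh = width // n, height // n   # lateral / longitudinal unit sizes
    regions = []
    for i in range(n):                 # top and bottom rows of regions
        regions.append((i * uw, 0, uw, uh))
        regions.append((i * uw, height - uh, uw, uh))
    for j in range(1, n - 1):          # left and right columns, corners excluded
        regions.append((0, j * uh, uw, uh))
        regions.append((width - uw, j * uh, uw, uh))
    center = (uw, uh, width - 2 * uw, height - 2 * uh)
    return regions, center
```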
On the basis of any embodiment of the apparatus of the present application, the black band detection module 5200 includes: a side determination unit configured to determine two opposite outer sides of the target picture as the target outer sides, the two outer sides being the top edge and the bottom edge of the display frame of the target picture; a connectivity detection unit configured to perform pixel traversal from the outermost corner pixel of each region image on the target outer side, identifying the black-pixel connected domain in the region image, the black-pixel connected domain being formed by extending inward to an extension width with the outermost side as a baseline; and a black band fusion unit configured to take the minimum width among the black-pixel connected domains of the region images belonging to the same target outer side as the width of the edge black band of that target outer side, take the length of the target outer side as the length of the edge black band, and generate the coordinate information of the edge black band corresponding to the target picture.
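An illustrative sketch of this detection, simplified to count full black rows inward from the outermost side rather than traverse a true connected domain (the threshold `DARK` and this row-wise simplification are assumptions):

```python
DARK = 16  # assumed per-channel darkness threshold

def black_band_width(region, from_top=True):
    """Sketch: from the outermost side of a region image (a list of
    rows of RGB tuples), count how many consecutive rows are fully
    black; that run approximates the black-pixel connected domain."""
    rows = region if from_top else list(reversed(region))
    width = 0
    for row in rows:
        if all(max(px) <= DARK for px in row):
            width += 1
        else:
            break
    return width

def edge_band_width(regions, from_top=True):
    """The edge black band of a target outer side takes the minimum
    depth over all region images on that side (black band fusion)."""
    return min(black_band_width(r, from_top) for r in regions)
```

Taking the minimum across regions mirrors the black band fusion unit: a side's band is only as wide as its narrowest black run, so no colored content is ever discarded with the band.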
On the basis of any embodiment of the apparatus of the present application, the image determination module 5300 includes: a pixel detection unit configured to determine, for each region image of any target outer side, whether each pixel belongs to the edge black band according to the coordinate information of that pixel in the image content of the region image and the coordinate information of the edge black band of the target outer side; and an image construction unit configured to construct, from all pixels in each region image that do not belong to the edge black band, the color-taking image corresponding to that region image.
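A minimal sketch of this coordinate test, assuming the edge black band is given as an `(x, y, w, h)` rectangle and each pixel carries its coordinates:

```python
def color_taking_pixels(region_pixels, band_rect):
    """Sketch: keep only the pixels of a region image whose
    coordinates fall outside the edge black band rectangle; the
    survivors form the colour-taking image."""
    bx, by, bw, bh = band_rect
    return [
        (x, y, px) for (x, y, px) in region_pixels
        if not (bx <= x < bx + bw and by <= y < by + bh)
    ]
```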
On the basis of any embodiment of the apparatus of the present application, the color setting module 5400 includes: a shading unit configured to distinguish dark pixels from non-dark pixels in the color-taking image according to a preset darkness threshold; a tone determination unit configured to determine, from the color values of the non-dark pixels of the color-taking image, the light-emitting color value corresponding to the dominant tone of the color-taking image; and a color value determination unit configured to generate, according to the light-emitting color value corresponding to the dominant tone, the light-emitting color value of each light-emitting unit in the unit frame of the atmosphere lamp device corresponding to the color-taking image.
On the basis of any embodiment of the apparatus of the present application, prior to the operation of the region segmentation module 5100, the color extraction device of the atmosphere lamp device of the present application includes: a desktop acquisition module configured to continuously acquire interface images from an external terminal device according to a preset frame rate, taking the currently acquired interface image as the target picture; a history invocation module configured to calculate the average color value of the pixels of the target picture falling within each historical edge black band, according to the historical edge black bands predetermined for the outer sides of the target picture; an on-edge scheduling module configured to perform subsequent processing based on the historical edge black band to determine the light-emitting color values for the target picture when the average color value belongs to the color values representing a black phase; and a change scheduling module configured to perform subsequent processing based on the target picture to determine the latest edge black band when the average color value does not belong to the color values representing a black phase, determine the light-emitting color values for the target picture according to the latest edge black band, and substitute the latest edge black band for the historical edge black band used by subsequently acquired interface images.
On the basis of any embodiment of the apparatus of the present application, prior to the operation of the region segmentation module 5100, the color extraction device of the atmosphere lamp device of the present application includes: a desktop acquisition module configured to continuously acquire interface images from an external terminal device according to a preset frame rate, taking the currently acquired interface image as the target picture; a default scheduling module configured to perform subsequent processing based on the historical edge black band predetermined for the outer sides of the target picture to determine the light-emitting color values for the target picture when no timing arrival event is triggered; and a timely scheduling module configured to perform subsequent processing based on the target picture to determine the latest edge black band when the timing arrival event is triggered, determine the light-emitting color values for the target picture according to the latest edge black band, and substitute the latest edge black band for the historical edge black band used by subsequently acquired interface images.
On the basis of any embodiment of the present application, referring to fig. 11, another embodiment of the present application further provides a computer device, which may serve as the controller in an atmosphere lamp device; fig. 11 shows an internal structure diagram of the computer device. The computer device includes a processor, a computer-readable storage medium, a memory, and a network interface connected by a system bus. The computer-readable storage medium of the computer device stores an operating system, a database, and computer-readable instructions; the database may store a control information sequence, and the computer-readable instructions, when executed by the processor, cause the processor to implement a color extraction method for an atmosphere lamp device. The processor of the computer device provides computing and control capabilities and supports the operation of the entire computer device. The memory of the computer device may store computer-readable instructions that, when executed by the processor, cause the processor to perform the color extraction method of the atmosphere lamp device of the present application. The network interface of the computer device is used for communicating with an external terminal. Those skilled in the art will appreciate that the structure shown in fig. 11 is merely a block diagram of a portion of the structure associated with the present application and does not limit the computer device to which the present application applies; a particular computer device may include more or fewer components than shown, combine some of the components, or have a different arrangement of components.
The processor in this embodiment is configured to execute the specific functions of each module and its sub-modules in fig. 10, and the memory stores the program code and the various data required for executing these modules or sub-modules. The network interface is used for data transmission with a user terminal or a server. The memory in this embodiment stores the program code and data required for executing all the modules/sub-modules of the color extraction device of the atmosphere lamp device of the present application, and the server can call that program code and data to execute the functions of all the sub-modules.
The present application also provides a storage medium storing computer readable instructions that, when executed by one or more processors, cause the one or more processors to perform the steps of the method for color extraction of an ambient light device according to any of the embodiments of the present application.
The present application also provides a computer program product comprising computer programs/instructions which, when executed by one or more processors, implement the steps of the method for color extraction of an ambience light device according to any of the embodiments of the present application.
Those skilled in the art will appreciate that implementing all or part of the methods of the above embodiments of the present application may be accomplished by a computer program stored on a computer-readable storage medium, which, when executed, may comprise the steps of the embodiments of the methods described above. The storage medium may be a computer-readable storage medium such as a magnetic disk, an optical disk, a Read-Only Memory (ROM), or a Random Access Memory (RAM).
The foregoing describes only some embodiments of the present application. It should be noted that a person skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also fall within the protection scope of the present application.
In summary, the method and the device of the present application can adapt to the limited computing power of atmosphere lamp equipment, quickly and efficiently partitioning the target picture and accurately taking colors from it, and generating the light-emitting color values of the light-emitting units in each unit frame of the atmosphere lamp corresponding to the target picture, so that when the lamp effect corresponding to the target picture is played, the picture is finer and smoother, the colors are vivid, and the overall picture quality is significantly improved.