
CN107426142B - Image returning method, image storage method, node of Internet of things and server - Google Patents

Image returning method, image storage method, node of Internet of things and server

Info

Publication number: CN107426142B
Application number: CN201611237757.8A
Authority: CN (China)
Prior art keywords: frame image, image, current frame, similarity, node
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN107426142A
Inventors: 易丙洪, 林耀奎, 陈镇杰
Original and current assignee: Airag Technology Ltd
Application filed by Airag Technology Ltd
Priority to CN201611237757.8A
Publication of CN107426142A
Application granted
Publication of CN107426142B

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04L67/01: Protocols
    • H04L67/06: Protocols specially adapted for file transfer, e.g. file transfer protocol [FTP]
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00: Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/60: Network streaming of media packets
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/14: Systems for two-way working

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an image returning method for a node of the Internet of things. The image returning method comprises the following steps: returning the previous frame image to a server of the Internet of things; acquiring a current frame image and the acquisition time of the current frame image; calculating the similarity between the current frame image and the previous frame image; judging whether the similarity is greater than or equal to a first preset threshold value; and, when the similarity is greater than or equal to the first preset threshold value, sending a first notification instruction to notify the server to take the previous frame image as the image acquired at the acquisition time of the current frame image. The invention also discloses an image storage method for a server of the Internet of things, a node of the Internet of things, and a server of the Internet of things. With the image returning method, the image storage method, the node, and the server, the current frame image is not returned when its similarity to the previous frame image is greater than or equal to the first preset threshold value. The amount of returned data can therefore be greatly reduced, and the occupation of system resources is reduced.

Description

Image returning method, image storage method, node of Internet of things and server
Technical Field
The invention relates to Internet of things technology, and in particular to an image returning method, an image storage method, a node of the Internet of things, and a server.
Background
The Internet of things is widely applied and can even be used to collect and return images. However, images contain a large amount of data, and returning them occupies system resources and increases the system load.
Disclosure of Invention
The present invention is directed to solving at least one of the problems of the prior art. To this end, the invention provides an image returning method, an image storage method, a node of the Internet of things, and a server.
The image returning method of the nodes of the Internet of things comprises the following steps:
returning the previous frame of image to a server of the Internet of things;
acquiring a current frame image and the acquisition time of the current frame image;
calculating the similarity between the current frame image and the previous frame image;
judging whether the similarity is greater than or equal to a first preset threshold value or not; and
when the similarity is greater than or equal to the first preset threshold value, sending a first notification instruction to notify the server to take the previous frame image as the image acquired at the acquisition time of the current frame image.
In some embodiments, the image returning method comprises, before the step of calculating the similarity between the current frame image and the previous frame image, the following steps:
judging whether the acquisition time of the current frame image is within a preset time period;
the step of calculating the similarity between the current frame image and the previous frame image comprises:
calculating the similarity between the current frame image and the previous frame image when the acquisition time of the current frame image is within the predetermined time period.
In some embodiments, the image returning method comprises, before the step of calculating the similarity between the current frame image and the previous frame image, the following steps:
acquiring meteorological conditions at the acquisition time of the current frame image;
judging whether the meteorological conditions meet preset conditions or not;
the step of calculating the similarity between the current frame image and the previous frame image comprises:
calculating the similarity between the current frame image and the previous frame image when the weather condition satisfies the predetermined condition.
In some embodiments, the image returning method further comprises the following steps:
acquiring a plurality of consecutive frame images at a predetermined period and correspondingly recording the acquisition times of the consecutive frame images, wherein the consecutive frame images include the previous frame image and the current frame image.
In some embodiments, the image returning method further comprises the following steps:
processing the current frame image to obtain a difference part of the current frame image and the previous frame image when the similarity is smaller than the first preset threshold;
returning the difference portion to the server; and
sending a second notification instruction to notify the server to generate a new image from the difference portion and the previous frame image, to serve as the image acquired at the acquisition time of the current frame image.
In some embodiments, the image returning method further comprises the steps of:
calculating the similarity between the current frame image and a preset frame image before the current frame image;
judging whether the similarity is smaller than a second preset threshold value or not;
processing the current frame image when the similarity is smaller than the second preset threshold value to obtain a difference part between the current frame image and a preset frame image before the current frame image;
returning the difference portion to the server; and
sending a third notification instruction to notify the server to generate a new image from the difference portion and the preset frame image before the current frame image, to serve as the image acquired at the acquisition time of the current frame image.
The image storage method of the server of the Internet of things comprises the following steps:
receiving a previous frame image returned by a node of the Internet of things; and
after receiving a first notification instruction, using the previous frame image as the image acquired at the acquisition time of the current frame image, wherein the first notification instruction is sent by the node when it calculates the similarity between the current frame image and the previous frame image and judges that the similarity is greater than or equal to a first preset threshold value.
In some embodiments, the first notification instruction is sent by the node when it is determined that the acquisition time of the current frame image is within a predetermined time period and the similarity is greater than or equal to the first predetermined threshold.
In some embodiments, the first notification instruction is sent by the node when it is determined that the weather condition at the time of acquisition of the current frame image satisfies a predetermined condition and the similarity is greater than or equal to the first predetermined threshold.
In some embodiments, the image saving method comprises the steps of:
receiving a difference part between the current frame image and the previous frame image returned by the node, wherein the difference part is obtained by the node processing the current frame image when the similarity is smaller than the first preset threshold; and
after receiving a second notification instruction, generating a new image according to the difference part and the previous frame image to serve as the image acquired at the acquisition time of the current frame image, wherein the second notification instruction is sent by the node when the difference part is transmitted back to the server.
In some embodiments, the image saving method comprises the steps of:
receiving a difference part between the current frame image returned by the node and a preset frame image before the current frame image, wherein the difference part is obtained by processing the current frame image when the node calculates the similarity between the current frame image and the preset frame image before the current frame image and judges that the similarity is smaller than a second preset threshold value; and
after a third notification instruction is received, generating a new image according to the difference part and a preset frame image before the current frame image to be used as the image acquired at the acquisition time of the current frame image, wherein the third notification instruction is sent by the node after the difference part is transmitted back to the server.
The node of the internet of things in the embodiment of the invention comprises:
the first returning module is used for returning the previous frame of image to a server of the Internet of things;
the first acquisition module is used for acquiring a current frame image and the acquisition time of the current frame image;
a first calculating module, configured to calculate a similarity between the current frame image and the previous frame image;
the first judging module is used for judging whether the similarity is greater than or equal to a first preset threshold value or not; and
a first notification module, configured to send a first notification instruction to notify the server that the previous frame of image is taken as an image acquired at the acquisition time of the current frame of image when the similarity is greater than or equal to the first predetermined threshold.
In some embodiments, the node further comprises:
the second judging module is used for judging whether the acquisition time of the current frame image is within a preset time period;
the first calculating module is further configured to calculate a similarity between the current frame image and the previous frame image when the second determining module determines that the acquisition time of the current frame image is within the predetermined time period.
In some embodiments, the node further comprises:
a second obtaining module, configured to obtain a weather condition at a time of acquiring the current frame image; and
the third judgment module is used for judging whether the meteorological conditions meet the preset conditions or not;
the first calculating module is further configured to calculate a similarity between the current frame image and the previous frame image when the third determining module determines that the weather condition satisfies the predetermined condition.
In some embodiments, the node further comprises:
the acquisition module is used for acquiring continuous multi-frame images in a preset period and correspondingly recording the time for acquiring the continuous multi-frame images, and the continuous multi-frame images comprise the previous frame image and the current frame image.
In some embodiments, the node further comprises:
a first processing module, configured to process the current frame image to obtain a difference portion between the current frame image and the previous frame image when the similarity is smaller than the first predetermined threshold;
a second returning module, configured to return the difference portion to the server; and
a second notification module, configured to send a second notification instruction to notify the server to generate a new image according to the difference portion and the previous frame image, where the new image is used as an image acquired at the acquisition time of the current frame image.
In some embodiments, the node further comprises:
a second calculating module, configured to calculate a similarity between the current frame image and a predetermined frame image before the current frame image;
the fourth judging module is used for judging whether the similarity is smaller than a second preset threshold value or not;
a second processing module, configured to process the current frame image when the similarity is smaller than the second predetermined threshold to obtain a difference portion between the current frame image and a predetermined frame image before the current frame image;
a third returning module, configured to return the difference portion to the server; and
a third notification module, configured to send a third notification instruction to notify the server to generate a new image according to the difference portion and a predetermined frame image before the current frame image, where the new image is used as an image acquired at the acquisition time of the current frame image.
The server of the internet of things in the embodiment of the invention comprises:
the first receiving module is used for receiving a previous frame of image returned by a node of the Internet of things; and
the first storage module is used for taking the previous frame image as an image acquired at the acquisition time of the current frame image after receiving a first notification instruction, wherein the first notification instruction is sent by the node when the similarity between the current frame image and the previous frame image is calculated and the similarity is judged to be greater than or equal to a first preset threshold value.
In some embodiments, the first notification instruction is sent by the node when it is determined that the acquisition time of the current frame image is within a predetermined time period and the similarity is greater than or equal to the first predetermined threshold.
In some embodiments, the first notification instruction is sent by the node when it is determined that the weather condition at the time of acquisition of the current frame image satisfies a predetermined condition and the similarity is greater than or equal to the first predetermined threshold.
In some embodiments, the server further comprises:
a second receiving module, configured to receive a difference portion between the current frame image and the previous frame image returned by the node, where the difference portion is obtained by processing the current frame image by the node when the similarity is smaller than the first predetermined threshold; and
the second storage module is used for generating a new image according to the difference part and the previous frame image after receiving a second notification instruction, wherein the new image is used as the image acquired at the acquisition time of the current frame image, and the second notification instruction is sent by the node after the difference part is transmitted back to the server.
In some embodiments, the server further comprises:
a third receiving module, configured to receive a difference portion between the current frame image returned by the node and a predetermined frame image before the current frame image, where the difference portion is obtained by processing the current frame image when the node calculates a similarity between the current frame image and the predetermined frame image before the current frame image and determines that the similarity is smaller than a second predetermined threshold; and
the third storage module is used for generating a new image according to the difference part and a preset frame image before the current frame image after receiving a third notification instruction, wherein the new image is used as the image acquired at the acquisition time of the current frame image, and the third notification instruction is sent by the node after the difference part is transmitted back to the server.
By judging the similarity between the current frame image and the previous frame image, the image returning method, the image storage method, the node, and the server of the embodiments of the invention omit returning the current frame image when the similarity is greater than or equal to the first preset threshold value, and instead directly notify the server to take the previous frame image as the image acquired at the acquisition time of the current frame image. Therefore, the amount of returned data can be greatly reduced, the occupied system resources are reduced, and the system burden is reduced.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow chart of an image backhaul method for a node of an internet of things according to an embodiment of the present invention.
Fig. 2 is a schematic flow chart of an image saving method for a server of the internet of things according to an embodiment of the present invention.
Fig. 3 is a functional module schematic diagram of the internet of things according to the embodiment of the invention.
Fig. 4 is a schematic diagram of the internet of things in accordance with certain embodiments of the invention.
Fig. 5 is a flowchart illustrating an image backhaul method for a node of the internet of things according to some embodiments of the present invention.
Fig. 6 is a functional block diagram of the internet of things in accordance with some embodiments of the present invention.
Fig. 7 is a flowchart illustrating an image backhaul method for a node of the internet of things according to some embodiments of the present invention.
Fig. 8 is a functional block diagram of the internet of things in accordance with certain embodiments of the present invention.
Fig. 9 is a flowchart illustrating an image backhaul method for a node of the internet of things according to some embodiments of the present invention.
Fig. 10 is a functional block diagram of the internet of things in accordance with certain embodiments of the present invention.
Fig. 11 is a flowchart illustrating an image backhaul method for a node of the internet of things according to some embodiments of the present invention.
Fig. 12 is a flowchart illustrating an image saving method of a server of the internet of things according to some embodiments of the present invention.
Fig. 13 is a functional block diagram of the internet of things in accordance with certain embodiments of the present invention.
Fig. 14 is a schematic view of the operating state of the internet of things in accordance with certain embodiments of the invention.
Fig. 15 is a flowchart illustrating an image backhaul method for a node of the internet of things according to some embodiments of the present invention.
Fig. 16 is a flowchart illustrating an image saving method of a server of the internet of things according to some embodiments of the present invention.
Fig. 17 is a functional block diagram of the internet of things in accordance with certain embodiments of the present invention.
Description of the main elements and symbols:
the node 10, the sensor assembly 101, the imaging device 1011, the control device 102, the wireless transceiver 103, the first returning module 11, the first obtaining module 12, the first calculating module 13, the first judging module 14, the first notifying module 15, the second judging module 16, the second obtaining module 17, the third judging module 18, the acquisition module 19, the first processing module 20, the second returning module 21, the second notifying module 22, the second calculating module 23, the fourth judging module 24, the second processing module 25, the third returning module 26 and the third notifying module 27;
the server 30, the first receiving module 31, the first storage module 32, the second receiving module 33, the second storage module 34, the third receiving module 35, and the third storage module 36;
the internet of things 100.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are exemplary only for the purpose of illustrating the embodiments of the present invention and are not to be construed as limiting the embodiments of the present invention.
Referring to fig. 1, an image returning method for a node of the internet of things according to an embodiment of the present invention includes the following steps:
S11: returning the previous frame of image to a server of the Internet of things;
S12: acquiring a current frame image and the acquisition time of the current frame image;
S13: calculating the similarity between the current frame image and the previous frame image;
S14: judging whether the similarity is greater than or equal to a first preset threshold value or not; and
S15: when the similarity is greater than or equal to the first preset threshold value, sending a first notification instruction to notify the server to take the previous frame image as the image acquired at the acquisition time of the current frame image.
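The core of S11 to S15 can be illustrated with a minimal node-side sketch. This is not the patented implementation: the similarity(), send_notification() and send_image() helpers are hypothetical placeholders for the similarity calculation and the node's transmission interface, and the threshold value is an assumed example.

```python
# Hedged sketch of the node-side decision in S11-S15 (illustrative only).
FIRST_PREDETERMINED_THRESHOLD = 0.95  # assumed example value

def handle_current_frame(previous_image, current_image, acquisition_time,
                         similarity, send_notification, send_image):
    """similarity, send_notification and send_image are placeholder callables."""
    score = similarity(current_image, previous_image)           # S13
    if score >= FIRST_PREDETERMINED_THRESHOLD:                   # S14
        # S15: omit returning the frame; the server reuses the previous frame
        # as the image acquired at this acquisition time.
        send_notification("first", acquisition_time)
        return previous_image   # the previous frame remains the reference
    # Otherwise return the full frame (later embodiments return only a difference portion).
    send_image(current_image, acquisition_time)
    return current_image
```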
Referring to fig. 2, an image saving method for a server of the internet of things according to an embodiment of the present invention includes the following steps:
S31: receiving a previous frame image returned by a node of the Internet of things; and
S32: after receiving a first notification instruction, using the previous frame image as the image acquired at the acquisition time of the current frame image, wherein the first notification instruction is sent by the node when the similarity between the current frame image and the previous frame image is calculated and the similarity is judged to be greater than or equal to a first preset threshold value.
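A matching server-side sketch of S31 and S32 might look as follows; the dictionary keyed by acquisition time is an assumed storage layout used only for illustration, not the server's actual storage scheme.

```python
# Hedged sketch of the server-side behaviour in S31-S32 (illustrative only).
stored_images = {}      # acquisition time -> image (assumed storage layout)
last_full_frame = None  # most recently returned full frame

def on_frame_returned(image, acquisition_time):
    """S31: store a full frame returned by the node."""
    global last_full_frame
    last_full_frame = image
    stored_images[acquisition_time] = image

def on_first_notification(acquisition_time):
    """S32: reuse the previous frame as the image for this acquisition time
    (a real server would store a copy, as in the A1' example further below)."""
    stored_images[acquisition_time] = last_full_frame
```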
Referring to fig. 3, the internet of things 100 according to the embodiment of the present invention includes a node 10 and a server 30.
The node 10 of the embodiment of the present invention includes a first returning module 11, a first obtaining module 12, a first calculating module 13, a first determining module 14, and a first notifying module 15. The image returning method according to the embodiment of the present invention may be implemented by the node 10 according to the embodiment of the present invention. For example, S11 may be implemented by the first returning module 11, S12 may be implemented by the first obtaining module 12, S13 may be implemented by the first calculating module 13, S14 may be implemented by the first judging module 14, and S15 may be implemented by the first notifying module 15.
That is, the first returning module 11 may be used to return the previous frame of image to the server 30. The first obtaining module 12 may be configured to obtain the current frame image and the acquisition time of the current frame image. The first calculating module 13 may be configured to calculate a similarity between the current frame image and the previous frame image. The first determining module 14 may be configured to determine whether the similarity is greater than or equal to a first predetermined threshold. The first notification module 15 may be configured to send a first notification instruction to notify the server 30 that the previous frame image is taken as the image acquired at the acquisition time of the current frame image when the similarity is greater than or equal to the first predetermined threshold.
The server 30 of the embodiment of the present invention includes a first receiving module 31 and a first storing module 32. The image saving method according to the embodiment of the present invention can be realized by the server 30 according to the embodiment of the present invention. For example, S31 may be implemented by the first receiving module 31, and S32 may be implemented by the first storing module 32.
That is, the first receiving module 31 may be configured to receive a previous frame image returned by the node 10. The first storage module 32 may be configured to use, after receiving a first notification instruction, the previous frame image as an image acquired at the acquisition time of the current frame image, where the first notification instruction is sent by the node 10 when calculating the similarity between the current frame image and the previous frame image and judging that the similarity is greater than or equal to a first predetermined threshold.
It can be understood that the image returning method may be cooperated with the image saving method, and implemented by the node 10 and the server 30, respectively, to implement the image returning function of the internet of things 100.
In this way, the image returning method, the image saving method, the node 10, the server 30 and the internet of things 100 according to the embodiments of the present invention determine the similarity between the current frame image and the previous frame image, and omit returning the current frame image when the similarity is greater than or equal to the first predetermined threshold, and directly notify the server 30 that the previous frame image is the image acquired at the acquisition time of the current frame image. Therefore, the amount of returned data can be greatly reduced, the occupied system resources are reduced, and the system burden is reduced.
Referring to fig. 4, the image returning method, the image saving method, the node 10 and the server 30 according to the embodiment of the present invention may be applied to an agricultural internet of things, that is, the internet of things 100 may be an agricultural internet of things.
It will be appreciated that the agricultural internet of things may include a plurality of nodes 10, with the nodes 10 being distributed as required throughout the farm to form a monitoring network. The nodes 10 can collect relevant information in the farm in real time and then transmit the relevant information back to the server 30, so that personnel can find problems in time, and the effects of increasing yield, improving quality, adjusting growth cycle and improving economic benefit are achieved.
Specifically, the node 10 may include a sensor assembly 101 disposed in the farmland, such as a wind direction and speed sensor, a rain gauge, a temperature and humidity sensor, a soil pH sensor, and the like. Thus, the node 10 can also be used for collecting wind direction, wind speed, rainfall, temperature, humidity, soil acidity and alkalinity, and the like.
In addition, the node 10 may comprise a control device 102 and a wireless transceiver 103. The control device 102 is in communication with the sensor assembly 101 and the wireless transceiver 103, and is used for controlling the wireless transceiver 103 to transmit and receive data collected by the sensor assembly 101, and also to receive and execute instructions from the server 30 received by the wireless transceiver 103.
The control device 102 may be a chip or a circuit board provided with a chip, and the wireless transceiver 103 may be an antenna.
In some embodiments, the node 10 may further comprise a housing (not shown); the sensor assembly 101 and the wireless transceiver 103 may be at least partially mounted within the housing, and the control device 102 may be disposed within the housing.
Referring to fig. 5, in some embodiments, the image returning method includes, before S13:
S16: judging whether the acquisition time of the current frame image is within a preset time period;
S13 includes:
and calculating the similarity between the current frame image and the previous frame image when the acquisition time of the current frame image is within a preset time period.
Referring to fig. 6, in some embodiments, the node 10 further includes a second determining module 16. S16 may be implemented by the second decision module 16.
That is, the second determining module 16 may be configured to determine whether the acquisition time of the current frame image is within a predetermined time period. The first calculating module 13 may be further configured to calculate a similarity between the current frame image and the previous frame image when the second determining module 16 determines that the acquisition time of the current frame image is within the predetermined time period.
In some embodiments, the first notification instruction is sent by the node 10 when it is determined that the acquisition time of the current frame image is within the predetermined time period and the similarity is greater than or equal to the first predetermined threshold.
Specifically, the predetermined time period may be the period from sunset to sunrise the next day, for example from 6 p.m. on one day to 6 a.m. on the next. It can be understood that the light is poor during this period, so the similarity between the collected images is high, and if the node 10 uploaded every nearly identical frame to the server 30, the traffic cost of data transmission would be wasted. Therefore, when it determines that the acquisition time of the current frame image is within the predetermined time period, the node 10 may perform the step of calculating the similarity between the current frame image and the previous frame image, omit returning the current frame image when the similarity is greater than or equal to the first predetermined threshold, and send the first notification instruction to notify the server 30 to take the previous frame image as the image acquired at the acquisition time of the current frame image.
Of course, the setting of the predetermined time period may also be adjusted according to actual conditions. For example, different predetermined time periods may be set for different regions, different seasons, and so on.
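As an illustration of the time-period gate in S16, the check below treats the predetermined time period as a clock-time window that may wrap around midnight; the 18:00 to 06:00 bounds are the example values from above, not values fixed by the patent.

```python
from datetime import datetime, time

def in_predetermined_period(acquisition_time: datetime,
                            start: time = time(18, 0),
                            end: time = time(6, 0)) -> bool:
    """Return True if the acquisition clock time falls inside [start, end),
    handling windows that wrap past midnight (e.g. 18:00 -> 06:00)."""
    t = acquisition_time.time()
    if start <= end:                  # window contained in a single day
        return start <= t < end
    return t >= start or t < end      # window wraps around midnight
```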
Referring to fig. 7, in some embodiments, the image returning method includes, before S13:
S17: acquiring meteorological conditions at the acquisition time of a current frame image;
S18: judging whether the meteorological conditions meet preset conditions or not;
S13 includes:
and calculating the similarity between the current frame image and the previous frame image when the weather condition meets a preset condition.
Referring to fig. 8, in some embodiments, the node 10 further includes a second obtaining module 17 and a third determining module 18. S17 may be implemented by the second obtaining module 17, and S18 may be implemented by the third determining module 18.
That is, the second obtaining module 17 may be used to obtain the weather condition at the time of acquisition of the current frame image. The third determination module 18 is used for determining whether the weather condition satisfies the predetermined condition. The first calculating module 13 can be further configured to calculate the similarity between the current frame image and the previous frame image when the third determining module 18 determines that the weather condition satisfies the predetermined condition.
In some embodiments, the first notification instruction is sent by the node 10 when it is determined that the weather condition at the time of acquisition of the current frame image satisfies the predetermined condition and the similarity is greater than or equal to the first predetermined threshold.
Specifically, the predetermined condition may be a weather condition that tends to affect visibility, such as a sandstorm, a snowstorm, haze, or a rainstorm. Taking a tea garden as an example, when the weather information acquired by the node 10 indicates that the tea garden is covered by fog that day, the weather condition satisfies the predetermined condition. It can be understood that under such meteorological conditions the similarity between the collected images is high, and if the node 10 uploaded every nearly identical frame to the server 30, the traffic cost of data transmission would be wasted. Thus, the node 10 may perform the step of calculating the similarity between the current frame image and the previous frame image when it determines that the weather condition at the time of acquiring the current frame image satisfies the predetermined condition, omit returning the current frame image when the similarity is greater than or equal to the first predetermined threshold, and send the first notification instruction to notify the server 30 to take the previous frame image as the image acquired at the acquisition time of the current frame image.
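The weather gate of S17 and S18 can be sketched in the same spirit; get_weather() is a hypothetical lookup (for example from the node's own sensors or a forecast service), and the set of qualifying conditions is an assumption drawn from the examples above.

```python
# Assumed set of visibility-reducing conditions, taken from the examples above.
PREDETERMINED_CONDITIONS = {"fog", "haze", "sandstorm", "snowstorm", "rainstorm"}

def weather_satisfies_condition(get_weather, acquisition_time):
    """get_weather is a placeholder callable returning a condition string such as 'fog'."""
    condition = get_weather(acquisition_time)          # S17
    return condition in PREDETERMINED_CONDITIONS       # S18
```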
Referring to fig. 9, in some embodiments, the image returning method further includes the following steps:
S19: acquiring a plurality of consecutive frame images at a predetermined period and correspondingly recording the times at which the consecutive frame images are acquired, wherein the consecutive frame images include a previous frame image and a current frame image.
Referring to fig. 10, in some embodiments, the node 10 includes an acquisition module 19. S19 may be implemented by acquisition module 19.
That is, the acquisition module 19 may be configured to acquire a plurality of consecutive images at a predetermined period and correspondingly record the time for acquiring the plurality of consecutive images, where the plurality of consecutive images includes a previous image and a current image.
It is understood that in practice, the sensor assembly 101 may include an imaging device 1011, the imaging device 1011 being configured to capture images of the agricultural field at predetermined periods. The image of the field may include an image of the weather of the field, an image of the soil, an image of the crop, and the like.
Acquiring images at the predetermined period prevents the imaging device 1011 from operating without interruption, thereby reducing the power consumption of the node 10, which is important because the node 10 is generally powered by a battery or a solar cell. The value of the predetermined period may be set differently for different objects or time periods. For example, if some crops in a field grow slowly at night, the predetermined period of the imaging device 1011 arranged in the farm to capture images of the growth state of the crops may be set larger at night, so that unnecessary energy consumption is reduced.
In certain embodiments, the predetermined period may be 10-20 minutes.
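A periodic capture loop with a longer night-time period, consistent with the energy-saving idea above, could be sketched as follows; capture_frame() and handle_frame() are hypothetical placeholders for the imaging device 1011 and the S12 to S15 processing, and the two period values are assumptions.

```python
import time as systime
from datetime import datetime

DAY_PERIOD_S = 15 * 60     # assumed: 15-minute period by day
NIGHT_PERIOD_S = 60 * 60   # assumed: 60-minute period by night, to save energy

def capture_loop(capture_frame, handle_frame):
    """capture_frame and handle_frame are placeholder callables."""
    while True:
        now = datetime.now()
        handle_frame(capture_frame(), now)         # S19: acquire a frame and record its time
        night = now.hour >= 18 or now.hour < 6     # same assumed window as the time gate
        systime.sleep(NIGHT_PERIOD_S if night else DAY_PERIOD_S)
```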
For example, when the predetermined period is 15 minutes, the acquisition module 19 acquires and outputs image A1 at 10:00 and acquires and outputs image A2 at 10:15. Image A1 is thus the previous frame image; the node 10 returns image A1 to the server 30 and then calculates the similarity between image A1 and image A2. When the similarity is greater than or equal to the first predetermined threshold, a first notification instruction is sent to notify the server 30 to take image A1 as the image acquired at 10:15.
In S32, the server 30 may, upon receiving the first notification instruction, copy the previous frame image A1 into an image A1' and then save A1' as the image acquired at the acquisition time of the current frame image.
In this way, images at various time points can be stored in the server 30, so that dynamic changes of the internet of things 100 can be monitored. For example, from the change in the image of the crop in the farm at each point in time, the growth tendency of the crop can be understood.
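The patent does not specify how the similarity between image A1 and image A2 is computed. A minimal sketch based on the mean absolute pixel difference of two equally sized grayscale frames, using NumPy, could look like this; histogram or structural-similarity measures would serve equally well, since the method only requires a score that can be compared against the first predetermined threshold.

```python
import numpy as np

def frame_similarity(img_a, img_b):
    """Similarity in [0, 1]: 1.0 for identical frames, lower as pixels diverge.
    img_a and img_b are grayscale arrays of equal shape with values in 0..255."""
    a = img_a.astype(np.float32)
    b = img_b.astype(np.float32)
    mean_abs_diff = float(np.mean(np.abs(a - b)))
    return 1.0 - mean_abs_diff / 255.0
```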
Referring to fig. 11, in some embodiments, the image returning method further includes the following steps:
S20: processing the current frame image when the similarity is smaller than the first predetermined threshold to obtain a difference portion between the current frame image and the previous frame image;
S21: returning the difference portion to the server; and
S22: sending a second notification instruction to notify the server to generate a new image from the difference portion and the previous frame image, to serve as the image acquired at the acquisition time of the current frame image.
Referring to fig. 12, in some embodiments, the image saving method further includes the following steps:
S33: receiving a difference portion between the current frame image and the previous frame image returned by the node, wherein the difference portion is obtained by the node processing the current frame image when the similarity is smaller than the first predetermined threshold;
S34: after receiving a second notification instruction, generating a new image from the difference portion and the previous frame image to serve as the image acquired at the acquisition time of the current frame image, wherein the second notification instruction is sent by the node when the difference portion is returned to the server.
Referring to fig. 13, in some embodiments, the node 10 further includes a first processing module 20, a second returning module 21, and a second notification module 22. S20 may be implemented by the first processing module 20, S21 may be implemented by the second returning module 21, and S22 may be implemented by the second notification module 22.
That is, the first processing module 20 may be configured to process the current frame image to obtain a difference portion between the current frame image and the previous frame image when the similarity is smaller than the first predetermined threshold. The second returning module 21 may be used to return the difference portion to the server 30. The second notification module 22 may be configured to send a second notification instruction to notify the server 30 to generate a new image from the difference portion and the previous frame image as the image acquired at the acquisition time of the current frame image.
In some embodiments, the server 30 further comprises a second receiving module 33 and a second storing module 34. S33 may be implemented by the second receiving module 33, and S34 may be implemented by the second storing module 34.
That is, the second receiving module 33 may be configured to receive a difference portion between the current frame image and the previous frame image returned by the node 10. The difference is derived in part by the node 10 processing the current frame image when the similarity is less than a first predetermined threshold. The second storage module 34 may be configured to generate a new image as an image acquired at the acquisition time of the current frame image according to the difference portion and the previous frame image after receiving the second notification instruction. The second notification instruction is sent by the node 10 when the difference portion is returned to the server 30.
Thus, when the similarity between the current frame image and the previous frame image is low, that is, the difference is large, the node 10 only needs to return the difference part between the current frame image and the previous frame image, so that the same part in the image is prevented from being repeatedly uploaded, and the traffic cost of data transmission is further saved.
For example, referring to fig. 14, the acquisition module 19 acquires and outputs image A1 at 10:00 and acquires and outputs image A2 at 10:15. Image A1 is thus the previous frame image; the node 10 returns image A1 to the server 30 and then calculates the similarity between image A1 and image A2. When the similarity is less than the first predetermined threshold, the node 10 processes the current frame image A2 to obtain a difference portion A2' between the current frame image A2 and the previous frame image A1, and returns the difference portion A2' to the server 30. The server 30 copies the uploaded image A1 into an image A1', merges the difference portion A2' into the image A1' to obtain an image A2'', and saves the image A2'' as the image acquired at 10:15.
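The patent likewise leaves open how the difference portion A2' is extracted and merged. One hedged sketch is a per-pixel mask of the regions where the frames differ: the node sends the mask together with the changed pixel values, and the server pastes them onto its copy A1' to obtain A2''. The pixel tolerance and the mask-plus-values encoding are assumptions.

```python
import numpy as np

def extract_difference(previous, current, pixel_tol=10):
    """Node side: boolean mask of changed pixels plus their new values (assumed encoding)."""
    mask = np.abs(current.astype(np.int16) - previous.astype(np.int16)) > pixel_tol
    return mask, current[mask]

def merge_difference(previous_copy, mask, values):
    """Server side: paste the changed pixels onto the copied previous frame (A1' -> A2'')."""
    merged = previous_copy.copy()
    merged[mask] = values
    return merged
```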
Referring to fig. 15, in some embodiments, the image returning method further includes the following steps:
S23: calculating the similarity between the current frame image and a predetermined frame image before the current frame image;
S24: judging whether the similarity is smaller than a second predetermined threshold value or not;
S25: processing the current frame image when the similarity is smaller than the second predetermined threshold value to obtain a difference portion between the current frame image and the predetermined frame image before the current frame image;
S26: returning the difference portion to the server; and
S27: sending a third notification instruction to notify the server to generate a new image from the difference portion and the predetermined frame image before the current frame image, to serve as the image acquired at the acquisition time of the current frame image.
Referring to fig. 16, in some embodiments, the image saving method further includes the following steps:
S35: receiving a difference portion between the current frame image returned by the node and a predetermined frame image before the current frame image, wherein the difference portion is obtained by the node processing the current frame image when it calculates the similarity between the current frame image and the predetermined frame image before the current frame image and judges that the similarity is less than a second predetermined threshold value; and
S36: after receiving a third notification instruction, generating a new image from the difference portion and the predetermined frame image before the current frame image to serve as the image acquired at the acquisition time of the current frame image, wherein the third notification instruction is sent by the node when the difference portion is returned to the server.
Referring to fig. 17, in some embodiments, the node 10 further includes a second calculating module 23, a fourth determining module 24, a second processing module 25, a third returning module 26, and a third notification module 27. S23 may be implemented by the second calculating module 23, S24 may be implemented by the fourth determining module 24, S25 may be implemented by the second processing module 25, S26 may be implemented by the third returning module 26, and S27 may be implemented by the third notifying module 27.
That is, the second calculating module 23 may be configured to calculate a similarity between the current frame image and a predetermined frame image before the current frame image. The fourth determining module 24 may be configured to determine whether the similarity is smaller than a second predetermined threshold. The second processing module 25 may be configured to process the current frame image when the similarity is smaller than the second predetermined threshold to obtain a difference portion between the current frame image and the predetermined frame image before the current frame image. The third returning module 26 can be used to return the difference portion to the server 30. The third notification module 27 may be configured to send a third notification instruction to notify the server 30 to generate a new image from the difference portion and the predetermined frame image before the current frame image as the image acquired at the acquisition time of the current frame image.
In certain embodiments, the server 30 further comprises a third receiving module 35 and a third storing module 36. S35 may be implemented by the third receiving module 35, and S36 may be implemented by the third storing module 36.
That is, the third receiving module 35 may be configured to receive a difference portion between the current frame image returned by the node 10 and a predetermined frame image before the current frame image. The difference portion is obtained by the node 10 processing the current frame image when calculating the similarity between the current frame image and the predetermined frame image before the current frame image and judging that the similarity is less than the second predetermined threshold. The third storage module 36 may be configured to generate a new image as an image acquired at the acquisition time of the current frame image according to the difference portion and a predetermined frame image before the current frame image after receiving the third notification instruction. The third notification instruction is sent by the node 10 when the difference portion is returned to the server 30.
When the similarity between the current frame image and the previous frame image remains high, the system keeps using an earlier image as the image acquired at the acquisition time of the current frame image, so a certain difference may accumulate between the current frame image and the initially stored frame image. For example, when the similarity between the 2nd frame image and the 1st frame image is greater than or equal to the first predetermined threshold, the 1st frame image may be taken as the image acquired at the acquisition time of the 2nd frame image. If the similarity between each current frame image and its previous frame image stays greater than or equal to the first predetermined threshold up to the 8th frame image, the images stored on the server 30 for the acquisition times of the 1st through 8th frame images are all the 1st frame image. However, the similarity between the 8th frame image and the 1st frame image may already be less than the first predetermined threshold. Therefore, a step of calculating the similarity between the current frame image and a predetermined frame image before the current frame image may be added on the basis of calculating the similarity between the current frame image and the previous frame image, and a second predetermined threshold may be set. The second predetermined threshold may be greater than or equal to the first predetermined threshold. For example, when the current frame image reaches the 20th frame and the similarity between every two adjacent frame images among the first 20 frame images is greater than or equal to the first predetermined threshold, the step of calculating the similarity between the 20th frame image and the 1st frame image may be performed, to prevent the accumulated difference between the images from growing large while no updated image is uploaded. When the similarity between the 20th frame image and the 1st frame image is less than the second predetermined threshold, the node uploads the difference portion between the 20th frame image and the 1st frame image, which is then merged with the 1st frame image to serve as the image acquired at the acquisition time of the 20th frame image; details are not repeated here.
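Following the 20-frame example above, the accumulated-drift check against the stored reference frame could be sketched as follows; the interval, the threshold value and the similarity() placeholder are assumptions, not values given by the patent.

```python
SECOND_PREDETERMINED_THRESHOLD = 0.97  # assumed; >= the first predetermined threshold
CHECK_EVERY_N_FRAMES = 20              # re-check against the reference frame every 20 frames

def needs_drift_update(frame_index, reference_image, current_image, similarity):
    """Return True when the current frame has drifted too far from the stored reference frame,
    so the node should upload the difference portion (third notification path, S23-S27)."""
    if frame_index % CHECK_EVERY_N_FRAMES != 0:
        return False
    return similarity(current_image, reference_image) < SECOND_PREDETERMINED_THRESHOLD
```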
In some embodiments, the server 30 may be communicatively coupled to a terminal. The terminal may be a mobile terminal such as a mobile phone, a tablet computer, or a wearable device, or another terminal such as a personal computer. A person can read the information on the server 30 from the terminal to grasp the situation of the farm at any time and in any place. Especially in inconvenient situations, such as at night or when weather conditions are too poor for field work, the person can use the terminal to query the required information through the server 30.
In the description of the embodiments of the present invention, it should be understood that the terms "center", "longitudinal", "lateral", "length", "width", "thickness", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", "clockwise", "counterclockwise", and the like indicate orientations or positional relationships based on those shown in the drawings, and are only for convenience of describing the embodiments of the present invention and simplifying the description, but do not indicate or imply that the device or element referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the embodiments of the present invention. Furthermore, the terms "first", "second" and "first" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, features defined as "first", "second", may explicitly or implicitly include one or more of the described features. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
In the description of the embodiments of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as being fixedly connected, detachably connected, or integrally connected; may be mechanically connected, may be electrically connected or may be in communication with each other; either directly or indirectly through intervening media, either internally or in any other relationship. Specific meanings of the above terms in the embodiments of the present invention can be understood by those of ordinary skill in the art according to specific situations.
In embodiments of the invention, unless expressly stated or limited otherwise, the first feature "on" or "under" the second feature may comprise the first and second features being in direct contact, or the first and second features being in contact, not directly, but via another feature therebetween. Also, the first feature being "on," "above" and "over" the second feature includes the first feature being directly on and obliquely above the second feature, or merely indicating that the first feature is at a higher level than the second feature. A first feature being "under," "below," and "beneath" a second feature includes the first feature being directly under and obliquely below the second feature, or simply meaning that the first feature is at a lesser elevation than the second feature.
The above disclosure provides many different embodiments or examples for implementing different configurations of embodiments of the invention. In order to simplify the disclosure of embodiments of the invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, embodiments of the invention may repeat reference numerals and/or reference letters in the various examples, which have been repeated for purposes of simplicity and clarity and do not in themselves dictate a relationship between the various embodiments and/or arrangements discussed. In addition, embodiments of the present invention provide examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or use of other materials.
In the description herein, references to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, such as an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processing module-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection having one or more wires (control method), a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of embodiments of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps of the methods of the above embodiments may be implemented by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as a stand-alone product, it may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk, or the like.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and not to be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art within the scope of the present invention.

Claims (14)

1. An image returning method for a node of the Internet of Things, characterized by comprising the following steps:
returning a previous frame image to a server of the Internet of Things;
acquiring a current frame image and the acquisition time of the current frame image;
calculating a similarity between the current frame image and the previous frame image;
determining whether the similarity is greater than or equal to a first predetermined threshold; and
when the similarity is greater than or equal to the first predetermined threshold, sending a first notification instruction to notify the server to take the previous frame image as the image acquired at the acquisition time of the current frame image;
wherein, before the step of calculating the similarity between the current frame image and the previous frame image, the image returning method comprises:
determining whether the acquisition time of the current frame image is within a predetermined time period;
and the step of calculating the similarity between the current frame image and the previous frame image comprises:
calculating the similarity between the current frame image and the previous frame image when the acquisition time of the current frame image is within the predetermined time period;
the predetermined time period is the time period from sunset to sunrise of the next day, and is set according to region and season;
wherein, before the step of calculating the similarity between the current frame image and the previous frame image, the image returning method further comprises:
acquiring a weather condition at the acquisition time of the current frame image; and
determining whether the weather condition satisfies a predetermined condition;
and the step of calculating the similarity between the current frame image and the previous frame image comprises:
calculating the similarity between the current frame image and the previous frame image when the weather condition satisfies the predetermined condition.
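By way of illustration only and not limitation, the following Python sketch shows one way the node-side decision of claim 1 could look. The frame representation (NumPy arrays), the mean-absolute-difference similarity measure, the threshold value, and the send_image / send_notification helpers are all assumptions introduced for illustration; the claim does not prescribe a particular similarity metric, gating source, or transport.

import numpy as np

FIRST_THRESHOLD = 0.95  # hypothetical value for the first predetermined threshold

def similarity(current, previous):
    # Illustrative metric: 1 minus the normalized mean absolute pixel difference.
    diff = np.abs(current.astype(np.float32) - previous.astype(np.float32))
    return 1.0 - float(diff.mean()) / 255.0

def within_predetermined_period(acq_time, sunset, next_sunrise):
    # True when the acquisition time falls between sunset and the next sunrise.
    return sunset <= acq_time <= next_sunrise

def handle_current_frame(node, current, previous, acq_time,
                         sunset, next_sunrise, weather_ok):
    # Only compute the similarity when the time-period and weather gates pass.
    if not within_predetermined_period(acq_time, sunset, next_sunrise) or not weather_ok:
        node.send_image(current, acq_time)
        return
    if similarity(current, previous) >= FIRST_THRESHOLD:
        # First notification instruction: the server reuses the previous frame
        # as the image acquired at acq_time, so the current frame is not returned.
        node.send_notification("use_previous", acq_time)
    else:
        node.send_image(current, acq_time)

Under this sketch, a still night-time scene produces only small notifications instead of full frames, which is the bandwidth saving the method aims at.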
2. The image returning method according to claim 1, wherein the image returning method further comprises the step of:
collecting a plurality of consecutive frame images at a predetermined period and correspondingly recording the times at which the consecutive frame images are collected, wherein the consecutive frame images comprise the previous frame image and the current frame image.
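A brief, purely illustrative sketch of the periodic collection in claim 2; the camera.capture() call, the fixed period, and the frame count are assumptions, since the claim fixes none of them.

import time

def collect_frames(camera, period_s, count):
    # Collect consecutive frames at a fixed period, recording each acquisition time.
    frames = []
    for _ in range(count):
        frames.append((camera.capture(), time.time()))
        time.sleep(period_s)
    return frames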
3. The image returning method according to claim 1, wherein the image returning method further comprises the steps of:
processing the current frame image to obtain a difference portion between the current frame image and the previous frame image when the similarity is smaller than the first predetermined threshold;
returning the difference portion to the server; and
sending a second notification instruction to notify the server to generate a new image from the difference portion and the previous frame image as the image acquired at the acquisition time of the current frame image.
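As an illustration only, one plausible encoding of the difference portion in claims 3 and 4 is a boolean mask of changed pixels plus their new values; the claims do not specify any particular encoding, and the pixel threshold below is an assumption.

import numpy as np

def difference_portion(current, base, pixel_threshold=10):
    # Pixels whose absolute change relative to the base frame exceeds the threshold.
    delta = np.abs(current.astype(np.int16) - base.astype(np.int16))
    mask = delta > pixel_threshold
    values = current[mask]          # only the differing pixel values are kept
    return mask, values

The node would then return (mask, values) to the server instead of the whole frame.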
4. The image returning method according to claim 1, wherein the image returning method further comprises the steps of:
calculating a similarity between the current frame image and a predetermined frame image preceding the current frame image;
determining whether the similarity is smaller than a second predetermined threshold;
processing the current frame image when the similarity is smaller than the second predetermined threshold to obtain a difference portion between the current frame image and the predetermined frame image preceding the current frame image;
returning the difference portion to the server; and
sending a third notification instruction to notify the server to generate a new image from the difference portion and the predetermined frame image preceding the current frame image as the image acquired at the acquisition time of the current frame image.
5. An image saving method for a server of the Internet of Things, characterized by comprising the following steps:
receiving a previous frame image returned by a node of the Internet of Things; and
after receiving a first notification instruction, taking the previous frame image as the image acquired at the acquisition time of a current frame image, wherein the first notification instruction is sent by the node when the node calculates a similarity between the current frame image and the previous frame image and determines that the similarity is greater than or equal to a first predetermined threshold;
wherein the first notification instruction is sent by the node when the node determines that the acquisition time of the current frame image is within a predetermined time period and that the similarity is greater than or equal to the first predetermined threshold;
the predetermined time period is the time period from sunset to sunrise of the next day, and is set according to region and season;
and the first notification instruction is sent by the node when the node determines that the weather condition at the acquisition time of the current frame image satisfies a predetermined condition and that the similarity is greater than or equal to the first predetermined threshold.
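A minimal sketch, again for illustration only, of the server-side handling in claim 5, assuming an in-memory store keyed by acquisition time; the class and attribute names are hypothetical.

class ImageStore:
    def __init__(self):
        self.images_by_time = {}   # acquisition time -> saved frame
        self.last_frame = None

    def on_frame(self, frame, acq_time):
        # Normal path: the node returned a full frame.
        self.images_by_time[acq_time] = frame
        self.last_frame = frame

    def on_first_notification(self, acq_time):
        # First notification instruction: reuse the previous frame as the
        # image acquired at the current frame's acquisition time.
        self.images_by_time[acq_time] = self.last_frame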
6. The image saving method according to claim 5, wherein the image saving method further comprises the steps of:
receiving a difference portion between the current frame image and the previous frame image returned by the node, wherein the difference portion is obtained by the node processing the current frame image when the similarity is smaller than the first predetermined threshold; and
after receiving a second notification instruction, generating a new image from the difference portion and the previous frame image as the image acquired at the acquisition time of the current frame image, wherein the second notification instruction is sent by the node when the difference portion is returned to the server.
7. The image saving method according to claim 5, wherein the image saving method further comprises the steps of:
receiving a difference portion between the current frame image returned by the node and a predetermined frame image preceding the current frame image, wherein the difference portion is obtained by the node processing the current frame image when the node calculates a similarity between the current frame image and the predetermined frame image preceding the current frame image and determines that the similarity is smaller than a second predetermined threshold; and
after receiving a third notification instruction, generating a new image from the difference portion and the predetermined frame image preceding the current frame image as the image acquired at the acquisition time of the current frame image, wherein the third notification instruction is sent by the node after the difference portion is returned to the server.
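Claims 6 and 7 have the server rebuild the skipped frame from the returned difference portion and a base frame (the previous frame, or a predetermined earlier frame). The sketch below, which pairs with the hypothetical mask-and-values encoding shown after claim 3, is one way this generation step could be realized and is not the claimed method itself.

import numpy as np

def rebuild_frame(base, mask, values):
    # Generate the new image by overlaying the changed pixels onto the base frame.
    new_frame = base.copy()
    new_frame[mask] = values
    return new_frame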
8. A node of the Internet of Things, characterized in that the node comprises:
a first returning module, configured to return a previous frame image to a server of the Internet of Things;
a first acquiring module, configured to acquire a current frame image and the acquisition time of the current frame image;
a first calculating module, configured to calculate a similarity between the current frame image and the previous frame image;
a first determining module, configured to determine whether the similarity is greater than or equal to a first predetermined threshold; and
a first notification module, configured to send a first notification instruction to notify the server to take the previous frame image as the image acquired at the acquisition time of the current frame image when the similarity is greater than or equal to the first predetermined threshold;
wherein the node further comprises:
a second determining module, configured to determine whether the acquisition time of the current frame image is within a predetermined time period;
the first calculating module is further configured to calculate the similarity between the current frame image and the previous frame image when the second determining module determines that the acquisition time of the current frame image is within the predetermined time period;
the predetermined time period is the time period from sunset to sunrise of the next day, and is set according to region and season;
the node further comprises:
a second acquiring module, configured to acquire a weather condition at the acquisition time of the current frame image; and
a third determining module, configured to determine whether the weather condition satisfies a predetermined condition;
and the first calculating module is further configured to calculate the similarity between the current frame image and the previous frame image when the third determining module determines that the weather condition satisfies the predetermined condition.
9. The node of the Internet of Things according to claim 8, further comprising:
a collection module, configured to collect a plurality of consecutive frame images at a predetermined period and correspondingly record the times at which the consecutive frame images are collected, wherein the consecutive frame images comprise the previous frame image and the current frame image.
10. The node of the Internet of Things according to claim 8, further comprising:
a first processing module, configured to process the current frame image to obtain a difference portion between the current frame image and the previous frame image when the similarity is smaller than the first predetermined threshold;
a second returning module, configured to return the difference portion to the server; and
a second notification module, configured to send a second notification instruction to notify the server to generate a new image from the difference portion and the previous frame image as the image acquired at the acquisition time of the current frame image.
11. The node of the Internet of Things according to claim 8, further comprising:
a second calculating module, configured to calculate a similarity between the current frame image and a predetermined frame image preceding the current frame image;
a fourth determining module, configured to determine whether the similarity is smaller than a second predetermined threshold;
a second processing module, configured to process the current frame image when the similarity is smaller than the second predetermined threshold to obtain a difference portion between the current frame image and the predetermined frame image preceding the current frame image;
a third returning module, configured to return the difference portion to the server; and
a third notification module, configured to send a third notification instruction to notify the server to generate a new image from the difference portion and the predetermined frame image preceding the current frame image as the image acquired at the acquisition time of the current frame image.
12. A server of the Internet of Things, characterized in that the server comprises:
a first receiving module, configured to receive a previous frame image returned by a node of the Internet of Things; and
a first storage module, configured to, after a first notification instruction is received, take the previous frame image as the image acquired at the acquisition time of a current frame image, wherein the first notification instruction is sent by the node when the node calculates a similarity between the current frame image and the previous frame image and determines that the similarity is greater than or equal to a first predetermined threshold;
wherein the first notification instruction is sent by the node when the node determines that the acquisition time of the current frame image is within a predetermined time period and that the similarity is greater than or equal to the first predetermined threshold;
the predetermined time period is the time period from sunset to sunrise of the next day, and is set according to region and season;
and the first notification instruction is sent by the node when the node determines that the weather condition at the acquisition time of the current frame image satisfies a predetermined condition and that the similarity is greater than or equal to the first predetermined threshold.
13. The server of the Internet of Things according to claim 12, wherein the server further comprises:
a second receiving module, configured to receive a difference portion between the current frame image and the previous frame image returned by the node, wherein the difference portion is obtained by the node processing the current frame image when the similarity is smaller than the first predetermined threshold; and
a second storage module, configured to, after a second notification instruction is received, generate a new image from the difference portion and the previous frame image as the image acquired at the acquisition time of the current frame image, wherein the second notification instruction is sent by the node after the difference portion is returned to the server.
14. The server of the Internet of Things according to claim 12, wherein the server further comprises:
a third receiving module, configured to receive a difference portion between the current frame image returned by the node and a predetermined frame image preceding the current frame image, wherein the difference portion is obtained by the node processing the current frame image when the node calculates a similarity between the current frame image and the predetermined frame image preceding the current frame image and determines that the similarity is smaller than a second predetermined threshold; and
a third storage module, configured to, after a third notification instruction is received, generate a new image from the difference portion and the predetermined frame image preceding the current frame image as the image acquired at the acquisition time of the current frame image, wherein the third notification instruction is sent by the node after the difference portion is returned to the server.
CN201611237757.8A 2016-12-28 2016-12-28 Image returning method, image storage method, node of Internet of things and server Active CN107426142B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611237757.8A CN107426142B (en) 2016-12-28 2016-12-28 Image returning method, image storage method, node of Internet of things and server

Publications (2)

Publication Number Publication Date
CN107426142A CN107426142A (en) 2017-12-01
CN107426142B (en) 2020-04-21

Family

ID=60422917

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611237757.8A Active CN107426142B (en) 2016-12-28 2016-12-28 Image returning method, image storage method, node of Internet of things and server

Country Status (1)

Country Link
CN (1) CN107426142B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111314395B (en) * 2018-12-11 2023-07-11 中兴通讯股份有限公司 Image transmission method, terminal and storage medium
CN110689090A (en) * 2019-10-14 2020-01-14 北京百度网讯科技有限公司 Image storage method and device
CN111654699B (en) * 2020-05-29 2024-05-17 西安万像电子科技有限公司 Image transmission method and device
CN111787381A (en) * 2020-06-24 2020-10-16 北京声迅电子股份有限公司 Uploading method and uploading device for images collected by security check machine

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101715112A (en) * 2009-12-14 2010-05-26 中兴通讯股份有限公司 Video monitoring system and method
CN103217937A (en) * 2012-05-31 2013-07-24 山东电力集团公司青岛供电公司 Power distribution station house monitoring system
CN103634556A (en) * 2012-08-27 2014-03-12 联想(北京)有限公司 Information transmission method, information receiving method and electronic apparatus
CN103916620A (en) * 2013-01-04 2014-07-09 中国移动通信集团公司 Method and device for video call and mobile terminal
CN104601918A (en) * 2014-12-29 2015-05-06 小米科技有限责任公司 Video recording method and device
CN105812710A (en) * 2016-05-05 2016-07-27 广东小天才科技有限公司 Method and system for optimizing image quality in video call process

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7130472B2 (en) * 2002-01-21 2006-10-31 Canon Kabushiki Kaisha Image distribution apparatus, communication terminal apparatus, and control method thereof

Also Published As

Publication number Publication date
CN107426142A (en) 2017-12-01

Similar Documents

Publication Publication Date Title
CN107426142B (en) Image returning method, image storage method, node of Internet of things and server
Joris et al. An autonomous sigfox wireless sensor node for environmental monitoring
US10560622B2 (en) Adaptive trail cameras
Li et al. Practical deployment of an in-field soil property wireless sensor network
KR100835987B1 (en) Integrated sensor server system for agriculture
CN205102880U (en) All-weather monitoring system of gauge and device based on image
CN106817410B (en) Image returning method, image storage method, node of Internet of things and server
CN107911612B (en) Automatic focusing method and device for camera
CN103218904A (en) Hydrology data acquisition system base on WiFi
CN109391641A (en) Temperature information method for uploading, device, system, electronic equipment and storage medium
JP2011228884A (en) Imaging device and method for controlling imaging device
CN105139479B (en) A kind of driving recording method and system based on mobile terminal
CN103763360A (en) Intelligent agricultural system based on wireless communication ad hoc network
CN103292841A (en) Multifunctional telemetry terminal with solar powering function
CN105486350A (en) Movable farmland environment information monitoring system
Kapetanovic et al. Experiences deploying an always-on farm network
CN105580356A (en) Information processing apparatus and information processing method
CN208401996U (en) A kind of video camera
US20150338867A1 (en) Sensor apparatus and associated methods
CN103292793A (en) Multifunctional telemetering terminal with hydrological telemetering
CN109348447A (en) Regional plantation object soil quality real-time radio monitoring system
CN210982784U (en) Novel spontaneous emission tipping bucket rain gauge
CN214851579U (en) Farming condition monitoring camera device
CN202003497U (en) Production risk factor data acquisition equipment for agricultural products
CN201830297U (en) 3G remote image monitoring system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Image return method, image saving method, node and server of Internet of things

Effective date of registration: 20210930

Granted publication date: 20200421

Pledgee: China Construction Bank Corp. Guangzhou Economic and Technological Development Zone Sub-branch

Pledgor: AIRAG TECHNOLOGY Ltd.

Registration number: Y2021980010340
