
CN113029142A - Navigation method, navigation device and terminal equipment - Google Patents

Navigation method, navigation device and terminal equipment

Info

Publication number
CN113029142A
CN113029142A
Authority
CN
China
Prior art keywords
target
terminal
navigation
acquiring
identification picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010453895.XA
Other languages
Chinese (zh)
Inventor
邓立群
彭阳
李宛春
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Anso Measurement & Control Instruments Co ltd
Original Assignee
Shenzhen Anso Measurement & Control Instruments Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Anso Measurement & Control Instruments Co ltd
Priority to CN202010453895.XA
Publication of CN113029142A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The application is applicable to the technical field of data processing and provides a navigation method, a navigation device, and a terminal device. The navigation method includes: acquiring a target position of a target device and an initial position of a terminal; generating a navigation path based on the target position and the initial position; acquiring, when the distance between the current position of the terminal and the target position is less than or equal to a threshold, a target identification picture containing guide information, the target identification picture being a picture of a marker located around the target device; and displaying the target identification picture on a navigation interface of the terminal. In this way, when a worker arrives in the vicinity of the target device, the worker can find the target device more quickly by following the marker and the guide information in the target identification picture.

Description

Navigation method, navigation device and terminal equipment
Technical Field
The application belongs to the technical field of data processing, and particularly relates to a navigation method, a navigation device and terminal equipment.
Background
A modern urban water supply network is a key part of the infrastructure that keeps a city running, so keeping the network operating smoothly is very important. When a device in the water supply network fails, a worker is dispatched to repair it; quickly determining the exact location of the faulty device therefore greatly shortens the repair time and restores normal operation of the network sooner. At present, workers locate faulty devices by positioning and navigation, but because the positioning accuracy is low they cannot quickly pinpoint the faulty device, and the repair takes longer.
Disclosure of Invention
In order to overcome the problems in the related art, embodiments of the present application provide a navigation method, an apparatus, and a terminal device.
The application is realized by the following technical scheme:
in a first aspect, an embodiment of the present application provides a navigation method, including:
acquiring a target position of target equipment and an initial position of a terminal;
generating a navigation path based on the target location and the initial location;
under the condition that the distance between the current position of the terminal and the target position is smaller than or equal to a threshold value, acquiring a target identification picture containing guide information; the target identification picture is a picture of an identifier positioned around the target equipment;
and displaying the target identification picture on a navigation interface of the terminal.
In a possible implementation manner of the first aspect, the obtaining a target location of a target device includes:
receiving an input identification of the target device;
determining a target service system corresponding to the identifier of the target equipment, and sending a position acquisition request for acquiring the position of the target equipment to the target service system;
and acquiring the target position of the target equipment sent by the server of the target service system.
In a possible implementation manner of the first aspect, the method further includes a step of obtaining a current location of the terminal;
the step of obtaining the current position of the terminal includes:
receiving a satellite positioning signal sent by a navigation satellite, and determining initial positioning data of the terminal based on the satellite positioning signal;
acquiring differential data of a reference station from the reference station;
and correcting the initial positioning data according to the differential data to determine the current position of the terminal.
In a possible implementation manner of the first aspect, the method further includes:
acquiring an image of each marker of at least two markers located around each device;
acquiring input guide information associated with each image, wherein the guide information is guide information from the identifier to corresponding equipment;
an identification picture corresponding to the device is generated based on the associated image and the guidance information.
In a possible implementation manner of the first aspect, the acquiring an image of each of at least two markers located around each device includes:
the method comprises the steps of collecting images of at least two markers located in a preset range of each device, wherein each marker is located in different directions of the device.
In a possible implementation manner of the first aspect, each target identification picture corresponds to one direction of the target device, and acquiring a target identification picture including guidance information when a distance between a current position of the terminal and the target position is smaller than or equal to a threshold includes:
detecting the current distance between the current position of the terminal and the target position;
determining a target direction of the current position relative to the target position when the current distance is less than or equal to the threshold;
and acquiring a target identification picture corresponding to the target direction from each target identification picture corresponding to the target equipment.
In a possible implementation manner of the first aspect, the displaying the target identification picture on a navigation interface of the terminal includes:
and determining a target area on a navigation interface of the terminal, and displaying the target identification picture on the target area.
In a second aspect, an embodiment of the present application provides a navigation device, including:
the position acquisition module is used for acquiring a target position of the target equipment and an initial position of the terminal;
a navigation path generation module for generating a navigation path based on the target position and the initial position;
the picture acquisition module is used for acquiring a target identification picture containing guide information under the condition that the distance between the current position of the terminal and the target position is smaller than or equal to a threshold value; the target identification picture is a picture of an identifier located around the target device;
and the picture display module is used for displaying the target identification picture on a navigation interface of the terminal.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the navigation method according to any one of the first aspect is implemented.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the navigation method according to any one of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to execute the navigation method according to any one of the first aspect.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Compared with the prior art, the embodiment of the application has the advantages that:
according to the method and the device, the target position of the target device and the initial position of the terminal are obtained, the navigation path is generated based on the target position and the initial position, the target identification picture containing the guiding information based on the marker around the target device is obtained under the condition that the distance between the current position and the target position is smaller than or equal to the threshold value, and the target identification picture is displayed on the navigation interface of the terminal, so that when a worker reaches the periphery of the target device, the worker can more quickly find the target device according to the marker and the guiding information in the target identification picture.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the specification.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic view of an application scenario of a navigation method according to an embodiment of the present application;
FIG. 2 is a flow chart of a navigation method according to an embodiment of the present application;
FIG. 3 is a flow chart of a navigation method according to an embodiment of the present application;
FIG. 4 is a flow chart illustrating a navigation method according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a target device and an identifier provided by an embodiment of the present application;
FIG. 6 is a flow chart illustrating a navigation method according to an embodiment of the present application;
FIG. 7 is a flow chart illustrating a navigation method according to an embodiment of the present application;
FIG. 8 is a schematic flow chart of a navigation method according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of a navigation device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a tablet computer to which the navigation method provided in the embodiment of the present application is applied.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon determining", or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted contextually to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]", or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
A modern urban water supply network is a key part of the infrastructure that keeps a city running, so keeping the network operating smoothly is very important. When a device in the water supply network fails, a worker is dispatched to repair it; quickly determining the exact location of the faulty device therefore greatly shortens the repair time and restores normal operation of the network sooner. At present, workers locate faulty devices by positioning and navigation, but because the positioning accuracy is low they cannot quickly pinpoint the faulty device, and the repair takes longer.
Based on the above problem, the navigation method in the embodiment of the present application acquires the target position of the target device and the initial position of the terminal, generates a navigation path based on the two positions, and, when the distance between the current position and the target position is less than or equal to a threshold, acquires a target identification picture that is based on a marker around the target device and contains guide information and displays it on the navigation interface of the terminal. As a result, when a worker arrives in the vicinity of the target device, the worker can find the target device more quickly by following the marker and the guide information in the target identification picture.
For example, the embodiment of the present application can be applied to the exemplary scenario shown in fig. 1. In this scenario, the terminal 10 determines its own position through the satellite positioning system 20 and the reference station 30, and acquires the position of the target device through the server 40 of the service system. The terminal then generates a navigation path based on the position of the target device and the position of the terminal. When the distance between the position of the target device and the position of the terminal is less than or equal to a threshold, the terminal acquires a target identification picture that is based on a marker around the target device and contains guide information, and displays the target identification picture on its navigation interface. In this way, when a worker arrives in the vicinity of the target device, the worker can find the target device more quickly by following the marker and the guide information in the target identification picture.
The navigation method of the present application is described in detail below with reference to fig. 1.
Fig. 2 is a schematic flow chart of a navigation method provided in an embodiment of the present application, and with reference to fig. 2, the navigation method is described in detail as follows:
in step 101, a target position of a target device and an initial position of a terminal are acquired.
In one possible implementation manner, referring to fig. 3, the acquiring the target location of the target device in step 101 may include:
in step 1011, an input identification of the target device is received.
The user can enter the identification of the target device in the terminal's application; the identification uniquely identifies the target device. For example, the identifier may be a number, a letter, or a combination of numbers and letters that corresponds to a device in the service system.
In step 1012, a target service system corresponding to the identifier of the target device is determined, and a location obtaining request for obtaining the location of the target device is sent to the target service system.
The target service system may be one of one or more service systems that have been connected to the terminal's application in advance. When there are multiple service systems, the service system corresponding to the target device can be determined from the identifier of the target device; for example, the identifier of each device may include a part that encodes the service system it belongs to.
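As a rough illustration of this kind of mapping (not something the application specifies), the Python sketch below assumes a hypothetical two-letter prefix in the device identifier and a local lookup table of service systems; the identifier format, the system names, and the URLs are all made up for the example.

```python
# Minimal sketch of mapping a device identifier to its service system.
# The two-letter prefix scheme, the system names and the URLs below are
# hypothetical; the application does not specify an identifier format.
SERVICE_SYSTEMS = {
    "WS": "https://water-supply.example.com/api",
    "HT": "https://heating.example.com/api",
}

def resolve_service_system(device_id: str) -> str:
    """Return the base URL of the service system encoded in the identifier."""
    prefix = device_id[:2]                      # e.g. "WS-000123" -> "WS"
    try:
        return SERVICE_SYSTEMS[prefix]
    except KeyError:
        raise ValueError(f"no service system registered for prefix {prefix!r}")

print(resolve_service_system("WS-000123"))      # -> https://water-supply.example.com/api
```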
For example, after the target service system corresponding to the identifier of the target device has been determined, a location acquisition request for the location of the target device may be sent to that system, for instance as an HTTP request to the server of the target service system. The server can check the validity of the request, for example whether the request carries the required authority, and send the position data of the target device to the terminal once the check passes.
In step 1013, the target location of the target device sent by the server of the target service system is obtained.
For example, after the terminal receives the location data of the target device returned by the server, the data can be decoded to obtain the target position of the target device.
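The application only says that the request may be sent over HTTP and that the server returns position data after checking the request. A minimal sketch of such an exchange, using Python's standard library with an assumed endpoint path, query parameters, and JSON response format, might look like this:

```python
import json
import urllib.parse
import urllib.request

def fetch_target_position(base_url: str, device_id: str, token: str) -> tuple[float, float]:
    """Request the position of a device from the target service system over HTTP.

    The endpoint path, the query parameters and the JSON response schema are
    assumptions made for this sketch only.
    """
    query = urllib.parse.urlencode({"device_id": device_id, "token": token})
    url = f"{base_url}/device/location?{query}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        payload = json.load(resp)               # e.g. {"lat": 22.5431, "lon": 114.0579}
    return payload["lat"], payload["lon"]

# Hypothetical usage:
# lat, lon = fetch_target_position("https://water-supply.example.com/api", "WS-000123", "token-42")
```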
In a possible implementation manner, the acquiring the initial position of the terminal in step 101 may include:
step A1, receiving a satellite positioning signal sent by a navigation satellite, and determining initial positioning data of the terminal based on the satellite positioning signal;
step A2, acquiring differential data of a reference station from the reference station;
step A3, correcting the initial positioning data according to the differential data, and determining the initial position of the terminal.
The navigation satellites continuously transmit radio signals for the navigation and positioning of users on the earth. Current global navigation satellite systems include the United States GPS, the Russian GLONASS, the European Galileo system, and the Chinese BeiDou navigation satellite system. In practice, the terminal receives the satellite positioning signals transmitted by the navigation satellites of whichever navigation satellite system it supports.
The reference station is a fixed observation station that continuously observes satellite positioning signals over long periods. Its coordinates are known; it continuously receives satellite positioning signals, compares the measured position or range data with the known values, and computes reference station differential data, which may be ranging errors, carrier phase corrections, and the like. Reference station differential information is reference station differential data carrying the transmission time.
When the terminal requests the reference station differential information, the reference station sends its differential data to the terminal so that the terminal can correct its positioning result. The mobile terminal only needs to maintain a communication connection with the reference station to exchange this information, which enables real-time, accurate positioning of the terminal.
For example, in step A1, determining the initial positioning data of the terminal based on the satellite positioning signal may specifically mean: the initial positioning data from the terminal to the navigation satellite is obtained by analyzing the satellite positioning signal. The initial positioning data may be a pseudo-range or a carrier phase measurement and contains errors caused by factors such as satellite clock error, tropospheric delay, ionospheric delay, and gravitational field effects.
In step A3, the initial positioning data is corrected, for example by differencing the pseudo-ranges or differencing the carrier phases, so as to eliminate the errors common to the terminal and the reference station (errors caused by factors such as satellite clock error, tropospheric delay, ionospheric delay, and gravitational field effects) and obtain accurate positioning data, thereby achieving accurate positioning of the terminal.
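As a toy illustration of the differential idea, assuming the reference station broadcasts per-satellite pseudo-range corrections (the application does not fix a data format, and the final position solve from the corrected ranges is omitted here):

```python
def apply_pseudorange_corrections(measured: dict[int, float],
                                  corrections: dict[int, float]) -> dict[int, float]:
    """Subtract per-satellite ranging errors broadcast by the reference station.

    `measured` maps satellite PRN -> pseudo-range observed at the terminal (metres);
    `corrections` maps PRN -> ranging error computed at the reference station.
    Only satellites visible to both receivers can be corrected.
    """
    return {prn: rho - corrections[prn]
            for prn, rho in measured.items() if prn in corrections}

# Toy numbers: the few metres of common error are removed from each pseudo-range.
raw = {5: 21_000_123.4, 12: 23_500_456.7, 19: 20_800_321.0}
dgps = {5: 3.1, 12: -2.4}
print(apply_pseudorange_corrections(raw, dgps))   # satellite 19 has no correction and is dropped
```

In a real receiver, the corrected ranges would then feed a least-squares or filter-based position solution.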
In step 102, a navigation path is generated based on the target position and the initial position.
A navigation path from the initial position to the target position is generated according to the target position, the initial position, and the current road traffic information.
For example, a plurality of navigation paths may be determined according to the target position and the initial position, the time required for each navigation path may be determined based on the current road traffic information, and the navigation path with the shortest time may be selected to be displayed on the navigation interface of the terminal for the user to view.
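A minimal sketch of that selection step, with hypothetical route candidates and travel times standing in for real routing and traffic data:

```python
from dataclasses import dataclass

@dataclass
class Route:
    name: str
    length_km: float
    travel_time_min: float   # estimated from current road traffic information

def pick_fastest(routes: list[Route]) -> Route:
    """Return the candidate navigation path with the shortest estimated travel time."""
    return min(routes, key=lambda r: r.travel_time_min)

candidates = [
    Route("riverside road", 4.2, 14.0),
    Route("ring expressway", 6.8, 11.5),
]
print(pick_fastest(candidates).name)   # -> "ring expressway", despite being longer
```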
In some embodiments, after step 102, the navigation method may further include the step of acquiring the current location of the terminal.
During navigation, the current position of the terminal can be determined at regular intervals, and navigation is performed for the user based on that position. For example, the current position can be used to detect whether the terminal has deviated from the navigation path, and to compute the remaining distance and time to the target device.
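One way such periodic re-positioning could be structured, sketched with placeholder callables instead of a real receiver and map engine (the interval, the callbacks, and the loop bound are all assumptions):

```python
import time

def monitor_position(get_position, distance_to_target, on_update,
                     interval_s: float = 5.0, cycles: int = 3) -> None:
    """Re-read the terminal position every `interval_s` seconds and report progress.

    `get_position` returns a (lat, lon) pair, `distance_to_target` maps that
    position to a remaining distance in metres, and `on_update` receives
    (position, remaining_m); `cycles` bounds the loop so the demo terminates.
    """
    for _ in range(cycles):
        position = get_position()
        on_update(position, distance_to_target(position))
        time.sleep(interval_s)

# Demo with canned values standing in for a real receiver and map engine.
monitor_position(get_position=lambda: (22.5431, 114.0579),
                 distance_to_target=lambda pos: 42.0,
                 on_update=lambda pos, d: print(f"at {pos}, {d:.0f} m to go"),
                 interval_s=0.1)
```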
Referring to fig. 4, the step of acquiring the current location of the terminal may include:
in step 1051, a satellite positioning signal transmitted by a navigation satellite is received, and initial positioning data of the terminal is determined based on the satellite positioning signal.
The navigation satellites continuously transmit radio signals for the navigation and positioning of users on the earth. Current global navigation satellite systems include the United States GPS, the Russian GLONASS, the European Galileo system, and the Chinese BeiDou navigation satellite system. In practice, the terminal receives the satellite positioning signals transmitted by the navigation satellites of whichever navigation satellite system it supports.
For example, determining the initial positioning data of the terminal based on the satellite positioning signal may specifically mean: the initial positioning data from the terminal to the navigation satellite is obtained by analyzing the satellite positioning signal. The initial positioning data may be a pseudo-range or a carrier phase measurement and contains errors caused by factors such as satellite clock error, tropospheric delay, ionospheric delay, and gravitational field effects.
In step 1052, differential data of the reference station is acquired from the reference station.
The reference station is a fixed observation station that continuously observes satellite positioning signals over long periods. Its coordinates are known; it continuously receives satellite positioning signals, compares the measured position or range data with the known values, and computes reference station differential data, which may be ranging errors, carrier phase corrections, and the like. Reference station differential information is reference station differential data carrying the transmission time.
When the terminal requests the reference station differential information, the reference station sends its differential data to the terminal so that the terminal can correct its positioning result. The mobile terminal only needs to maintain a communication connection with the reference station to exchange this information, which enables real-time, accurate positioning of the terminal.
In step 1053, the initial positioning data is corrected according to the difference data, and the current position of the terminal is determined.
The initial positioning data is corrected, for example by differencing the pseudo-ranges or differencing the carrier phases, so as to eliminate the errors common to the terminal and the reference station (errors caused by factors such as satellite clock error, tropospheric delay, ionospheric delay, and gravitational field effects) and obtain accurate positioning data, thereby achieving accurate positioning of the terminal.
In addition, determining the current position of the terminal from the corrected positioning data is well known to those skilled in the art and is not described again in this embodiment.
In step 103, in a case that a distance between the current position of the terminal and the target position is less than or equal to a threshold, a target identification picture including guidance information is acquired.
The target identification picture is a picture of an identifier (marker) located around the target device. For example, the identifier may be a highly recognizable object, structure, or the like located around the target device, where high recognizability includes, but is not limited to, factors such as a bright color, a distinctive shape, and a prominent position.
In one embodiment, the identifier may be, but is not limited to, an object, a building, or the like located in an area that satellite positioning signals cannot cover. Because of where water supply network equipment is typically installed, satellite positioning signals are often poor there, so navigation becomes inaccurate as the target location is approached and the worker cannot quickly find the faulty target device. For this reason, a target identification picture containing guide information can be prepared to guide the worker to the faulty target device quickly.
For example, each target identification picture may correspond to one direction of the target device, and at least one target identification picture may be set for each target device, each picture corresponding to one identifier, as shown in fig. 5. In fig. 5, four markers 61 to 64 corresponding to the target device 50 are illustrated as an example. Specifically, the four markers 61 to 64 may be located around the target device (for example, on a circle centered on the target device with a preset distance as its radius), each in a different direction from the target device 50: a first identifier 61 is located in a first direction of the target device 50, a second identifier 62 in a second direction, a third identifier 63 in a third direction, and a fourth identifier 64 in a fourth direction.
It should be noted that fig. 5 is only an exemplary illustration; in other embodiments, the target device may correspond to 1 identifier, 2 identifiers, 3 identifiers, 5 or more identifiers, and so on, which is not limited in this embodiment of the present application.
Referring to fig. 6, step 103 may include:
in step 1031, a current distance between the current position of the terminal and the target position is detected.
The current position of the terminal may be determined once every preset time interval, for example according to steps 1051 to 1053, and the current distance between the current position of the terminal and the target position can then be computed from the two positions.
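The application does not prescribe how the distance is computed; a common choice for two latitude/longitude points is the haversine formula, sketched below with made-up coordinates and the 10-meter threshold mentioned later as an example.

```python
import math

THRESHOLD_M = 10.0   # example value; the description mentions 10 m, 5 m, 1 m

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

current = (22.54312, 114.05790)   # made-up coordinates
target = (22.54318, 114.05795)
print(haversine_m(*current, *target) <= THRESHOLD_M)   # True -> show the picture
```

Any other distance measure of comparable accuracy would serve the same purpose.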
In step 1032, a target direction of the current position relative to the target position is determined when the current distance is less than or equal to the threshold.
The threshold may be a small value, for example 10 meters, 5 meters, or 1 meter; it may be set slightly larger than the accuracy of satellite positioning and chosen according to actual needs.
Specifically, a current distance less than or equal to the threshold indicates that the worker has reached the vicinity of the target device. Because of where water supply network equipment is typically installed, the satellite positioning signal is generally poor there, so navigation is usually no longer accurate when approaching the target location and the worker cannot quickly find the faulty device. The target identification picture is therefore used to guide the worker to the faulty target device quickly.
For the target device, there may be one or more target identification pictures, and the identifier in each target identification picture is located around the target device, so that the worker can quickly find the target device from any direction based on the target identification pictures. Therefore, the target direction of the current position of the terminal relative to the target position of the target device can be determined, and the corresponding target identification picture selected based on that direction.
In step 1033, a target identification picture corresponding to the target direction is obtained from each target identification picture corresponding to the target device.
Referring to fig. 5, for example, when the current location of the terminal is located in the first direction of the target device, a target identification picture including an identifier corresponding to the first direction may be obtained; for example, when the current location of the terminal is located between the first direction and the second direction of the target device, a target identification picture including the identifier corresponding to the first direction may be acquired, or a target identification picture including the identifier corresponding to the second direction may be acquired.
For example, each target identification picture of the target device may correspond to a direction, and the direction may be a direction of the identifier in the target identification picture relative to the target device. In this embodiment, the direction of each target identification picture relative to the target device may be preset.
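A possible way to implement this direction-to-picture mapping, assuming each stored picture is keyed by the bearing of its marker relative to the target device (the key format and the picture names are illustrative only):

```python
import math

def bearing_deg(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Initial bearing from point 1 to point 2, clockwise from north (0-360 degrees)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def pick_identification_picture(pictures: dict[float, str],
                                terminal: tuple[float, float],
                                target: tuple[float, float]) -> str:
    """Pick the picture whose stored direction best matches where the worker approaches from."""
    approach = bearing_deg(target[0], target[1], terminal[0], terminal[1])
    angular_gap = lambda d: min(abs(d - approach), 360.0 - abs(d - approach))
    return pictures[min(pictures, key=angular_gap)]

# Hypothetical pictures keyed by the direction of their marker relative to the device.
pics = {0.0: "north_marker.jpg", 90.0: "east_marker.jpg",
        180.0: "south_marker.jpg", 270.0: "west_marker.jpg"}
print(pick_identification_picture(pics, terminal=(22.5434, 114.0579),
                                  target=(22.5431, 114.0579)))   # -> north_marker.jpg
```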
In step 104, the target identification picture is displayed on a navigation interface of the terminal.
Illustratively, step 104 may specifically be: determining a target area on the navigation interface of the terminal and displaying the target identification picture in that area. The worker can then quickly find the target device based on the marker and the guide information in the target identification picture.
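The application leaves the placement of the target area open; the sketch below simply reserves a corner of the navigation interface whose size is a fraction of the screen, which is one plausible choice rather than the prescribed one.

```python
def picture_area(screen_w: int, screen_h: int,
                 frac: float = 0.3, margin: int = 16) -> tuple[int, int, int, int]:
    """Return (x, y, width, height) of a target area in the lower-right corner
    of the navigation interface, sized as a fraction of the screen."""
    w, h = int(screen_w * frac), int(screen_h * frac)
    return screen_w - w - margin, screen_h - h - margin, w, h

print(picture_area(1200, 1920))   # -> (824, 1328, 360, 576)
```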
Referring to fig. 7, in some embodiments, based on the embodiment shown in fig. 2, the navigation method may further include:
in step 106, an image of each of at least two markers located around each device is acquired.
Illustratively, the step 106 may specifically be:
the method comprises the steps of collecting images of at least two markers located in a preset range of each device, wherein each marker is located in different directions of the device.
The marker can be an object, a building and the like which are located around the equipment and have high identification degree. The identification degree includes, but is not limited to, color, shape, position, etc. the identification degree of the marker is determined according to the color, shape, position, etc.
In step 107, the input guide information associated with each image is acquired, wherein the guide information is the guide information from the identifier to the corresponding device.
In step 108, an identification picture corresponding to the device is generated based on the associated image and the guidance information.
The guide information can be input by the user and includes, but is not limited to, text guidance, an indication arrow, and the like. For example, the terminal may photograph the water supply network device in both a long-range view and a close-range view, use a picture-editing function to add guide information such as an indication arrow icon and/or text guidance pointing to the device onto the marker image, and then generate the identification picture corresponding to the device from the associated image and the guide information.
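As one possible realization of this picture-editing step (the layout of the arrow and text, and the use of the Pillow library, are assumptions, not part of the application):

```python
from PIL import Image, ImageDraw   # Pillow; any image library would work here

def make_identification_picture(marker_photo: str, guide_text: str, out_path: str) -> None:
    """Overlay text guidance and a simple indication arrow onto a photo of the marker."""
    img = Image.open(marker_photo).convert("RGB")
    draw = ImageDraw.Draw(img)
    w, h = img.size
    # Arrow from the lower edge towards the centre of the frame.
    draw.line([(w // 2, h - 20), (w // 2, h // 2)], fill=(255, 0, 0), width=8)
    draw.polygon([(w // 2 - 20, h // 2 + 30), (w // 2 + 20, h // 2 + 30), (w // 2, h // 2)],
                 fill=(255, 0, 0))
    draw.text((20, 20), guide_text, fill=(255, 255, 0))
    img.save(out_path)

# Hypothetical usage:
# make_identification_picture("red_hydrant.jpg",
#                             "Valve chamber 5 m behind this hydrant", "device_042_north.jpg")
```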
Fig. 8 is a schematic flow chart of a navigation method provided in an embodiment of the present application, and with reference to fig. 8, the navigation method is described in detail as follows:
in step 201, receiving an input identification of the target device;
in step 202, determining a target service system corresponding to the identifier of the target device, and sending a location acquisition request for acquiring the location of the target device to the target service system;
in step 203, acquiring a target location of the target device sent by a server of the target service system;
in step 204, an initial position of the terminal is acquired.
In step 205, a navigation path is generated based on the target position and the initial position.
In step 206, a satellite positioning signal transmitted by a navigation satellite is received, and initial positioning data of the terminal is determined based on the satellite positioning signal.
In step 207, differential data of the reference station is acquired from the reference station.
In step 208, the initial positioning data is corrected according to the difference data, and the current position of the terminal is determined.
In step 209, a current distance between the current location of the terminal and the target location is detected.
In step 210, a target direction of the current position relative to the target position is determined when the current distance is less than or equal to the threshold.
In step 211, a target identification picture corresponding to the target direction is obtained from each target identification picture corresponding to the target device.
In step 212, the target identification picture is displayed on a navigation interface of the terminal.
According to the navigation method, the target position of the target device and the initial position of the terminal are acquired, a navigation path is generated based on the target position and the initial position, and, when the distance between the current position and the target position is less than or equal to a threshold, a target identification picture that is based on a marker around the target device and contains guide information is acquired and displayed on the navigation interface of the terminal. As a result, when a worker arrives in the vicinity of the target device, the worker can find the target device more quickly by following the marker and the guide information in the target identification picture.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 9 shows a block diagram of a navigation device provided in an embodiment of the present application, corresponding to the navigation method described in the above embodiment, and only the relevant parts to the embodiment of the present application are shown for convenience of description.
Referring to fig. 9, the navigation apparatus in the embodiment of the present application may include a position acquisition module 301, a navigation path generation module 302, a picture acquisition module 303, and a picture display module 304.
The position acquiring module 301 is configured to acquire a target position of a target device and an initial position of a terminal;
a navigation path generating module 302, configured to generate a navigation path based on the target location and the initial location;
a picture obtaining module 303, configured to obtain a target identification picture including guidance information when the distance between the current position of the terminal and the target position is smaller than or equal to a threshold, wherein the target identification picture is a picture of an identifier located around the target device;
a picture display module 304, configured to display the target identification picture on a navigation interface of the terminal.
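The sketch below shows one way these four modules could be wired together; the method names and the injected module objects are hypothetical stand-ins that only mirror the division of responsibilities listed above.

```python
class NavigationDevice:
    """Structural sketch only: the injected modules and their method names are
    hypothetical stand-ins for the four modules 301-304 listed above."""

    def __init__(self, position_module, path_module, picture_module, display_module):
        self.position_module = position_module   # 301: target and terminal positions
        self.path_module = path_module           # 302: navigation path generation
        self.picture_module = picture_module     # 303: target identification picture
        self.display_module = display_module     # 304: drawing onto the navigation interface

    def navigate(self, device_id: str, threshold_m: float) -> None:
        target, start = self.position_module.get_positions(device_id)
        self.display_module.show_path(self.path_module.generate(start, target))
        current = self.position_module.current()
        if self.position_module.distance(current, target) <= threshold_m:
            picture = self.picture_module.for_direction(current, target)
            self.display_module.show_picture(picture)
```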
Optionally, the position obtaining module 301 may be specifically configured to:
receiving an input identification of the target device;
determining a target service system corresponding to the identifier of the target equipment, and sending a position acquisition request for acquiring the position of the target equipment to the target service system;
and acquiring the target position of the target equipment sent by the server of the target service system.
Optionally, the apparatus may further include a current location obtaining module configured to obtain a current location of the terminal; the current position obtaining module may be specifically configured to:
receiving a satellite positioning signal sent by a navigation satellite, and determining initial positioning data of the terminal based on the satellite positioning signal;
acquiring differential data of a reference station from the reference station;
and correcting the initial positioning data according to the differential data to determine the current position of the terminal.
Optionally, the apparatus may further include:
the image acquisition module is used for acquiring an image of each marker in at least two markers positioned around each device;
the guide information acquisition module is used for acquiring input guide information associated with each image, wherein the guide information is the guide information from the identifier to the corresponding equipment;
and the identification picture generation module is used for generating an identification picture corresponding to the equipment based on the associated image and the guide information.
Optionally, the image acquisition module may be specifically configured to:
the method comprises the steps of collecting images of at least two markers located in a preset range of each device, wherein each marker is located in different directions of the device.
Optionally, each target identification picture corresponds to one direction of the target device, and the picture obtaining module 303 may be specifically configured to:
detecting the current distance between the current position of the terminal and the target position;
determining a target direction of the current position relative to the target position when the current distance is less than or equal to the threshold;
and acquiring a target identification picture corresponding to the target direction from each target identification picture corresponding to the target equipment.
Optionally, the picture display module 304 may be specifically configured to: and determining a target area on a navigation interface of the terminal, and displaying the target identification picture on the target area.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application further provides a terminal device. Referring to fig. 10, the terminal device 400 may include: at least one processor 410, a memory 420, and a computer program stored in the memory 420 and executable on the at least one processor 410. When executing the computer program, the processor 410 implements the steps of any of the method embodiments described above, such as steps 101 to 104 in the embodiment shown in fig. 2. Alternatively, when executing the computer program, the processor 410 implements the functions of the modules/units in the above-described device embodiments, such as the functions of the modules 301 to 304 shown in fig. 9.
Illustratively, a computer program may be partitioned into one or more modules/units, which are stored in the memory 420 and executed by the processor 410 to accomplish the present application. The one or more modules/units may be a series of computer program segments capable of performing specific functions, which are used to describe the execution of the computer program in the terminal device 400.
Those skilled in the art will appreciate that fig. 10 is merely an example of a terminal device and is not limiting of terminal devices and may include more or fewer components than shown, or some components may be combined, or different components such as input output devices, network access devices, buses, etc.
The Processor 410 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 420 may be an internal storage unit of the terminal device, or may be an external storage device of the terminal device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. The memory 420 is used for storing the computer programs and other programs and data required by the terminal device. The memory 420 may also be used to temporarily store data that has been output or is to be output.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the buses in the figures of the present application are not limited to only one bus or one type of bus.
The navigation method provided by the embodiment of the application can be applied to terminal devices such as a computer, a wearable device, a vehicle-mounted device, a tablet computer, a notebook computer, a netbook, a Personal Digital Assistant (PDA), an Augmented Reality (AR)/Virtual Reality (VR) device, and a mobile phone, and the specific type of the terminal device is not limited at all in the embodiment of the application.
Take the case where the terminal device is a tablet computer as an example. Fig. 11 is a block diagram illustrating a partial structure of a tablet computer provided in an embodiment of the present application. Referring to fig. 11, the tablet computer includes: a communication circuit 510, a memory 520, an input unit 530, a display unit 540, an audio circuit 550, a wireless fidelity (WiFi) module 560, a processor 570, and a power supply 580. Those skilled in the art will appreciate that the structure shown in fig. 11 does not constitute a limitation of the tablet computer, which may include more or fewer components than shown, combine some components, or arrange the components differently.
The following describes each component of the tablet computer with reference to fig. 11:
the communication circuit 510 may be used for receiving and transmitting signals during information transmission and reception or during a call, and in particular, receives an image sample transmitted by the image capturing device and then processes the image sample to the processor 570; in addition, the image acquisition instruction is sent to the image acquisition device. Typically, the communication circuit includes, but is not limited to, an antenna, at least one Amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, the communication circuit 510 may also communicate with networks and other devices via wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to Global System for Mobile communication (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE)), e-mail, Short Messaging Service (SMS), and the like.
The memory 520 may be used to store software programs and modules, and the processor 570 executes the various functional applications and data processing of the tablet computer by running the software programs and modules stored in the memory 520. The memory 520 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data (such as audio data, a phonebook, etc.) created according to the use of the tablet computer, and the like. Further, the memory 520 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device.
The input unit 530 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the tablet computer. Specifically, the input unit 530 may include a touch panel 531 and other input devices 532. The touch panel 531, also called a touch screen, can collect touch operations of a user on or near it (for example, operations performed by the user on or near the touch panel 531 with any suitable object or accessory such as a finger or a stylus) and drive the corresponding connection device according to a preset program. Optionally, the touch panel 531 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 570, and can receive and execute commands sent by the processor 570. In addition, the touch panel 531 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Other input devices 532 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 540 may be used to display information input by the user or information provided to the user and the various menus of the tablet computer, and to project the avatar model of a target user transmitted from other devices. The display unit 540 may include a display panel 541 and a projection device; optionally, the display panel 541 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, the touch panel 531 may cover the display panel 541; when the touch panel 531 detects a touch operation on or near it, the operation is transmitted to the processor 570 to determine the type of the touch event, and the processor 570 then provides a corresponding visual output on the display panel 541 according to the type of the touch event. Although in fig. 11 the touch panel 531 and the display panel 541 are two independent components implementing the input and output functions of the tablet computer, in some embodiments the touch panel 531 and the display panel 541 may be integrated to implement the input and output functions of the tablet computer.
The audio circuit 550 may provide an audio interface between the user and the tablet computer. On one hand, the audio circuit 550 can convert received audio data into an electrical signal and transmit it to a loudspeaker, which converts it into a sound signal for output; on the other hand, a microphone converts a collected sound signal into an electrical signal, which is received by the audio circuit 550 and converted into audio data; the audio data is then output to the processor 570 for processing and sent, for example, to another device via the communication circuit 510, or output to the memory 520 for further processing.
WiFi is a short-distance wireless transmission technology. Through the WiFi module 560, the tablet computer can help the user receive and send e-mails, browse web pages, access streaming media, and the like, providing the user with wireless broadband internet access. Although fig. 11 shows the WiFi module 560, it is understood that it is not an essential part of the tablet computer and may be omitted as needed within a scope that does not change the essence of the invention.
The processor 570 is the control center of the tablet computer. It connects the various parts of the tablet computer through various interfaces and lines, and performs the various functions and data processing of the tablet computer by running or executing the software programs and/or modules stored in the memory 520 and calling the data stored in the memory 520, thereby monitoring the tablet computer as a whole. Optionally, the processor 570 may include one or more processing units; optionally, the processor 570 may integrate an application processor, which mainly handles the operating system, user interface, applications, and the like, and a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 570.
The smart speaker further includes a power supply 580 (e.g., a battery) for supplying power to the various components. The power supply 580 may be logically connected to the processor 570 through a power management system, so that charging, discharging, and power consumption management functions are implemented through the power management system.
In addition, although not shown, the smart speaker may further include a Bluetooth module and the like, which will not be described herein.
An embodiment of the present application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps in the embodiments of the navigation method described above.
An embodiment of the present application further provides a computer program product which, when run on a mobile terminal, enables the mobile terminal to implement the steps in the embodiments of the navigation method described above.
If the integrated unit is implemented in the form of a software functional unit and sold or used as a stand-alone product, it may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the photographing apparatus/terminal apparatus, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunications signals.
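For illustration only, the following minimal sketch shows one way such a computer program might realize the navigation flow of the method described above; the helper callables, the latitude/longitude representation, and the 50 m threshold are assumptions made for this sketch and do not reflect the actual implementation of this application:

# Minimal illustrative sketch of the navigation flow; all names are hypothetical.
import math
import time

def distance_m(pos_a, pos_b):
    """Approximate ground distance in metres between two (lat, lon) points."""
    lat1, lon1 = pos_a
    lat2, lon2 = pos_b
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    mean_lat = math.radians((lat1 + lat2) / 2)
    # Equirectangular approximation; adequate over the short distances involved here.
    return 6371000.0 * math.hypot(dlat, dlon * math.cos(mean_lat))

def navigate(target_position, get_terminal_position, generate_path,
             fetch_identification_picture, show_on_navigation_interface,
             threshold_m=50.0):
    """Show a path first; once within the threshold, show the target identification picture."""
    initial_position = get_terminal_position()
    show_on_navigation_interface(generate_path(initial_position, target_position))
    while True:
        current_position = get_terminal_position()
        if distance_m(current_position, target_position) <= threshold_m:
            picture = fetch_identification_picture(target_position, current_position)
            show_on_navigation_interface(picture)
            return
        time.sleep(1.0)  # re-check the terminal position periodically

# Tiny usage stub with hypothetical callables, just to show the call shape.
positions = iter([(22.560, 113.960), (22.5502, 113.9501)])
navigate(target_position=(22.550, 113.950),
         get_terminal_position=lambda: next(positions),
         generate_path=lambda a, b: f"path from {a} to {b}",
         fetch_identification_picture=lambda t, c: "identification picture for this approach",
         show_on_navigation_interface=print)

Path generation and picture acquisition are deliberately left as injected callables, since in the embodiments above they are obtained from a navigation path computation and from pre-generated identification pictures, respectively.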
Each of the above embodiments has its own emphasis; for parts that are not described or illustrated in detail in one embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the apparatus/network device embodiments described above are merely illustrative; the division into modules or units is only a logical division, and other divisions are possible in actual implementation: for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A navigation method, comprising:
acquiring a target position of a target device and an initial position of a terminal;
generating a navigation path based on the target position and the initial position;
in a case that a distance between a current position of the terminal and the target position is smaller than or equal to a threshold value, acquiring a target identification picture containing guide information, wherein the target identification picture is a picture of an identifier located around the target device;
and displaying the target identification picture on a navigation interface of the terminal.
2. The navigation method according to claim 1, wherein the acquiring a target position of the target device comprises:
receiving an input identifier of the target device;
determining a target service system corresponding to the identifier of the target device, and sending, to the target service system, a position acquisition request for acquiring the position of the target device;
and acquiring the target position of the target device sent by a server of the target service system.
3. The navigation method according to claim 1, further comprising a step of acquiring the current position of the terminal;
wherein the step of acquiring the current position of the terminal comprises:
receiving a satellite positioning signal sent by a navigation satellite, and determining initial positioning data of the terminal based on the satellite positioning signal;
acquiring differential data of a reference station from the reference station;
and correcting the initial positioning data according to the differential data to determine the current position of the terminal.
4. The navigation method of claim 1, wherein the method further comprises:
acquiring an image of each marker of at least two markers located around each device;
acquiring input guide information associated with each image, wherein the guide information is guide information for guiding from the marker to the corresponding device;
and generating an identification picture corresponding to the device based on the associated image and the guide information.
5. The navigation method according to claim 4, wherein the acquiring an image of each marker of at least two markers located around each device comprises:
acquiring images of at least two markers located within a preset range of each device, wherein the markers are located in different directions of the device.
6. The navigation method according to any one of claims 1 to 5, wherein each target identification picture corresponds to one direction of the target device, and the acquiring a target identification picture containing guide information in a case that the distance between the current position of the terminal and the target position is smaller than or equal to the threshold value comprises:
detecting the current distance between the current position of the terminal and the target position;
determining a target direction of the current position relative to the target position when the current distance is less than or equal to the threshold;
and acquiring a target identification picture corresponding to the target direction from the target identification pictures corresponding to the target device.
7. The navigation method according to any one of claims 1 to 5, wherein the displaying the target identification picture on the navigation interface of the terminal comprises:
determining a target area on the navigation interface of the terminal, and displaying the target identification picture in the target area.
8. A navigation device, comprising:
a position acquisition module for acquiring a target position of a target device and an initial position of a terminal;
a navigation path generation module for generating a navigation path based on the target position and the initial position;
a picture acquisition module for acquiring a target identification picture containing guide information in a case that a distance between a current position of the terminal and the target position is smaller than or equal to a threshold value, wherein the target identification picture is a picture of an identifier located around the target device;
and a picture display module for displaying the target identification picture on a navigation interface of the terminal.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
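For illustration only (not part of the claims), the following sketch shows, under assumed data structures, the differential correction referred to in claim 3 and the direction-based picture selection referred to in claim 6; all names, the latitude/longitude representation, and the four-direction quantisation are assumptions made for this sketch:

# Illustrative only; data structures and names are assumptions, not the claimed implementation.
import math

def correct_position(initial_fix, differential_data):
    """Claim-3-style correction: apply reference-station offsets to a satellite fix."""
    lat, lon = initial_fix
    d_lat, d_lon = differential_data  # correction offsets derived from the reference station
    return lat + d_lat, lon + d_lon

def direction_of(current_position, target_position):
    """Quantise the bearing of the terminal, as seen from the target device, into N/E/S/W."""
    d_lat = current_position[0] - target_position[0]
    d_lon = current_position[1] - target_position[1]
    # Simple planar approximation of the bearing: 0 = north, 90 = east.
    angle = math.degrees(math.atan2(d_lon, d_lat)) % 360
    return ("north", "east", "south", "west")[int(((angle + 45) % 360) // 90)]

def pick_identification_picture(pictures_by_direction, current_position, target_position):
    """Claim-6-style selection: one identification picture per direction of the target device."""
    return pictures_by_direction[direction_of(current_position, target_position)]

# Example with hypothetical picture names keyed by the direction of approach.
pictures = {"north": "north.jpg", "east": "east.jpg", "south": "south.jpg", "west": "west.jpg"}
fix = correct_position((22.5500, 113.9500), (0.00002, -0.00003))  # differential correction
print(pick_identification_picture(pictures, fix, (22.5400, 113.9400)))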
CN202010453895.XA 2020-05-26 2020-05-26 Navigation method, navigation device and terminal equipment Pending CN113029142A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010453895.XA CN113029142A (en) 2020-05-26 2020-05-26 Navigation method, navigation device and terminal equipment

Publications (1)

Publication Number Publication Date
CN113029142A 2021-06-25

Family

ID=76458607

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010453895.XA Pending CN113029142A (en) 2020-05-26 2020-05-26 Navigation method, navigation device and terminal equipment

Country Status (1)

Country Link
CN (1) CN113029142A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060004512A1 (en) * 2004-06-30 2006-01-05 Herbst James M Method of operating a navigation system using images
CN104897165A (en) * 2014-03-06 2015-09-09 苏州工业园区新国大研究院 Shot scenery-based navigation method and system thereof
US20160146614A1 (en) * 2014-11-25 2016-05-26 Wal-Mart Stores, Inc. Computer vision navigation
CN108592939A (en) * 2018-07-11 2018-09-28 维沃移动通信有限公司 A kind of air navigation aid and terminal
CN109029488A (en) * 2018-06-29 2018-12-18 百度在线网络技术(北京)有限公司 Navigating electronic map generating method, equipment and storage medium
CN109543800A (en) * 2018-12-30 2019-03-29 湖北知本信息科技有限公司 Constant mark object automatic identification controlling terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 518000 3203, block D, building 1, Section 1, Chuangzhi Yuncheng, Liuxian Avenue, Xili community, Xili street, Nanshan District, Shenzhen, Guangdong Province

Applicant after: Shenzhen Tuoan Trust Internet of Things Co.,Ltd.

Address before: 518000 Tangtou community, Shiyan street, Bao'an District, Shenzhen City, Guangdong Province

Applicant before: SHENZHEN ANSO MEASUREMENT & CONTROL INSTRUMENTS CO.,LTD.
