WO2024131760A1 - Mobility management method, apparatus, communication device and readable storage medium - Google Patents
Mobility management method, apparatus, communication device and readable storage medium
- Publication number
- WO2024131760A1 (PCT/CN2023/139736)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- measurement
- node
- perception
- sensing
- target
- Prior art date
Classifications
- H—ELECTRICITY; H04—ELECTRIC COMMUNICATION TECHNIQUE; H04W—WIRELESS COMMUNICATION NETWORKS
- H04W36/00—Hand-off or reselection arrangements; H04W36/0005—Control or signalling for completing the hand-off; H04W36/0055—Transmission or use of information for re-establishing the radio link; H04W36/0058—Transmission of hand-off measurement information, e.g. measurement reports
- H04W24/00—Supervisory, monitoring or testing arrangements; H04W24/08—Testing, supervising or monitoring using real traffic
- H04W24/00—Supervisory, monitoring or testing arrangements; H04W24/10—Scheduling measurement reports; Arrangements for measurement reports
- H04W36/00—Hand-off or reselection arrangements
- H04W36/00—Hand-off or reselection arrangements; H04W36/16—Performing reselection for specific purposes; H04W36/18—Performing reselection for specific purposes for allowing seamless reselection, e.g. soft reselection
Definitions
- the present application belongs to the field of communication technology, and specifically relates to a mobility management method, apparatus, communication equipment and readable storage medium.
- Perception capability refers to the ability of one or more devices with perception capability to sense the direction, distance, speed and other information of target objects through the transmission and reception of wireless signals, or to detect, track, identify and image target objects, events or environments.
- the embodiments of the present application provide a mobility management method, apparatus, communication device and readable storage medium to solve the problem of how to maintain the continuity of perception services.
- a mobility management method comprising:
- the first node performs sensing measurement and obtains a measurement result
- the first node sends a measurement report to the second node, where the measurement report includes the measurement result, and the measurement result is used to determine whether to perform perception switching;
- the measurement result includes at least one of the following: a measurement quantity perceived by a sensor; and a fused measurement result, wherein the fused measurement result is obtained by fusion processing of the measurement quantity perceived by the sensor and the measurement quantity obtained by synaesthesia-type perception.
- a mobility management method including:
- the second node receives a measurement report sent by the first node, where the measurement report includes a measurement result
- the second node sends first request information, where the first request information is used to request the third node to perform perception;
- the second node receives first response information sent by the third node, where the first response information is used to indicate that the third node agrees to perform sensing;
- the second node sends a switching command to the third node, where the switching command is used to instruct the third node to perform sensing as a target sensing node;
- the measurement result includes at least one of the following: a measurement quantity perceived by a sensor; and a fused measurement result, wherein the fused measurement result is obtained by fusion processing of the measurement quantity perceived by the sensor and the measurement quantity obtained by synaesthesia-type perception.
- a measurement module used for the first node to perform perception measurement and obtain a measurement result
- the first sending module is used to send a measurement report to the second node, where the measurement report includes the measurement result, and the measurement result is used to determine whether to perform perception switching; wherein the measurement result includes at least one of the following: a measurement quantity perceived by the sensor; a fused measurement result, where the fused measurement result is obtained by fusion processing of the measurement quantity perceived by the sensor and the measurement quantity obtained by synaesthesia-type perception.
- a mobility management device including:
- a third receiving module configured to receive a measurement report sent by the first node, where the measurement report includes a measurement result
- a third receiving module is used to receive first response information sent by a third node, where the first response information is used to indicate that the third node agrees to perform the sensing;
- a fourth sending module configured to send a switching command to the third node, wherein the switching command is used to instruct the third node to perform sensing as a target sensing node;
- a communication device comprising: a processor, a memory, and a program or instruction stored in the memory and executable on the processor, wherein the program or instruction, when executed by the processor, implements the steps of the method described in the first aspect or the second aspect.
- a chip comprising a processor and a communication interface, wherein the communication interface is coupled to the processor, and the processor is used to run a program or instruction to implement the steps of the method described in the first aspect or the second aspect.
- a computer program/program product is provided, wherein the computer program/program product is stored in a non-volatile storage medium, and the computer program/program product is executed by at least one processor to implement the steps of the method described in the first aspect or the second aspect.
- a communication system comprising a terminal and a network side device, the terminal being used to execute the steps of the method described in the first aspect, and the network side device being used to execute the steps of the method described in the first aspect or the second aspect.
- a first node performs a perception measurement to obtain a measurement result; the first node sends a measurement report to a second node, where the measurement report includes the measurement result and the measurement result is used to determine whether to perform perception switching; the measurement result includes at least one of the following: a measurement quantity perceived by a sensor; a fused measurement result, obtained by fusion processing of the measurement quantity perceived by the sensor and the measurement quantity obtained by synaesthesia-type perception. In this way, perception mobility management based on sensor perception is achieved, which can ensure the continuity of perception services and improve the user experience of perception services.
- FIG. 2 is a schematic diagram of a switching process;
- FIG. 3 is a schematic diagram of perception classification;
- FIG. 4 is a schematic diagram of a mobility management method provided by an embodiment of the present application;
- FIG. 5 is a schematic diagram of a mobility management method provided by another embodiment of the present application;
- FIG. 7 is a schematic diagram of a source base station and a source UE performing uplink sensing and switching to a target base station and a target UE performing uplink sensing;
- FIG. 8 is a schematic diagram of a mobility management method provided by yet another embodiment of the present application;
- FIG. 9 is a schematic diagram of one-dimensional SNR calculation;
- FIG. 10 is a structural diagram of a mobility management device provided by an embodiment of the present application;
- FIG. 11 is a structural diagram of a mobility management device provided by another embodiment of the present application;
- FIG. 12 is a schematic diagram of a terminal provided in an embodiment of the present application;
- FIG. 13 is a schematic diagram of a network-side device provided in an embodiment of the present application.
- first, second, etc. in the specification and claims of the present application are used to distinguish similar objects, and are not used to describe a specific order or sequence. It should be understood that the terms used in this way are interchangeable under appropriate circumstances, so that the embodiments of the present application can be implemented in an order other than those illustrated or described here, and the objects distinguished by “first” and “second” are generally of the same type, and the number of objects is not limited.
- the first object can be one or more.
- "and/or" in the specification and claims represents at least one of the connected objects, and the character "/" generally represents that the objects associated with each other are in an "or" relationship.
- LTE Long Term Evolution
- LTE-A Long Term Evolution-Advanced
- CDMA Code Division Multiple Access
- TDMA Time Division Multiple Access
- FDMA Frequency Division Multiple Access
- OFDMA Orthogonal Frequency Division Multiple Access
- SC-FDMA Single-carrier Frequency Division Multiple Access
- NR New Radio
- 6G 6th Generation
- Perception capability refers to the ability of one or more devices with perception capabilities to perceive the direction, distance, speed and other information of target objects through the transmission and reception of wireless signals, or to detect, track, identify and image target objects, events or environments.
- the perception resolution will be significantly improved compared to centimeter waves, enabling 6G networks to provide more sophisticated perception services.
- Typical perception functions and application scenarios are shown in Table 1.
- Table 1 Typical perception functions and application scenarios.
- Communication and perception integration means realizing the integrated design of communication and perception functions through spectrum sharing and hardware sharing in the same system. While transmitting information, the system can perceive information such as direction, distance, speed, and detect, track, and identify target devices or events.
- the communication system and the perception system complement each other to achieve overall performance improvement and bring a better service experience.
- radar-communication integration is a typical application of communication-perception integration (communication-perception fusion).
- radar systems and communication systems were strictly distinguished due to different research objects and focuses, and the two systems were studied independently in most scenarios.
- radar and communication systems are both typical means of sending, acquiring, processing and exchanging information, and there are many similarities in their working principles, system architectures, and frequency bands.
- both communication systems and perception systems are based on electromagnetic wave theory, and use the emission and reception of electromagnetic waves to acquire and transmit information;
- both communication systems and perception systems have structures such as antennas, transmitters, receivers, and signal processors, and there is a lot of overlap in hardware resources; with the development of the two systems, there is more and more overlap in their operating frequency bands; in addition, there are similarities in key technologies such as signal modulation, reception detection, and waveform design.
- the integration of communication and radar systems can bring many advantages, such as cost savings, size reduction, power consumption reduction, spectrum efficiency improvement, and mutual interference reduction, thereby improving the overall performance of the system.
- Base station echo sensing: in this sensing mode, base station A sends a first signal and performs sensing measurement by receiving an echo of the first signal.
- Air interface sensing between base stations: base station B receives the first signal sent by base station A and performs perception measurement.
- Uplink sensing: base station A receives the first signal sent by terminal A and performs perception measurement.
- Downlink sensing: terminal B receives the first signal sent by base station B and performs perception measurement.
- Terminal echo sensing: terminal A sends a first signal and performs perception measurement by receiving an echo of the first signal.
- Air interface sensing between terminals: terminal B receives the first signal sent by terminal A and performs perception measurement.
- each perception method in Figure 1 uses a first signal sending node and a first signal receiving node as examples.
- one or more different perception methods can be selected according to different perception use cases and perception requirements, and each perception method can have one or more sending nodes and receiving nodes.
- the perception targets in Figure 1 use people and cars as examples, and it is assumed that neither people nor cars carry or install signal receiving/transmitting equipment. The perception targets in actual scenarios are richer.
- the first signal in this article includes at least one of a reference signal, a synchronization signal, a data signal and a dedicated signal.
- the sensing service can be supported by receiving and/or sending the first signal.
- the sensing measurement quantity or the sensing result can be obtained by receiving and/or sending the first signal.
- the sensing result refers to a result that meets the sensing requirements, such as: the shape, two-dimensional (2D)/three-dimensional (3D) environment reconstruction, spatial position, orientation, displacement, moving speed and acceleration of the sensing target; radar-type sensing results such as speed, distance and angle measurement/imaging of the target object; whether a person/object exists; and sensing targets such as human movements, gestures, breathing rate, heart rate, sleep quality, etc.
- the first signal may be a signal that does not contain transmission information, such as existing LTE/NR synchronization and reference signals, including synchronization signal and physical broadcast channel block (Synchronization Signal and PBCH block, SSB) signals, channel state information reference signals (CSI-RS), demodulation reference signals (DMRS), sounding reference signals (SRS), positioning reference signals (PRS), phase tracking reference signals (PTRS), etc.; it may also be a single-frequency continuous wave (CW), frequency modulated continuous wave (FMCW) or ultra-wideband Gaussian pulse commonly used in radar; it may also be a newly designed dedicated signal with good correlation characteristics and a low peak-to-average power ratio, or a newly designed synaesthesia integrated signal, which not only carries certain information but also has good perception performance.
- the new signal is a splicing/combination/superposition of at least one dedicated first signal/reference signal and at least one communication signal in the time domain and/or frequency domain.
- Handover is triggered by the movement of the terminal in the connected state.
- the basic goals of handover are: to ensure that the terminal communicates with a cell whose channel quality is better than that of the current serving cell; to provide the terminal with continuous and uninterrupted communication services; and to effectively prevent call drops caused by deterioration of the serving cell's signal quality.
- the handover process in 5G includes the following steps (basically similar to Long Term Evolution (LTE)):
- Step 1 Trigger measurement: After the terminal completes access or handover successfully, the base station will send measurement control information to the terminal through the Radio Resource Control (RRC) connection reconfiguration. In addition, if the measurement configuration information is updated, the base station will also send the updated measurement control information through the RRC connection reconfiguration message.
- RRC Radio Resource Control
- the most important part of the measurement control information is the measurement object, measurement report (MR) configuration, measurement event, etc.
- Table 3 Specific criteria for determining measurement events.
- Ms represents the measurement result of the serving cell;
- Mn represents the measurement result of the neighboring cell;
- TimeToTrig indicates the duration for which the event entry condition must be continuously met, i.e., the time delay;
- Off represents the offset of the measurement result, with a step size of 0.5 dB;
- Hys represents the amplitude hysteresis of the measurement result, with a step size of 0.5 dB;
- Ocs represents the serving cell specific offset (CellIndividualOffset, CIO);
- Ocn represents the CIO of the neighboring cell;
- Thresh is the threshold value configured for the corresponding event.
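- Table 3 itself is not reproduced above. Purely as an illustration (this A3-style formulation follows the common 3GPP convention and is an assumption, not a quotation from the table), an event in which a neighboring cell becomes offset-better than the serving cell combines the parameters above as follows: entry condition Mn + Ofn + Ocn - Hys > Ms + Ofs + Ocs + Off, leaving condition Mn + Ofn + Ocn + Hys < Ms + Ofs + Ocs + Off, where Ofn and Ofs are frequency-specific offsets; the event is reported only after the entry condition has been continuously met for TimeToTrig.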
- the event value range in 5G NR is different from that in LTE.
- the reported range (Range) corresponds to the value carried in the measurement report, and the value (Value) corresponds to the actual measured value.
- the Value range in LTE is -140 to -44 dBm, while that in NR is -156 to -31 dBm.
- that is, NR allows both higher and lower reception levels to be reported.
- RSRP Reference Signal Received Power
- Step 2 Perform measurement: according to the measurement control configuration, the terminal monitors the wireless channel, and when the measurement reporting conditions are met, it reports to the base station through measurement events.
- the trigger of the measurement report quantity/event can be RSRP, Reference Signal Received Quality (RSRQ) or Signal to Interference plus Noise Ratio (SINR).
- Step 3 Target decision: based on the measurement reports, the base station selects the handover target cell (in a first-reported, first-processed manner) and the corresponding handover strategy (such as handover or redirection).
- Step 4 Handover execution: the source base station applies to the target base station for resources and the target base station allocates them; the source base station then makes the handover execution decision and sends a handover command to the terminal, and the terminal executes the handover while data forwarding is performed.
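- The event-triggered reporting in Step 2 and the target decision in Step 3 can be outlined as a minimal sketch; the A3-style condition, the parameter defaults and all names below are illustrative assumptions, not definitions taken from this application:

```python
# Minimal illustrative sketch: event-triggered measurement reporting with
# hysteresis, cell-individual offsets and a time-to-trigger timer, followed by
# a simple target-cell choice. All names and defaults are hypothetical.
from dataclasses import dataclass

@dataclass
class EventConfig:
    off_db: float = 3.0            # Off: event offset (0.5 dB steps)
    hys_db: float = 1.0            # Hys: hysteresis (0.5 dB steps)
    time_to_trigger_ms: int = 320  # TimeToTrig

@dataclass
class EventState:
    held_ms: int = 0               # how long the entry condition has been met

def entry_condition(mn_dbm: float, ms_dbm: float, ocn_db: float, ocs_db: float,
                    cfg: EventConfig) -> bool:
    """A3-style entry: Mn + Ocn - Hys > Ms + Ocs + Off (frequency offsets omitted)."""
    return mn_dbm + ocn_db - cfg.hys_db > ms_dbm + ocs_db + cfg.off_db

def should_report(state: EventState, condition_met: bool, dt_ms: int,
                  cfg: EventConfig) -> bool:
    """Advance the time-to-trigger timer; report once the condition has held long enough."""
    state.held_ms = state.held_ms + dt_ms if condition_met else 0
    return state.held_ms >= cfg.time_to_trigger_ms

def select_target(reported_neighbours: dict) -> str:
    """Step 3 sketch: pick, for example, the strongest reported neighbour cell."""
    return max(reported_neighbours, key=reported_neighbours.get)
```

- In this sketch the terminal would call should_report() once per measurement period, and the serving base station would run select_target() over the cells carried in the received measurement reports.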
- perception can be divided into non-contact perception and contact perception.
- Contact perception requires the installation of sensors such as thermometers/hygrometers/barometers/gyroscopes/accelerometers/gravity sensors on the sensing target to collect specific sensing information; non-contact perception requires light, sound or radio waves as the medium for perception transmission, so it can be divided into light perception (such as ordinary cameras that use visible light, infrared rays, etc. as media for perception, infrared cameras, lidar, etc.), sound perception (such as sonar that uses mechanical waves and ultrasonic waves for perception) and radio wave perception (such as millimeter wave radar, etc.).
- Figure 3 illustrates different types of perception.
- RF perception is radio wave perception: during the propagation of radio wave signals, the amplitude, phase and other characteristics of the signals are affected by the surrounding environment.
- RF sensing supports the separation of sensing nodes and sensing targets. Compared with non-RF sensing, it is less affected by weather and light conditions, has a relatively larger sensing range, and is more flexible. It has more advantages than cameras in terms of privacy and security.
- the typical application of RF sensing is radar, including radar used in the military field and commercial millimeter-wave radar.
- the perception in communication perception integration belongs to radio frequency perception.
- this application divides various types of perception into communication perception integration perception (hereinafter referred to as synaesthesia perception) and sensor perception, where sensor perception includes all other perception methods except synaesthesia perception. Synaesthesia perception and sensor perception complement each other.
- 6G will make comprehensive use of different types of perception, including synaesthesia perception and sensor perception, to collect various types of data, so as to realize a more convenient, efficient and accurate digital construction of the physical world.
- the sensing function network element in the present application may also be referred to as a sensing network element or a sensing network function, and may be located on the radio access network (RAN) side or the core network side, that is, it may be a network node in the core network and/or RAN that is responsible for at least one function such as sensing request processing, sensing resource scheduling, sensing information interaction, and sensing data processing.
- the functional characteristics of the sensing function network element may include at least one of the following:
- determining a wireless signal sending device and/or a wireless signal measuring device, including a target terminal, or a serving base station of the target terminal, or a base station associated with a target area;
- the target information includes a sensing processing request, sensing capability, sensing auxiliary data, a sensing measurement quantity type, sensing resource configuration information, etc., so as to obtain a value of a target sensing result or a sensing measurement quantity (uplink measurement quantity or downlink measurement quantity) sent by the wireless signal measuring device; wherein the wireless signal may also be referred to as a first signal.
- the sensing method to be used is determined based on factors such as the type of sensing service, sensing service consumer information, required sensing service quality (QoS) requirement information, the sensing capability of the wireless signal transmitting device, and the sensing capability of the wireless signal measuring device.
- the sensing method may include: base station A sends and base station B receives, or the base station sends and the terminal receives, or base station A sends and receives by itself, or the terminal sends and the base station receives, or the terminal sends and receives by itself, or terminal A sends and terminal B receives, etc.
- the perception device serving the perception service is determined based on factors such as the type of perception service, information about the perception service consumer, required perception QoS requirement information, the perception capability of the wireless signal sending device, and the perception capability of the wireless signal measuring device, wherein the perception device includes a wireless signal sending device and/or a wireless signal measuring device.
- the terminal involved in this application can be a mobile phone, a tablet computer (Tablet Personal Computer), a laptop computer (also known as a notebook computer), a personal digital assistant (PDA), a handheld computer, a netbook, an ultra-mobile personal computer (UMPC), a mobile Internet device (MID), augmented reality (AR)/virtual reality (VR) equipment, a robot, a wearable device (Wearable Device), vehicle-mounted equipment (VUE), a pedestrian terminal (Pedestrian User Equipment, PUE), a smart home device (home equipment with wireless communication functions, such as a refrigerator, TV, washing machine or furniture), a game console, a personal computer (PC), a teller machine, a self-service machine or other terminal-side equipment; wearable devices include smart watches, smart bracelets, smart headsets, smart glasses, smart jewelry (smart bangles, smart rings, smart necklaces, smart anklets, etc.), smart wristbands, smart clothing, etc. It should be noted that the specific type of terminal is not limited in the embodiments of the present application.
- the core network equipment involved in the present application may include but is not limited to at least one of the following: core network nodes, core network functions, mobility management entity (Mobility Management Entity, MME), AMF, LMF, session management function (Session Management Function, SMF), user plane function (User Plane Function, UPF), policy control function (Policy Control Function, PCF), policy and charging rules function unit (Policy and Charging Rules Function, PCRF), edge application service discovery function (Edge Application Server Discovery Function, EASDF), unified data management (Unified Data Management, UDM), Unified Data Repository (UDR), Home Subscriber Server (HSS), Centralized network configuration (CNC), Network Repository Function (NRF), Network Exposure Function (NEF), Local NEF (or L-NEF), Binding Support Function (BSF), Application Function (AF), etc.
- MME Mobility Management Entity
- AMF Access and Mobility Management Function
- LMF Location Management Function
- SMF Session Management Function
- UPF User Plane Function
- the network side equipment involved in the present application may include access network equipment or core network equipment, wherein the access network equipment may also be referred to as wireless access network equipment, wireless access network (Radio Access Network, RAN), wireless access network function or wireless access network unit.
- the access network equipment may include base stations, relay stations, wireless local area network (Wireless Local Area Network, WLAN) access points or WiFi nodes, etc.
- the base station may be referred to as a Node B (NB), an Evolved Node B (eNB), an access point, a base transceiver station (Base Transceiver Station, BTS), a radio base station, a radio transceiver, a Basic Service Set (BSS), an Extended Service Set (ESS), a Transmission Reception Point (TRP) or by other appropriate terms in the field; as long as the same technical effect is achieved, the base station is not limited to specific technical vocabulary. It should be noted that in the embodiments of the present application, only the base station in the NR system is used as an example, and the specific type of the base station is not limited.
- the first signal in the present application may also be referred to as a perception signal or a synaesthesia signal, that is, the perception service may be supported by transmitting and/or receiving the first signal, for example, a perception measurement quantity or a perception result may be obtained by transmitting and/or receiving the first signal.
- the first signal may be a signal that does not contain transmission information, such as an existing LTE/NR synchronization and reference signal, or
- the first signal may include at least one of an SSB signal, a channel state information reference signal (CSI-RS), a demodulation reference signal (DMRS), a channel sounding reference signal (SRS), a positioning reference signal (PRS), and a phase tracking reference signal (PTRS); or, the first signal may also be a single-frequency continuous wave (CW), a frequency modulated continuous wave (FMCW), and an ultra-wideband Gaussian pulse commonly used in radars; or, the first signal may also be a newly designed dedicated signal with good correlation characteristics and a low peak-to-average power ratio, or a newly designed synaesthesia integrated signal, which carries certain information and has good perception performance.
- the new signal is a splicing/combination/superposition of at least one dedicated perception signal/reference signal and at least one communication signal in the time domain and/or frequency domain.
- an embodiment of the present application provides a mobility management method, and the specific steps include: step 401 and step 402 .
- Step 401 The first node performs a sensing measurement to obtain a measurement result
- the first node receives the first signal and performs a perception measurement to obtain a measurement result, where the measurement result includes a perception measurement value, wherein the perception measurement may also be referred to as a switching measurement.
- Step 402 The first node sends a measurement report to the second node, where the measurement report includes the measurement result, and the measurement result is used to determine whether to perform perception switching; wherein the measurement result includes at least one of the following: a measurement quantity perceived by the sensor; a fused measurement result, where the fused measurement result is obtained by fusion processing of the measurement quantity perceived by the sensor and the measurement quantity obtained by synaesthesia-type perception.
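- The fusion processing itself is not restricted here. Purely as an illustrative sketch, assuming that both the synaesthesia-type (ISAC) measurement and the sensor-type measurement provide a position estimate of the same perception target together with per-axis variances, one simple choice is inverse-variance weighting (all names and values below are hypothetical):

```python
# Illustrative sketch only: fusing a synaesthesia-type (ISAC) position estimate
# with a sensor-type (e.g., lidar/camera) estimate of the same target by
# inverse-variance weighting. The measurement structure is hypothetical.
import numpy as np

def fuse_estimates(isac_pos, isac_var, sensor_pos, sensor_var):
    """Return the fused position and variance of two independent estimates.

    isac_pos, sensor_pos : array-like, e.g. [x, y, z] in metres
    isac_var, sensor_var : per-axis variances of the two estimates
    """
    isac_pos, sensor_pos = np.asarray(isac_pos, float), np.asarray(sensor_pos, float)
    isac_var, sensor_var = np.asarray(isac_var, float), np.asarray(sensor_var, float)
    w_isac = 1.0 / isac_var
    w_sensor = 1.0 / sensor_var
    fused_pos = (w_isac * isac_pos + w_sensor * sensor_pos) / (w_isac + w_sensor)
    fused_var = 1.0 / (w_isac + w_sensor)
    return fused_pos, fused_var

# Example: ISAC ranging is coarse along z, the visual sensor is coarse along x/y.
pos, var = fuse_estimates([10.2, 4.1, 1.8], [0.04, 0.04, 0.50],
                          [10.5, 4.0, 1.7], [0.25, 0.25, 0.02])
```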
- the perception switching includes switching of a perception node and/or switching of a perception mode.
- the sensor-type perception includes perception associated with at least one of the following: lidar, millimeter-wave radar, visual sensors (including monocular vision, binocular vision and infrared sensors), inertial measurement units, and various other sensors (such as rain gauges, thermometers and hygrometers).
- the measurement result further includes a synaesthesia-type perception measurement quantity;
- the synaesthesia-type perception measurement quantity includes at least one of the following:
- the first-level measurement quantity includes: at least one of a received signal/channel response complex result, amplitude/phase, I path/Q path and a calculation result thereof;
- the second-level measurement quantity includes: at least one of delay, Doppler, angle, and intensity;
- the third-level measurement quantity includes: at least one of distance, speed, direction, spatial position, and acceleration;
- the fourth-level measurement quantity includes at least one of the following: whether the target exists, trajectory, action, expression, vital sign, quantity, imaging result, weather, air quality, shape, material, and composition.
- the above-mentioned synaesthesia perception refers to a perception method of sending radio signals through a communication and perception integrated system and receiving target reflected echo signals to perform perception services.
- the above-mentioned sensor-type perception refers to: a perception method that performs perception services by means other than the integrated communication and perception system.
- Typical devices that perform sensor-type perception include: lidar, millimeter-wave radar, visual sensors (including: monocular vision, binocular vision, infrared sensors), inertial measurement units (Inertial Measurement Unit, IMU), and various other sensors (rain gauge, thermometer, hygrometer, etc.).
- the first node in this application is a device that supports sensor-type perception or a device that has the ability to perform sensor-type perception, and the first node also has communication capabilities.
- the device that supports sensor-type perception can be a terminal of a special form (for example, a sensor module that is configured with a communication module and a sensor module that can perform sensor-type perception), or it can be an application function/application server that can provide sensor-type perception information.
- an application function/application server the typical situation is as follows:
- a sensor equipped on a terminal or a base station belongs to a different hardware and processing domain from the terminal or the base station (for example, the sensor on the terminal is usually used by the application processor (AP) and the operating system or application (APP)), so from the network perspective it is an application server.
- the measurement quantities (e.g., images) obtained by the perception performed by a device such as a camera that supports sensor-type perception are fixedly reported to an application server deployed at a certain location through a wired transmission network.
- the application server can process the measurement quantities and is connected to one or more camera devices of the same type.
- the perception function network element requests the application function/application server for perception information through the Network Exposure Function (NEF).
- the perception information includes the perception measurement quantity/perception result, the time information of the perception execution, etc.
- the application function/application server registers the perception information that can be provided and the information related to the device performing the perception (for example, location, antenna orientation, etc.) with the network function.
- the method further includes:
- the first node receives a switching command, where the switching command is used to instruct the first node to perform sensing as a target sensing node;
- the first node performs sensing according to the switching command.
- the perception mode of the target perception node is the same as or different from the perception mode of the source perception node, and the perception mode includes: at least one of sensor-type perception and synaesthesia-type perception, and the synaesthesia-type perception includes at least one of downlink perception, uplink perception, terminal echo perception, base station echo perception, air interface perception between base stations, and air interface perception between terminals.
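- For readability only, the perception modes listed above can be mirrored by a small enumeration; the following sketch is hypothetical and its names are not defined by this application:

```python
# Hypothetical enumeration mirroring the perception modes listed above.
from enum import Enum, auto

class PerceptionMode(Enum):
    SENSOR_TYPE = auto()            # sensor-type perception (lidar, camera, IMU, ...)
    SYNAESTHESIA_TYPE = auto()      # integrated communication-perception (ISAC)

class SynaesthesiaSubMode(Enum):
    DOWNLINK = auto()               # base station sends, terminal receives
    UPLINK = auto()                 # terminal sends, base station receives
    TERMINAL_ECHO = auto()          # terminal sends and receives its own echo
    BASE_STATION_ECHO = auto()      # base station sends and receives its own echo
    BETWEEN_BASE_STATIONS = auto()  # air interface perception between base stations
    BETWEEN_TERMINALS = auto()      # air interface perception between terminals
```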
- the method further includes:
- the first node receives measurement configuration information sent by the second node, where the measurement configuration information is used to configure the first node to perform perception measurement.
- the measurement configuration information includes at least one of the following:
- Measurement events where the measurement events include at least one of the following: a sensed measurement quantity or a calculation result of a sensed measurement quantity or a sensed performance meets a preset condition; a state of a sensed target changes; or a position of a sensed node involved in the sensed event changes.
- the measurement quantity sensed by the sensor includes at least one of the following:
- Measurement quantities including at least one of the following: target presence, trajectory, movement, expression, vital signs, quantity, imaging results, weather, air quality, shape, material, and composition.
- the laser radar-related measurement includes at least one of the following: laser radar point cloud data, angle and/or distance of a target obtained based on the laser radar point cloud data, visual features of a target identified from the laser radar point cloud data, and the number of targets identified from the laser radar point cloud data;
- the vision-related measurement includes at least one of the following: a visual image, luminosity of image pixels, red/green/blue (RGB) values of image pixels, visual features of targets identified from the image, angles and/or distances of targets identified from the image, and the number of targets identified from the image;
- the radar-related measurements include at least one of the following: radar point cloud, distance, speed, and/or angle of identified targets, radar imaging, and number of targets;
- the measurement quantity related to the inertial measurement unit includes at least one of the following: acceleration, velocity, and angular velocity.
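- Purely as an illustration of how the sensor-perceived measurement quantities listed above could be organized inside a measurement report, a possible container is sketched below; none of the field names are defined by this application:

```python
# Illustrative sketch only: a possible container for the sensor-perceived
# measurement quantities listed above plus an optional fused result.
# All field names are hypothetical.
from dataclasses import dataclass, field
from typing import Optional, List, Tuple

@dataclass
class LidarMeasurement:
    point_cloud: List[Tuple[float, float, float]] = field(default_factory=list)  # X/Y/Z
    target_angles_deg: List[float] = field(default_factory=list)
    target_distances_m: List[float] = field(default_factory=list)
    num_targets: int = 0

@dataclass
class VisionMeasurement:
    image_ref: Optional[str] = None           # reference to the reported image
    target_angles_deg: List[float] = field(default_factory=list)
    target_distances_m: List[float] = field(default_factory=list)
    num_targets: int = 0

@dataclass
class ImuMeasurement:
    acceleration_xyz: Tuple[float, float, float] = (0.0, 0.0, 0.0)
    angular_velocity_xyz: Tuple[float, float, float] = (0.0, 0.0, 0.0)

@dataclass
class MeasurementReport:
    lidar: Optional[LidarMeasurement] = None
    vision: Optional[VisionMeasurement] = None
    imu: Optional[ImuMeasurement] = None
    fused_result: Optional[dict] = None       # e.g. fused position/velocity of the target
```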
- the method further includes:
- a first node performs perception measurement to obtain a measurement result; the first node sends a measurement report to a second node, where the measurement report includes the measurement result and the measurement result is used to determine whether to perform perception switching; the measurement result includes at least one of the following: a measurement quantity perceived by a sensor; a fused measurement result obtained by fusion processing of the measurement quantity perceived by the sensor and the measurement quantity obtained by synaesthesia-type perception. In this way, perception mobility management based on sensor perception is implemented, which can ensure the continuity of perception services and improve the user experience of perception services.
- an embodiment of the present application provides a mobility management method, and specific steps include: step 501 , step 502 , step 503 , and step 504 .
- Step 501 The second node receives a measurement report sent by the first node, where the measurement report includes a measurement result;
- Step 502 The second node sends a first request message to the third node, where the first request message is used to request the third node to perform perception;
- the perception mode of the target perception node is the same as or different from the perception mode of the source perception node; the perception mode includes: at least one of sensor type perception and synaesthesia type perception, and the synaesthesia type perception includes at least one of downlink perception, uplink perception, terminal echo perception, base station echo perception, air interface perception between base stations, and air interface perception between terminals.
- the third node may include at least one of a candidate target terminal (User Equipment, UE), a candidate target base station, a candidate target application function, and a candidate target application server.
- Step 503 The second node receives first response information sent by the third node, where the first response information is used to indicate that the third node agrees to perform sensing;
- Step 504 The second node sends a switching command to the third node, where the switching command is used to instruct the third node to perform sensing as a target sensing node;
- the second node selects a target UE, a target base station, a target application function or a target application server from the candidate target UEs, the candidate target base stations, the candidate target application functions or the candidate target application servers based on the received first response information to send a switching command.
- the measurement result includes at least one of the following: a measurement quantity perceived by a sensor; and a fused measurement result, wherein the fused measurement result is obtained by fusion processing of the measurement quantity perceived by the sensor and the measurement quantity obtained by synaesthesia-type perception.
- the measurement result further includes a synaesthesia-type perception measurement quantity;
- the synaesthesia-type perception measurement quantity includes at least one of the following:
- the first-level measurement quantity includes: at least one of a received signal/channel response complex result, amplitude/phase, I path/Q path and a calculation result thereof;
- the second-level measurement quantity includes: at least one of delay, Doppler, angle, and intensity;
- the third-level measurement quantity includes: at least one of distance, speed, direction, spatial position, and acceleration;
- the fourth-level measurement quantity includes at least one of the following: whether the target exists, trajectory, action, expression, vital signs, quantity, imaging results, weather, air quality, shape, material, and composition.
- the method further includes:
- the second node obtains first information of the sensing node
- the first information includes at least one of the following: information of the sensing node; sensing capability information of the sensing node; and sensing authority information.
- the measurement result in the measurement report includes one of the following:
- a fused measurement result wherein the fused measurement result is obtained by fusion processing of a measurement result of a sensor-type perception measurement and a measurement result of a synaesthesia-type perception measurement.
- the method further includes:
- the second node sends first measurement configuration information to the first node, where the first measurement configuration information is used to configure the first node to perform sensor-type perception measurement.
- the first measurement configuration information includes at least one of the following:
- Measurement events where the measurement events include at least one of the following: a sensed measurement quantity or a calculation result of a sensed measurement quantity or a sensed performance meets a preset condition; a state of a sensed target changes; or a position of a sensed node involved in the sensed event changes.
- the measurement quantity sensed by the sensor includes at least one of the following:
- Measurement quantities including at least one of the following: target presence, trajectory, movement, expression, vital signs, quantity, imaging results, weather, air quality, shape, material, and composition.
- the laser radar-related measurement includes at least one of the following: laser radar point cloud data, angle and/or distance of a target obtained based on the laser radar point cloud data, visual features of a target identified from the laser radar point cloud data, and the number of targets identified from the laser radar point cloud data;
- the vision-related measurement includes at least one of the following: a visual image, the luminosity of an image pixel, the RGB value of an image pixel, a visual feature of an object identified from the image, an angle and/or distance of an object identified from the image, and the number of objects identified from the image;
- the radar-related measurements include at least one of the following: radar point cloud, distance, speed, and/or angle of identified targets, radar imaging, and number of targets;
- the measurement quantity related to the inertial measurement unit includes at least one of the following: acceleration, velocity, and angular velocity.
- the first node includes a terminal, and the second node includes a source base station and/or a perception function network element, or the first node includes: a source base station and/or a candidate target base station, and the second node includes a perception function network element; or the first node includes a source terminal and/or a candidate target terminal, and the second node includes a perception function network element; or the first node includes a candidate target base station, and the second node includes a source base station; or the first node includes: an application function and/or an application server, and the second node includes a source base station and/or a perception function network element.
- a second node receives a measurement report sent by a first node, where the measurement report includes a measurement result, and the measurement result includes at least one of the following: a measurement quantity perceived by a sensor; a fused measurement result, where the fused measurement result is obtained by fusion processing of the measurement quantity perceived by the sensor and the measurement quantity obtained by synaesthesia-type perception; and the second node sends a first request message to a third node, where the first request message is used to request the third node to perform perception, thereby implementing sensor-based perception mobility management, which can maintain the continuity of the perception service and improve the user experience of the perception service.
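- As a rough sketch of the ordering of the exchange in steps 501 to 504, the second node's procedure could look as follows; the link objects, message names and the decision rule are hypothetical placeholders, not messages defined by this application:

```python
# Illustrative sketch only: the ordering of the second node's exchange in
# steps 501-504. The link objects, message names and decision rule are
# hypothetical placeholders.
from typing import Optional

def needs_perception_switch(report: dict) -> bool:
    # Hypothetical decision rule: switch when a (fused) sensing-quality metric
    # carried in the measurement result drops below a threshold.
    return report.get("sensing_quality", 1.0) < 0.5

def second_node_procedure(link_first, link_third) -> Optional[str]:
    # Step 501: receive the measurement report (sensor-perceived and/or fused results).
    report = link_first.recv("MeasurementReport")
    if not needs_perception_switch(report):
        return None
    # Step 502: request the candidate third node (target UE / base station /
    # application function / application server) to perform perception.
    link_third.send("FirstRequest", {"requirements": report.get("requirements")})
    # Step 503: the first response indicates whether the third node agrees to sense.
    response = link_third.recv("FirstResponse")
    if not response.get("agree", False):
        return None
    # Step 504: switching command: the third node becomes the target sensing node.
    link_third.send("SwitchingCommand", {"role": "target_sensing_node"})
    return "switched"
```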
- Embodiment 1 Perception mobility management (perception switching) based on sensor perception.
- Perception switching includes: possible switching of perception nodes (base stations or UEs) participating in perception services such as trajectory tracking or continuous speed measurement of targets, and possible switching of perception modes (see Figure 1).
- the base station or UE is already performing perception before the switching.
- the base station that performs perception before the switching is called the source base station, and the base station that performs perception after the switching is called the target base station; or the UE that performs perception before the switching is called the source UE, and the UE that performs perception after the switching is called the target UE;
- the source base station and the UE perform downlink sensing, and then switch to the target base station and the UE for downlink sensing, see Figure 6;
- the source base station and the source UE perform uplink sensing, and then switch to the target base station and the target UE for uplink sensing, see Figure 7;
- Base station A performs self-transmitting and self-receiving sensing, and switches to base station B for sensor sensing;
- UE performs uplink perception and switches to UE performing sensor perception.
- Step 1 The base station and/or UE performs perception measurement, which includes sensor-based perception measurement;
- Method 1 The source base station sends target measurement configuration information to the UE. After receiving the target measurement configuration information, the UE performs perception measurement and feeds back a measurement report to the source base station.
- the source base station in Method 1 is equivalent to the second node, and the UE is equivalent to the first node.
- Method 2 The first device (for example, a perception function network element) sends target measurement configuration information to the UE. After receiving the target measurement configuration information, the UE performs perception measurement and feeds back a measurement report to the first device. Optionally, the UE or the first device sends a measurement report to the source base station.
- the first device is equivalent to the second node
- the UE is equivalent to the first node
- Method 3 The first device sends target measurement configuration information to the source base station and/or the target base station. After receiving the target measurement configuration information, the source base station and/or the target base station performs perception measurement and feeds back a switching measurement report to the first device or the source base station.
- the first device in Method 3 is equivalent to the second node, and the source base station and/or the target base station is equivalent to the first node.
- the target measurement configuration information includes at least one of the following:
- the sensor ID or type to be used: for example, which sensor or which type of sensor is used, such as a visual camera, a lidar, or a millimeter-wave radar;
- the reported measurement quantity includes a measurement quantity sensed by the sensor
- the measurement quantity sensed by the sensor includes at least one of the following:
- lidar-related measurements including at least one of the following:
- LiDAR point cloud data: each point in the LiDAR point cloud data includes X/Y/Z position information and additional information (a per-point example record is sketched after this list of measurement quantities);
- visual features of objects identified from the LiDAR point cloud data, such as people, vehicles, etc.;
- the number of objects identified from the LiDAR point cloud data.
- the additional information in the lidar point cloud data includes at least one of the following:
- Echo count: the total number of echoes for a given pulse;
- Classification: each post-processed lidar point can have a classification that defines the type of object that reflected the lidar pulse; lidar points can be classified into many categories, such as ground, bare earth, top of tree canopy, and water;
- RGB (red, green, blue): RGB bands can be used as an attribute of the lidar data, usually obtained from imagery collected at the same time as the lidar measurement;
- GPS time: the GPS time stamp associated with each lidar point;
- Scanning direction: the direction of travel of the laser scanning mirror; a value of 1 represents a positive scanning direction and a value of 0 represents a negative scanning direction.
- vision-related measurements including at least one of the following:
- Visual features of objects identified from images such as people, vehicles, etc.
- the angle and distance of the objects identified from the image (especially for binocular vision);
- the number of objects identified from the image.
- Radar-related measurements including at least one of the following:
- each point in the point cloud includes: at least one of distance/speed/azimuth/elevation angle, or at least one of X/Y/Z/speed;
- the inertial measurement unit related measurement quantity includes at least one of the following:
- Acceleration at least one of the three directions X/Y/Z;
- Angular velocity around at least one of the three axes X/Y/Z.
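- For illustration only, a single lidar point carrying the X/Y/Z position and the additional attributes described above might be represented as the following hypothetical record (real point-cloud formats such as LAS define their own layouts):

```python
# Illustrative sketch only: one lidar point with the additional attributes
# described above. Field names are hypothetical.
from dataclasses import dataclass

@dataclass
class LidarPoint:
    x: float                 # X position (m)
    y: float                 # Y position (m)
    z: float                 # Z position (m)
    echo_count: int = 1      # total number of echoes for the pulse
    classification: int = 0  # e.g. ground, vegetation, water, ...
    rgb: tuple = (0, 0, 0)   # colour attributed from co-collected imagery
    gps_time: float = 0.0    # GPS time stamp of the point
    scan_direction: int = 1  # 1 = positive scan direction, 0 = negative
```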
- requirements on the reported measurement quantities: for example, the size and resolution of images reported by cameras, the resolution and accuracy requirements of distance/speed/angle in point cloud data reported by lidar, etc.;
- the perception condition includes at least one of a perception start time, a perception end time, and a perception duration;
- the prior information of the sensed target or sensed area includes at least one of the sensed target type, the approximate location/area of the sensed target, the historical state of the sensed target (speed, angle, distance, acceleration, spatial orientation), etc.;
- the measurement report configuration includes at least one of the following: reporting principle, which may be periodic reporting or event-triggered reporting; the type of reference signal used for measurement; the measurement report format, such as the maximum number of cells and beams reported;
- the measurement event and related parameters include at least one of the following:
- the perceived measurement quantity or the calculation result of the perceived measurement quantity or the perceived performance meets the preset conditions
- the state of the perceived target changes (state includes position, speed, etc.);
- the perception measurement also includes synaesthesia-type perception measurement;
- the synaesthesia-type perception measurement quantity includes at least one of the first-level to fourth-level measurement quantities described above.
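- To show the target measurement configuration items above in one place, a purely illustrative assembly is sketched below; all field names and example values are hypothetical and not defined by this application:

```python
# Illustrative sketch only: assembling the target measurement configuration
# items listed above. All field names and example values are hypothetical.
target_measurement_config = {
    "sensor": {"id": "cam-01", "type": "visual_camera"},   # sensor ID or type to use
    "reported_quantities": ["visual_image", "target_angle", "target_distance",
                            "num_targets"],
    "reporting_requirements": {"image_resolution": [1920, 1080]},
    "perception_condition": {"start": "2024-01-01T08:00:00Z", "duration_s": 600},
    "prior_info": {"target_type": "vehicle",
                   "approx_area": {"center_xy_m": [120.0, 40.0], "radius_m": 50.0}},
    "report_config": {"mode": "event_triggered",            # or "periodic"
                      "max_reported_targets": 4},
    "measurement_events": [
        {"type": "quantity_meets_condition", "quantity": "target_distance",
         "threshold_m": 30.0},
        {"type": "target_state_change", "state": ["position", "speed"]},
    ],
}
```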
- Method 1 The source base station sends first measurement configuration information to at least one UE. After receiving the measurement configuration information, the UE performs perception measurement and feeds back a measurement report to the source base station.
- the UE is the first node and the source base station is the second node.
- Method 2 The first device (such as a perception function network element) sends first measurement configuration information to at least one UE. After receiving the first measurement configuration information, the UE performs perception measurement and feeds back a measurement report to the first device. Optionally, at least one UE or the first device sends a measurement report to the source base station.
- the UE is the first node, and the first device is the second node.
- the first measurement configuration information includes at least one of the following:
- Measurement object, for example, parameter information and resource information of one or more first signals, sent by the source base station and the candidate target base station, that the UE needs to measure;
- the parameter information of the first signal includes at least one of the following:
- Waveforms, such as Orthogonal Frequency Division Multiplexing (OFDM), Single-carrier Frequency-Division Multiple Access (SC-FDMA), Orthogonal Time-Frequency-Space (OTFS) modulation, Frequency Modulated Continuous Wave (FMCW), pulse signal, etc.;
- Subcarrier spacing: for example, the subcarrier spacing of the OFDM system is 30 kHz;
- Guard interval: the time interval from the moment a signal ends to the moment the latest echo of that signal is received; this parameter is proportional to the maximum perception distance and can, for example, be calculated as 2·dmax/c, where dmax is the maximum perception distance (a perception requirement) and c is the speed of light. For example, for a first signal that is self-transmitted and self-received, dmax represents the maximum distance from the receiving and transmitting point of the first signal to the signal transmitting point; in some cases, the cyclic prefix of an OFDM signal can serve as the minimum guard interval;
- Time span: this parameter is the total time span of the first signal, mainly used for calculating the Doppler frequency shift, and is inversely proportional to the velocity resolution (a perception requirement); it can be calculated as c/(2·delta_v·fc), where delta_v is the velocity resolution and fc is the signal carrier frequency or the center frequency of the signal (a numeric sketch of these timing parameters is given after this parameter list);
- Time interval: the time interval between two adjacent first signals; this parameter can be calculated as c/(2·fc·v_range), where v_range is the maximum velocity minus the minimum velocity (a perception requirement);
- the power information of the transmitted signal includes the transmit power, peak power, average power, total power, power spectrum density, EIRP, power per port, etc.
- for example, the transmit power ranges from -20 dBm to 23 dBm with a granularity of 2 dB;
- Signal format such as SRS, DMRS, PRS, etc., or other predefined signals, and related sequence format (sequence format is associated with sequence content or sequence length, etc.) and other information;
- signal direction for example, direction or beam information of the first signal
- the first signal includes multiple resources, each resource is associated with an SSB QCL, and the QCL includes Type A, B, C or D;
- Antenna configuration parameters (applicable to the transmission and reception of the first signal by a multi-antenna device), such as: at least one of the transmitting antenna orthogonality mode (TDM/CDM/FDM/DDM, etc.), the number of antenna ports, the number of antenna units, the distance between antenna units, the number of receiving channels, the number of transmitting channels, the number of transmitting antennas, and the (maximum) number of uplink or downlink MIMO layers.
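- As a consolidated illustration of the three timing formulas above (guard interval 2·dmax/c, time span c/(2·delta_v·fc), time interval c/(2·fc·v_range)), the following sketch derives them from example perception requirements; the function name and the numeric inputs are illustrative assumptions, not values defined in this disclosure.

```python
# Illustrative sketch of the timing formulas described above; all inputs are
# example assumptions (not values taken from this disclosure).
C = 299_792_458.0  # speed of light, m/s

def first_signal_timing(d_max_m, delta_v_mps, v_range_mps, fc_hz):
    """Derive guard interval, time span and time interval of the first signal
    from perception requirements, using the relations given in the text."""
    guard_interval_s = 2.0 * d_max_m / C                # round-trip delay of the farthest echo
    time_span_s = C / (2.0 * delta_v_mps * fc_hz)       # total span needed for velocity resolution
    time_interval_s = C / (2.0 * fc_hz * v_range_mps)   # spacing between two adjacent first signals
    return guard_interval_s, time_span_s, time_interval_s

# Example: 300 m max distance, 0.5 m/s velocity resolution, 60 m/s velocity range, 3.5 GHz carrier
tg, tspan, tint = first_signal_timing(300.0, 0.5, 60.0, 3.5e9)
print(f"guard interval {tg*1e6:.2f} us, time span {tspan*1e3:.2f} ms, interval {tint*1e6:.2f} us")
```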
- the resource information of the first signal includes at least one of the following:
- time resources such as the time slot index where the first signal is located or the symbol index of the time slot; wherein the time resources are divided into two types, one is a one-time time resource, such as one symbol sending an omnidirectional first signal/first signal; the other is a non-one-time time resource, such as multiple groups of periodic time resources or discontinuous time resources (which may include a start time and an end time), each group of periodic time resources sends the first signal in the same direction, and different groups of periodic time resources have different beam directions;
- Frequency resources including the center frequency, bandwidth, RB or subcarrier of the first signal.
- the measurement report configuration includes at least one of the following: a reporting principle, which may be periodic reporting or event-triggered reporting; a type of reference signal used for measurement, etc.; a measurement report format, such as the maximum number of cells and beams reported, etc.;
- the measurement ID is used to associate the measurement object with the measurement report configuration
- the measurement event and related parameters include at least one of the following:
- the perceived measurement quantity or the calculation result of the perceived measurement quantity or the perceived performance meets the preset conditions
- the state of the perceived target changes (state includes position, speed, etc.);
- For example, the size and resolution of images reported by cameras, and the resolution and accuracy requirements of distance/speed/angle in point cloud data reported by lidar, etc.;
- the perception condition includes at least one of a perception start time, a perception end time, and a perception duration;
- the prior information of the perception target or perception area includes at least one item such as the perception target type, the approximate location/area of the perception target, the perception target historical state (speed, angle, distance, acceleration, spatial orientation), etc.
- the UE may determine whether the measurement event is satisfied based on the average value of multiple measurement quantities/indicators taken at different times (layer 1 filtering and/or layer 3 filtering), to avoid the randomness/ping-pong effect of judging from a single result; a minimal filtering sketch is given after this list;
- multiple synchronization signals/reference signals/first signals may correspond to multiple transmit/receive beam pairs, and the UE may determine whether the measurement event is satisfied based on the measurement quantity/indicator of one or more beams.
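- As a minimal illustration of the averaging mentioned above, the sketch below applies a layer-3-style exponential filter (in the familiar 3GPP form F_n = (1 − a)·F_{n−1} + a·M_n with a = 1/2^(k/4)) before checking an event condition; the filter coefficient and the sample values are illustrative assumptions, not values defined in this disclosure.

```python
# Minimal layer-3-style filtering sketch; the filter coefficient k and the
# measurement samples below are illustrative assumptions.
def l3_filter(measurements, k=4):
    """Exponentially smooth successive measurement results using the
    3GPP-style recursion F_n = (1 - a) * F_{n-1} + a * M_n, a = 1 / 2**(k/4)."""
    a = 1.0 / (2 ** (k / 4))
    filtered = None
    for m in measurements:
        filtered = m if filtered is None else (1 - a) * filtered + a * m
    return filtered

# Event check on the filtered value instead of a single sample,
# to avoid the ping-pong effect mentioned above.
samples = [-92.0, -95.0, -88.0, -90.5]   # assumed per-sample indicator values (e.g. dBm)
print(l3_filter(samples) > -93.0)        # True/False: does the averaged value satisfy the event?
```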
- the measurement report at least includes the measurement results of the perceptual measurement quantity required for the perceptual measurement
- the perception measurement amount required for the handover measurement may include the current perception service perception measurement amount.
- the source base station and at least one candidate target base station perform perception measurement, that is, the source base station and at least one candidate target base station are equivalent to the first node.
- the first device before the source base station and the candidate target base station perform perception measurement, the first device sends second measurement configuration information to the source base station and the candidate target base station, or the source base station sends the second measurement configuration information to the candidate target base station;
- the source base station sends parameter information of the first signal to at least one UE, and the at least one UE sends the first signal according to the parameter information of the first signal, so that the source base station and the candidate target base station measure the first signal;
- the second measurement configuration information includes at least one of the following:
- Measurement object for example, parameter information and resource information of one or more first signals sent by the UE that the base station needs to measure;
- the measurement report configuration includes at least one of the following: a principle for the base station to report to the first device or the candidate target base station to report to the source base station, which may be periodic reporting or event-triggered reporting; a type of reference signal used for measurement, etc.; a measurement report format, such as the maximum number of cells and the number of beams reported, etc.;
- Measurement ID The measurement ID is used to associate the measurement object and the measurement report configuration
- the measurement event and related parameters include at least one of the following:
- the perceived measurement quantity or the calculation result of the perceived measurement quantity or the perceived performance meets the preset conditions
- the state of the perceived target changes (state includes position, speed, etc.);
- the communication-related indicators of one or more first signals received by the serving cell and/or the neighboring cell meet the preset conditions; wherein the communication-related indicators include at least one item of RSRP, SINR, RSRQ, RSSI, etc.; for example, the communication-related indicators of the candidate target cell are better than the communication-related indicators of the source cell within a preset time period.
- For example, the size and resolution of the image reported by the camera, and the resolution and accuracy requirements of distance/speed/angle in the point cloud data reported by the lidar, etc.;
- the perception condition includes at least one of a perception start time, a perception end time, and a perception duration;
- the prior information of the perception target or perception area includes at least one item such as the perception target type, the approximate location/area of the perception target, the perception target historical state (speed, angle, distance, acceleration, spatial orientation), etc.
- the source base station or the candidate target base station may determine whether the measurement event is satisfied based on the average value of multiple measurement quantities/indicators at different times (layer 1 filtering and/or layer 3 filtering) to avoid the randomness/ping-pong effect caused by judging based on a single result;
- multiple first signals may correspond to multiple receive/transmit beam pairs, and the source base station or the candidate target base station may determine whether the measurement event is satisfied based on the measurement quantity/indicator of one or more beams.
- Method 1 The source base station sends third measurement configuration information to at least one UE. After receiving the measurement configuration information, the at least one UE performs perception measurement and feeds back a measurement report to the source base station.
- the UE is equivalent to the first node, and the source base station is equivalent to the second node.
- Method 2 The first device (for example, a perception function network element) sends third measurement configuration information to at least one UE. After receiving the third measurement configuration information, the at least one UE performs perception measurement and feeds back a measurement report to the first device. Optionally, at least one UE or the first device sends a measurement report to the source base station.
- the UE is equivalent to the first node, and the first device or source base station is equivalent to the second node.
- the first device sends fourth measurement configuration information to the source base station and/or at least one target base station.
- After receiving the measurement configuration information, the source base station and/or at least one target base station performs perception measurement and feeds back a handover measurement report to the first device or the source base station; that is, the source base station and/or the target base station is equivalent to the first node, and the first device is equivalent to the second node; or the target base station is equivalent to the first node, and the first device or the source base station is equivalent to the second node.
- the first device sends fifth measurement configuration information to the source base station and/or at least one target base station. After receiving the measurement configuration information, the source base station and/or at least one target base station performs perception measurement and feeds back a measurement report to the first device or the source base station.
- the source base station and/or at least one target base station is equivalent to the first node, and the first device or the source base station is equivalent to the second node.
- the first device sends sixth measurement configuration information to the source UE and/or at least one target UE. After receiving the measurement configuration information, the source UE and/or at least one target UE performs perception measurement and feeds back a measurement report to the first device or the source base station.
- the source UE and/or at least one target UE is equivalent to the first node, and the first device or the source base station is equivalent to the second node
- the first device or the source base station obtains first information of a sensing node (for example, a source base station and/or at least one target base station, and a source UE and/or at least one target UE); wherein a sensing node refers to a node or device with a communication-sensing integration function or a sensor sensing function.
- the first information includes at least one of the following:
- the sensing node information includes at least one of the following:
- The type of the sensing node;
- For example, devices with synaesthesia perception capabilities, devices with sensor perception capabilities, and devices with both synaesthesia perception and sensor perception capabilities.
- For devices with synaesthesia perception and sensor perception capabilities, the type information should also include the sensor type (for example, visual sensor, lidar, etc.);
- The position of the sensing node: it may be the coordinates of the sensing node in the global coordinate system, or the coordinates relative to a reference position, and the coordinates may be rectangular coordinates or polar coordinates;
- The orientation of the sensing node: it may be the orientation of the antenna panel of the sensing node, or the rotation angle of the local coordinate system relative to the global coordinate system, or the rotation angle relative to a reference coordinate system, where the rotation angle includes azimuth, pitch and roll angles;
- The speed of the sensing node: it may be the speed of the sensing node in the global coordinate system, or the speed relative to a reference coordinate system, where the speed includes the magnitude and direction of the speed.
- this item can be omitted or simplified.
- the method for obtaining the first information includes at least one of the following:
- Signaling request/reply: sending a request for at least part of the first information to a perception node having a communication function, and receiving at least part of the first information sent back by the perception node.
- the measurement results of sensor perception and synaesthesia integrated perception can be reported separately or fused before reporting (applicable to devices with both synaesthesia integrated perception and sensor-type perception capabilities). Alternatively, the measurement results of sensor perception and synaesthesia integrated perception can be reported separately and then fused.
- Step 2 The source base station or the first device decides whether to initiate a handover based on the measurement report in step 1.
- the source base station or the first device is equivalent to the second node.
- the source base station reports a measurement report to the first device, and the first device decides whether to initiate a handover request; or, the first device decides whether to initiate a handover request based on a measurement report sent by the UE or the base station.
- the subsequent processing may be to maintain or end the current perception.
- the first device or the source base station determines which candidate target base station and/or which candidate target UE is to be sensed, which is specifically divided into one of the following two situations:
- Case 1 The source base station decides to switch to a candidate target base station and/or a candidate target UE to perform perception.
- the source base station sends a first request message to at least one candidate target base station and/or candidate target UE, where the first request message requests the at least one candidate target base station and/or candidate target UE to perform perception.
- the source base station sends first indication information to the first device, where the first indication information is for notifying the first device that at least one candidate target base station and/or candidate target UE will be sensed.
- Case 2 The first device decides to switch to a candidate target base station and/or a candidate target UE to perform perception.
- the first device sends first request information to at least one candidate target base station and/or candidate target UE.
- the first device sends first indication information to the source base station and/or the candidate target UE.
- the first request information may include a soft handover request.
- the determination of the candidate target base station or the candidate target UE is based on at least one of the following information:
- Base station/UE sensing capability information including base station/UE sensing coverage, maximum bandwidth available for sensing, maximum duration of sensing service, supported first signal type and frame format, base station antenna array information (array type, number of antennas, array aperture, antenna polarization characteristics, array element gain and directivity characteristics, etc.));
- Resource information currently available for sensing by the base station/UE including time resources (number of symbols, number of time slots, number of frames, etc.), frequency resources (number of resource blocks (RBs), number of resource elements (REs), total bandwidth, available frequency band location, etc.), antenna resources (number of antennas/antenna subarrays), phase modulation resources (number of hardware phase shifters), orthogonal code resources (length and number of orthogonal codes), etc.);
- Channel state information of the base station/UE including at least one of the channel transfer function/channel impulse response, channel quality indicator (CQI), precoding matrix indicator (PMI), CSI-RS resource indicator, SSB resource indicator, layer indicator (LI), rank indicator (RI) and L1-RSRP of at least one communication link;
- the first request information includes at least one of the following:
- the perception QoS includes at least one of the following: perception resolution (further divided into ranging resolution, angle measurement resolution, speed measurement resolution, imaging resolution, etc.), perception accuracy (further divided into ranging accuracy, angle measurement accuracy, speed measurement accuracy, positioning accuracy, etc.), perception range (further divided into ranging range, speed measurement range, angle measurement range, imaging range, etc.), perception delay (the time interval from the first signal being sent to obtaining the perception result, or the time interval from the initiation of the perception demand to the acquisition of the perception result), perception update rate (the time interval between two adjacent perception executions and obtaining the perception results), detection probability (the probability of correctly detecting the perception target when it exists), false alarm probability (the probability of falsely detecting a perception target when the perception target does not exist), perception security, and perception privacy;
- the perception measurement result includes a perception result obtained directly or indirectly based on at least one perception measurement quantity
- the perception condition includes at least one of a perception start time, a perception end time, and a perception duration;
- the sensing mode switching success decision condition indicates that the measurement result of at least one sensing measurement amount and/or communication measurement amount reaches a preset threshold within a preset time/preset number of times;
- Step 3 The candidate target base station and/or candidate target UE decides whether to accept the handover/execute the sensing. There are two cases:
- the candidate target base station and/or the candidate target UE sends a first response message to the sender of the first request information (source base station or first device), and the first response message indicates that the sender of the first request information agrees to perform perception.
- the candidate target base station and/or the candidate target UE feeds back the recommended first parameter configuration information in the first response information.
- the first parameter configuration information is used for the candidate target base station and/or the candidate target UE to perform perception parameter configuration for perception.
- the first parameter configuration information includes soft handover parameter configuration information.
- the candidate target base station and/or the candidate target UE sends a first rejection message to the sender of the first request information (the source base station or the first device), where the first rejection message indicates to the sender of the first request information that the sender of the first rejection message will not perform the perception.
- the subsequent processing may be one of the following: i. the source base station or the first device re-determines the candidate target base station and/or the candidate target UE; ii. maintains the current perception; iii. ends the current perception;
- the candidate target base station and/or the candidate target UE decides whether to accept the switching/execute the sensing according to their own equipment capabilities; wherein the equipment capabilities include sensing-related equipment capabilities, base station sensing coverage, maximum bandwidth available for sensing, maximum duration of sensing services, supported first signal types and frame formats, base station antenna array information (array type, number of antennas, array aperture, antenna polarization characteristics, array element gain and directivity characteristics, which of the six sensing modes are supported, etc.);
- Step 4 The source base station or the first device determines at least one target base station among the candidate target base stations based on the received first response information, and/or determines at least one target UE among the candidate target UEs as the base station and/or UE that performs perception after switching.
- the source base station or the first device sends a handover command to the target base station and/or the target UE.
- the handover command is used to notify the target sensing node to perform sensing;
- the source base station or the first device feeds back the recommended second parameter configuration information in the handover command.
- The second parameter configuration information is used to configure the perception parameters of the target perception node for perception.
- the content included in the second parameter configuration information may refer to the description of the first parameter configuration information, that is, the content included in the second parameter configuration information may be the same as the content included in the first parameter configuration information.
- the second parameter configuration information includes soft switching parameter configuration information.
- Step 5 The target base station and/or the target UE performs perception (including sensor perception or at least one of the six synaesthesia-integrated perception methods).
- the sensing method may also be switched.
- the subsequent processing is divided into the following two cases:
- Case 1 Using a soft handover method, the target base station configures the sensing parameters and performs sensing (including uplink sensing or downlink sensing) based on at least one of the first request information, the first parameter configuration information, and the second parameter configuration information.
- After obtaining at least one perception measurement result and/or perception result, the target base station sends a handover success message to the source base station or the first device.
- the sender of the first request information is the source base station, and the source base station and the target base station are not the same device:
- After receiving the handover success message, the source base station sends a sensing end command to the UE.
- the source base station and the UE end the original sensing operation and release the resources occupied by sensing (including time-frequency resources, antenna port resources, etc.);
- the sender of the first request information is the first device, and the source base station and the target base station are not the same device:
- After receiving the handover success message, the first device sends a sensing end command to the source base station and the UE.
- the source base station and the UE end the original sensing operation and release the resources occupied by the sensing (including time-frequency resources, antenna port resources, etc.);
- Case 2 Using the hard handover method. While executing step 4, the source base station or the first device does not need to wait for the handover success message. This includes one of the following cases:
- the sender of the first request information is the source base station, and the source base station and the target base station are not the same device:
- the source base station sends a sensing end command to the UE.
- the source base station and the UE end the original sensing operation and release the resources occupied by sensing (including time-frequency resources, antenna port resources, etc.);
- the sender of the first request information is the first device, and the source base station and the target base station are not the same device:
- the first device sends a sensing end command to the source base station and the UE.
- the source base station and the UE end the original sensing operation and release the resources occupied by the sensing (including time-frequency resources, antenna port resources, etc.);
- Step 6 the source base station and/or the first device sends part or all of the historical sensing measurement quantities and/or historical sensing results, and sensing target/area priori information to the target base station and/or the target UE.
- Embodiment 2 The measurement results of sensor perception and synaesthesia integrated perception are fused and then reported.
- Scenario 1 Fusion of synaesthesia perception and visual sensors.
- Visual sensors are divided into monocular vision and binocular vision. Monocular vision uses a single camera for perception, while binocular vision uses two cameras for perception. Visual sensors obtain visual images by imaging the target or target area.
- the advantage of visual sensors over synaesthesia is that they can form images and can detect the target based on visual images and algorithms.
- for example, the visual sensor can identify target features (e.g., people, vehicles, etc.) from the visual image by means of detection algorithms.
- the visual sensor has the disadvantages of not being able to measure speed and having poor performance in distance measurement.
- the image obtained by the visual sensor is combined with the distance and speed of the target obtained by the synaesthesia, so that a three-dimensional image with depth information and speed information can be obtained.
- the three-dimensional or four-dimensional point cloud data (including distance, speed, angle information) obtained by the synaesthesia is projected onto the visual image plane, so that the distance information and speed information can be assigned to the corresponding image pixels or the target in the image; finally, the fused image is transformed back to the three-dimensional space to obtain the corresponding three-dimensional image with depth information and speed information.
- the measurement quantity perceived by the synaesthesia class is speed and distance information (or point cloud data containing speed and distance information), and the measurement quantity perceived by the visual sensor is an image.
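- The following sketch illustrates one way the projection described above could be realized, assuming a calibrated camera intrinsic matrix K and an extrinsic transform T from the point-cloud frame to the camera frame; all names and numeric values are illustrative assumptions, not part of this disclosure.

```python
import numpy as np

# Minimal sketch of projecting synaesthesia-class point cloud data onto the
# visual image plane so that depth and speed can be attached to pixels.
# K (camera intrinsics) and T (point-cloud frame -> camera frame) are assumed
# to be known from calibration; the values below are placeholders.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
T = np.eye(4)  # assumed extrinsic calibration

def fuse_points_with_image(points_xyz, speeds, image_shape):
    """Return a dict mapping (u, v) pixel -> (depth, speed) for points that
    project inside the image."""
    h, w = image_shape
    homo = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])
    cam = (T @ homo.T).T[:, :3]                # transform into the camera frame
    fused = {}
    for (x, y, z), v in zip(cam, speeds):
        if z <= 0:                             # behind the camera
            continue
        u, vpix, _ = (K @ np.array([x, y, z])) / z
        u, vpix = int(round(u)), int(round(vpix))
        if 0 <= u < w and 0 <= vpix < h:
            fused[(u, vpix)] = (z, v)          # depth and radial speed for this pixel
    return fused

pts = np.array([[1.0, 0.2, 10.0], [-0.5, 0.1, 25.0]])   # assumed 3D points (m)
print(fuse_points_with_image(pts, speeds=[3.2, -1.1], image_shape=(480, 640)))
```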
- Scenario 2 Fusion of synaesthesia perception and lidar.
- LiDAR can achieve ultra-high angular resolution by emitting ultra-narrow laser beams and receiving reflected echoes; at the same time, the optical frequency band has ultra-high bandwidth, which enables LiDAR to have ultra-high distance resolution.
- existing commercial LiDARs generally do not have speed measurement capabilities.
- Synaesthesia can measure distance, angle and speed. Usually, the number of antennas at a synaesthesia node is limited, which makes the angle measurement resolution very limited; at the same time, the bandwidth of the synaesthesia signal is much lower than the bandwidth of an optical frequency band signal. However, synaesthesia can obtain better speed resolution performance through signal configuration.
- the fusion of LiDAR and synaesthesia can combine the advantages of both to obtain better distance measurement, angle measurement, and speed measurement performance.
- the corresponding points of the synaesthesia point cloud data and the LiDAR point cloud data can be associated to obtain fused point cloud data; the distance and angle information of each point in the fused point cloud data adopts the value in the LiDAR point cloud data, while the speed information of each point in the fused point cloud data adopts the value in the synaesthesia point cloud data; thus, the fused point cloud data has better angle measurement, distance measurement, and speed measurement performance at the same time.
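- The association described above could, for example, be realized with a simple nearest-neighbour rule, as in the sketch below; the point formats and the association threshold are illustrative assumptions, not part of this disclosure.

```python
import numpy as np

# Minimal sketch of fusing LiDAR and synaesthesia (ISAC-type) point clouds by
# nearest-neighbour association: position (distance/angle) is taken from the
# LiDAR point, speed from the associated synaesthesia point.
def fuse_point_clouds(lidar_xyz, isac_xyz, isac_speed, max_dist=1.0):
    fused = []
    for p in lidar_xyz:
        d = np.linalg.norm(isac_xyz - p, axis=1)     # distance to every synaesthesia point
        i = int(np.argmin(d))
        if d[i] <= max_dist:                         # associate only nearby points
            fused.append((*p, isac_speed[i]))        # LiDAR position + synaesthesia speed
    return np.array(fused)

lidar = np.array([[10.0, 1.0, 0.5], [22.0, -3.0, 0.4]])
isac = np.array([[10.2, 1.1, 0.6], [40.0, 0.0, 0.0]])
print(fuse_point_clouds(lidar, isac, isac_speed=np.array([4.5, 0.0])))
```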
- Scenario 3 Fusion of synaesthesia perception and millimeter-wave radar.
- Synesthesia and millimeter-wave radar are exactly the same in principle.
- the fusion of synesthesia and millimeter-wave radar is equivalent to the fusion of multi-link perception.
- the perceptual measurement quantity or the calculation result of the perceptual measurement quantity or the perceptual performance satisfies a preset condition, including at least one of the following:
- the power value of the perception target associated signal component satisfies the first threshold or has the largest power value; for example, the power value of the perception target associated signal component corresponding to the operation result (or other operation result) of the division or conjugate multiplication of the perception measurement quantities on the two receiving antennas/receiving channels satisfies the first threshold;
- the perceived SNR meets the second threshold or the perceived SNR is the largest
- the perceived SINR meets the third threshold or the perceived SINR is the largest
- At least Y sensing targets are detected
- the bitmap corresponding to the sensing target determined based on the detection is consistent with the preset bitmap configured by the network side device;
- the radar cross-sectional area (RCS) of the perceived target meets the first preset condition, or the RCS is the largest; for example, the first preset condition is that the RCS reaches X square meters, where X is a positive real number;
- the spectrum information of the perceived target meets the second preset condition; for example, the range-velocity spectrum of the perceived target meets the second preset condition, where the second preset condition is that the perceived target can be distinguished on the range-velocity spectrum (the amplitude of a point or an area of the range-velocity spectrum reaches a preset value or is the largest); or, the delay-Doppler spectrum of the perceived target meets the second preset condition, where the second preset condition is that the perceived target can be distinguished on the delay-Doppler spectrum (the amplitude of a point or an area of the delay-Doppler spectrum reaches a preset value or is the largest);
- the first parameter of the perceived target satisfies the third preset condition
- the first parameter includes at least one of the following: delay, distance, Doppler, speed, and angle information; for example, the delay of the perceived target satisfies the third preset condition (for example, the delay falls within an interval value); for another example, the distance of the perceived target satisfies the third preset condition (for example, the distance falls within an interval value); for another example, the Doppler of the perceived target satisfies the third preset condition (for example, the Doppler falls within an interval value); for another example, the speed of the perceived target satisfies the third preset condition (for example, the speed falls within an interval value); for another example, the angle information of the perceived target satisfies the third preset condition (for example, the angle information falls within an interval value); an illustrative evaluation sketch is given after this list;
- Y is a positive integer.
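- As an illustration only, a check of whether at least one of the preset conditions above is met could look like the following sketch; the field names and threshold values are assumptions, not values defined in this disclosure.

```python
# Illustrative check of the preset conditions listed above; the threshold
# values and the field names of the measurement result are assumptions.
def event_satisfied(m, thresholds):
    checks = [
        m["target_power_dbm"] >= thresholds["power"],        # first threshold
        m["sensing_snr_db"]   >= thresholds["snr"],          # second threshold
        m["sensing_sinr_db"]  >= thresholds["sinr"],         # third threshold
        m["num_targets"]      >= thresholds["min_targets"],  # at least Y targets detected
        m["rcs_m2"]           >= thresholds["rcs"],          # RCS reaches X square metres
    ]
    return any(checks)   # "at least one of the following" is met

meas = {"target_power_dbm": -70.0, "sensing_snr_db": 14.0,
        "sensing_sinr_db": 11.0, "num_targets": 2, "rcs_m2": 0.8}
th = {"power": -75.0, "snr": 10.0, "sinr": 8.0, "min_targets": 1, "rcs": 1.0}
print(event_satisfied(meas, th))   # True
```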
- the above-mentioned perceptual measurement quantity or the calculation result of the perceptual measurement quantity or the perceptual performance includes at least one of the following:
- A101 power value of the perceived target associated signal component
- it may be the power value of the sensing path.
- the power value of the perception target associated signal component is the power of the signal component that is greatly affected by the perception target in the received first signal, and can be at least one of the following:
- PRB physical resource block
- A1012 a power value calculated by taking the amplitude corresponding to the sample point with the largest amplitude in the inverse Fourier transform (IFFT) result (delay domain) of the frequency domain channel response of the received first signal as the target amplitude, or a power value calculated by taking the amplitudes corresponding to multiple sample points with the largest amplitudes as the target amplitude;
- the power value is calculated by taking the amplitude corresponding to the sampling point with the largest amplitude within a specific time delay range as the target amplitude, or the power value is calculated by taking the amplitude corresponding to multiple sampling points with the largest amplitudes as the target amplitude.
- A1013, a power value calculated by taking the amplitude corresponding to the sample point with the largest amplitude in the Fourier transform (FFT) result (Doppler domain) of the time domain channel response of the received first signal as the target amplitude, or a power value calculated by taking the amplitudes corresponding to multiple sample points with the largest amplitudes as the target amplitudes;
- the power value is calculated by taking the amplitude corresponding to the sample point with the largest amplitude within a specific Doppler range as the target amplitude, or the power value is calculated by taking the amplitude corresponding to multiple sample points with the largest amplitude as the target amplitude.
- A1014, a power value calculated by taking the amplitude corresponding to the sample point with the largest amplitude in the two-dimensional Fourier transform result of the channel response of the received first signal (that is, the delay-Doppler domain result) as the target amplitude, or a power value calculated by taking the amplitudes corresponding to multiple sample points with the largest amplitudes as the target amplitudes (a minimal computation sketch of A1012-A1014 is given after these notes);
- the power value is calculated by taking the amplitude corresponding to the sampling point with the largest amplitude within a specific delay-Doppler range as the target amplitude, or the power value is calculated by taking the amplitude corresponding to multiple sampling points with the largest amplitude as the target amplitude.
- the maximum amplitude may also be an amplitude exceeding a specific threshold value, and the specific threshold value may be indicated by a network-side device or calculated by the terminal according to noise and/or interference power.
- the specific delay/Doppler range is related to the perception requirement, and may be indicated by the network side device, or may be obtained by the terminal according to the perception requirement.
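- The A1012-A1014 style computations above can be illustrated with the following sketch, which assumes the channel response of the received first signal is available as a matrix of OFDM symbols by subcarriers; the array shapes and the toy channel are illustrative assumptions, not part of this disclosure.

```python
import numpy as np

# Minimal sketch: the channel response (OFDM symbols x subcarriers, assumed as
# input) is transformed to the delay domain by an IFFT over subcarriers and to
# the Doppler domain by an FFT over symbols; the power of the strongest sample
# (or samples) is taken as the power of the target-associated signal component.
def delay_doppler_peak_power(H, num_peaks=1):
    delay = np.fft.ifft(H, axis=1)                            # per-symbol delay profile (A1012)
    dd = np.fft.fftshift(np.fft.fft(delay, axis=0), axes=0)   # delay-Doppler map (A1014)
    power = np.abs(dd) ** 2
    strongest = np.sort(power, axis=None)[-num_peaks:]        # largest sample(s)
    return float(np.sum(strongest)), dd

# Toy channel: a single scatterer with some delay and Doppler (assumed values).
n_sym, n_sc = 64, 128
n = np.arange(n_sym)[:, None]
k = np.arange(n_sc)[None, :]
H = np.exp(1j * 2 * np.pi * (0.05 * n - 0.1 * k))   # Doppler 0.05/symbol, delay 0.1/subcarrier
p, _ = delay_doppler_peak_power(H)
print(p)
```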
- the power value of the perceived target associated signal component is the echo power
- the method for obtaining the echo signal power may be at least one of the following options:
- CFAR is performed based on the Doppler one-dimensional image obtained by slow time dimension FFT processing of the echo signal, and the maximum amplitude sample point of CFAR over the threshold is used as the target sample point, and its amplitude is used as the target signal amplitude, as shown in FIG8;
- the delay-Doppler two-dimensional map obtained by 2D-FFT processing of the echo signal is processed into CFAR, and the maximum amplitude sample point of CFAR over the threshold is used as the target sample point, and its amplitude is used as the target signal amplitude;
- the method of determining the target signal amplitude can also be to use the maximum amplitude sample point of CFAR over-threshold and the average of several of its nearest over-threshold sample points as the target signal amplitude.
- the perceived SNR may be a ratio of a power value of a perceived target associated signal component to a noise power.
- A103 a first signal to interference plus noise ratio (SINR);
- the perceived SINR may be a ratio of a power value of a perceived target associated signal component to a sum of powers of noise and interference.
- the method for acquiring the SNR/SINR may be:
- B21: constant false alarm rate (CFAR) detection is performed on the delay one-dimensional graph obtained by fast time dimension FFT processing of the echo signal; the maximum-amplitude sample point exceeding the CFAR threshold is taken as the target sample point, and its amplitude as the target signal amplitude; all sample points in the one-dimensional graph that are more than a preset number of sample points away from the target sample point are taken as interference/noise sample points, and their average amplitude is counted as the interference/noise signal amplitude; finally, the SNR/SINR is calculated from the target signal amplitude and the interference/noise signal amplitude (a minimal sketch of this flow is given after these notes);
- the delay-Doppler two-dimensional graph obtained by 2D-FFT processing of the echo signal is subjected to CFAR.
- the sample point with the maximum amplitude that exceeds the threshold of CFAR is taken as the target sample point, and its amplitude is taken as the target signal amplitude.
- All sample points in the two-dimensional graph that are more than a preset number of sample points away from the target sample point in the fast time dimension and in the slow time dimension are taken as interference/noise sample points, and their average amplitude is counted as the interference/noise signal amplitude.
- the SNR/SINR is calculated based on the target signal amplitude and the interference/noise signal amplitude.
- the target signal amplitude can also be determined by using the maximum amplitude sample point of CFAR over-threshold and the average of several adjacent over-threshold sample points as the target signal amplitude;
- the interference/noise sample points can also be determined by further screening based on the interference/noise sample points determined above, and the screening method is: for the one-dimensional delay graph, remove several sample points near the delay of 0, and use the remaining interference/noise sample points as noise sample points; for the one-dimensional Doppler graph, remove several sample points near the Doppler of 0, and use the remaining interference/noise sample points as interference/noise sample points; for the two-dimensional delay-Doppler graph, remove the interference/noise sample points in the strip range composed of several points near the delay of 0 and the entire Doppler range, and use the remaining noise sample points as interference/noise sample points; for the three-dimensional delay-Doppler-angle graph, remove the interference/noise sample points in the slice range composed of several points near the time dimension 0, the entire Doppler range and the entire angle range, and use the remaining interference/noise sample points as interference/noise sample points.
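- The B21-style flow above can be illustrated with the following sketch; the CFAR-like threshold (a fixed multiple of the median level), the guard width around the target sample and the exclusion width near delay 0 are illustrative simplifications and assumptions, not the exact procedure of this disclosure.

```python
import numpy as np

# Minimal sketch of the B21-style SNR estimation: a delay 1-D profile is
# obtained from the frequency-domain echo samples via an IFFT (playing the
# role of the fast-time transform in the text), a CFAR-like threshold is
# applied, the strongest over-threshold bin gives the target amplitude, and
# bins far away from it (and away from delay 0) give the interference/noise
# amplitude.
def estimate_snr_db(echo_freq_domain, guard=5, zero_excl=3, thr_factor=4.0):
    profile = np.abs(np.fft.ifft(echo_freq_domain))          # delay 1-D profile
    noise_floor = np.median(profile)
    over = np.where(profile > thr_factor * noise_floor)[0]   # CFAR-style over-threshold bins
    if over.size == 0:
        return None                                          # no target detected
    target_bin = over[np.argmax(profile[over])]
    target_amp = profile[target_bin]
    idx = np.arange(profile.size)
    noise_mask = (np.abs(idx - target_bin) > guard) & (idx >= zero_excl)
    noise_amp = np.mean(profile[noise_mask])
    return 20.0 * np.log10(target_amp / noise_amp)

rng = np.random.default_rng(0)
sig = np.exp(-1j * 2 * np.pi * 0.2 * np.arange(256))         # single echo (assumed delay)
echo = sig + 0.05 * (rng.standard_normal(256) + 1j * rng.standard_normal(256))
print(estimate_snr_db(echo))
```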
- A104, whether a perception target exists;
- This may include at least one of the following:
- A105, the number of perception targets detected (the number of targets perceived to exist);
- This may include at least one of the following:
- for example, the number of targets sensed within a preset range of distance or delay;
- A104 and A105 may be notified to the terminal by other devices (for example, other terminals, access network equipment, or core network equipment).
- the method for judging whether a perception target exists may be, for example: whether there are sample points whose amplitudes exceed a specific threshold in the delay/Doppler one-dimensional or two-dimensional graph; if so, a perception target is considered to be detected; and the number of sample points whose amplitudes exceed the specific threshold in the delay/Doppler one-dimensional or two-dimensional graph is taken as the number of perception targets.
- A106 radar cross-sectional area (RCS) information of the perceived target
- the RCS information may be the RCS information of a single perception target or the RCS information of multiple perception targets.
- the spectrum information may include at least one of the following: delay power spectrum, Doppler power spectrum, delay/distance-Doppler/velocity spectrum, angle power spectrum, delay/distance-angle spectrum, Doppler/velocity-angle spectrum, delay/distance-Doppler/velocity-angle spectrum.
- A109 the distance of at least one perceived target
- the sensing capability information is used to indicate the hardware and software capabilities of the sensing node in order to support the corresponding sensing service.
- the sensing capability information includes at least one of the following:
- whether sensor-type perception is supported, and if sensor-type perception is supported, at least one of the following: the supported sensor types; the number of sensors corresponding to each supported sensor type; and the perception capability information corresponding to each supported sensor.
- the sensor's sensing capability information includes at least one of the following:
- Supported sensing service types including at least one of the following:
- radar detection services further including: radar speed measurement, radar distance measurement, radar angle measurement, and radar imaging;
- 3D reconstruction services further including: terrain reconstruction, building surface reconstruction;
- weather and/or air quality detection services further including: rainfall detection, humidity detection, particulate matter (PM2.5/PM10) detection, and snowfall detection;
- health monitoring services further including: heart rate monitoring, breathing detection;
- Whether to support motion recognition services further including: gesture recognition, posture recognition, and intrusion detection.
- the QoS of the measurement quantity includes at least one of the following:
- Perception resolution including at least one of the following: ranging (or delay) resolution, velocity (or Doppler) resolution, angle (azimuth, pitch) resolution, imaging resolution, acceleration (X/Y/Z directions) resolution, angular velocity (around X/Y/Z axes) resolution;
- Perception accuracy including at least one of the following: ranging (or delay) accuracy, velocity (or Doppler) accuracy, angle (azimuth, pitch) accuracy, acceleration (X/Y/Z directions) accuracy, angular velocity (around X/Y/Z axes) accuracy;
- the sensing range includes at least one of the following: distance (or delay) measurement range, velocity (or Doppler) measurement range, acceleration (X/Y/Z directions) measurement range, angular velocity (around X/Y/Z axes) measurement range, and imaging range;
- Perception latency (the time interval from the sending of the first signal to obtaining the perception result, or the time interval from the initiation of the perception demand to obtaining the perception result);
- Perception update rate (the time interval between two consecutive perception operations and the acquisition of perception results);
- Detection probability the probability of correctly detecting the perceived object when it exists
- False alarm probability the probability of erroneously detecting a perceived target when the perceived object does not exist
- Coverage The spatial extent of the sensed target/imaging area that meets at least one of the above performance requirements.
- first-level measurement quantity (received signal/original channel information), the first-level measurement quantity includes: at least one of the received signal/channel response complex result, amplitude/phase, I path/Q path and their operation results;
- the operations include addition, subtraction, multiplication and division, matrix addition, subtraction and multiplication, matrix transposition, trigonometric relationship operations, square root operations and power operations, as well as at least one of the threshold detection results and maximum/minimum value extraction results of the above operation results; the operations also include fast Fourier transform (Fast Fourier Transform, FFT)/inverse fast Fourier transform (Inverse Fast Fourier Transform, IFFT), discrete Fourier transform (Discrete Fourier Transform, DFT)/inverse discrete Fourier transform (Inverse Discrete Fourier Transform, IDFT), 2D-FFT, 3D-FFT, matched filtering, autocorrelation operation, wavelet transform and digital filtering, as well as at least one of the threshold detection results and maximum/minimum value extraction results of the above operation results;
- Second-level measurement quantity (basic measurement quantity), which may include: delay, Doppler, angle, intensity, and at least one of their multi-dimensional combination representations;
- Level 3 measurement (basic attributes/states), which may include at least one of: distance, speed, orientation, spatial position, and acceleration;
- Level 4 measurement may include at least one of the following: target presence, trajectory, movement, expression, vital signs, quantity, imaging results, weather, air quality, shape, material, and composition.
- the above-mentioned perception measurement quantity also includes label information corresponding to the perception measurement quantity, and the label information may include at least one of the following (a sketch of a report structure carrying these fields is given after this list):
- sensing service information such as sensing service identification (ID).
- the purpose of the measurement for example, communication, perception, synaesthesia, etc.;
- Perceived node information such as terminal ID, node location, device orientation, etc.
- Perceiving link information such as link sequence number, transmitting and receiving node identification, etc.
- the sensing link information includes: an identifier of a receiving antenna or a receiving channel. If it is a sensing measurement of a single receiving antenna or a receiving channel, the identifier is the identifier of the receiving antenna or the receiving channel; if it is a result of a division or conjugate multiplication of two receiving antennas or receiving channels, the identifier is the identifier of the two receiving antennas or receiving channels, and the identifier of the division or conjugate multiplication.
- the form of the measurement quantity such as an amplitude value, a phase value, a complex value combining amplitude and phase
- the resource type of the measurement quantity such as a time domain measurement result, a frequency domain resource measurement result
- Measurement indicator information such as SNR and perceived SNR.
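- As an illustration of how the four levels of measurement quantities and the label information above might be grouped in a single report, the following hypothetical structure is sketched; all field names are assumptions, not fields defined in this disclosure.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List, Optional

# Hypothetical container for a perception measurement report, grouping the
# four levels of synaesthesia-class measurement quantities and the label
# information listed above. Field names are illustrative assumptions.
@dataclass
class PerceptionMeasurementReport:
    # label information
    sensing_service_id: Optional[str] = None
    measurement_purpose: Optional[str] = None                  # e.g. "communication", "perception"
    node_info: Dict[str, Any] = field(default_factory=dict)    # terminal ID, location, orientation
    link_info: Dict[str, Any] = field(default_factory=dict)    # link number, Tx/Rx node IDs, Rx antenna ID
    # measurement quantities by level
    level1: List[complex] = field(default_factory=list)        # received signal / channel response samples
    level2: Dict[str, float] = field(default_factory=dict)     # delay, Doppler, angle, intensity
    level3: Dict[str, float] = field(default_factory=dict)     # distance, speed, orientation, position, acceleration
    level4: Dict[str, Any] = field(default_factory=dict)       # target presence, trajectory, imaging result, ...
    sensor_measurements: Dict[str, Any] = field(default_factory=dict)  # camera/lidar/IMU results

report = PerceptionMeasurementReport(
    sensing_service_id="svc-1", measurement_purpose="perception",
    level2={"delay_ns": 120.0, "doppler_hz": 310.0},
    level3={"distance_m": 18.0, "speed_mps": 13.3},
)
print(report.level3)
```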
- Mn Neighboring cell measurement result, without considering any offset
- Ocn Neighboring cell-level specific offset
- Mp SpCell (primary serving cell) measurement result, without considering any offset
- Ocp SpCell cell-level specific offset
- Hys hysteresis parameter of the event
- Off Offset parameter of the event.
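- For reference, with the parameters listed above, an entering/leaving condition in the style of the 3GPP A3 event can be sketched as follows (measurement-object-specific offsets are omitted because they are not listed above; this is an illustrative form, not a restatement of the exact condition used by this method):

```latex
\text{entering:}\quad M_n + O_{cn} - \mathrm{Hys} > M_p + O_{cp} + \mathrm{Off}
\qquad
\text{leaving:}\quad M_n + O_{cn} + \mathrm{Hys} < M_p + O_{cp} + \mathrm{Off}
```

- In this sketch, the entering condition would need to hold for the configured timeToTrigger duration before the event is considered triggered, consistent with the timeToTrigger parameter described next.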
- the base station configures the timeToTrigger parameter for each event in CondTriggerConfig.
- the UE uses the cells that meet the conditions as trigger cells and selects one of the trigger cells to perform conditional reconfiguration.
- an embodiment of the present application provides a mobility management device, which is applied to a first node.
- the device 1000 includes:
- a measurement module 1001 is used for the first node to perform a sensing measurement and obtain a measurement result
- the first sending module 1002 is used to send a measurement report to the second node, where the measurement report includes the measurement result, and the measurement result is used to determine whether to perform perception switching; wherein the measurement result includes at least one of the following: a measurement quantity perceived by the sensor; a fused measurement result, where the fused measurement result is obtained by fusion processing of a measurement quantity perceived by the sensor and a measurement quantity perceived by the synaesthesia class.
- the measurement result also includes a measurement of synaesthesia perception.
- the measurement of synaesthesia perception includes at least one of the following:
- the first-level measurement quantity includes at least one of the following: the received signal/channel response complex result, amplitude/phase, I path/Q path, and their operation results;
- the second-level measurement quantity includes: at least one of delay, Doppler, angle, and intensity;
- the third-level measurement quantity includes: at least one of distance, speed, direction, spatial position, and acceleration;
- the fourth-level measurement quantity includes at least one of the following: whether the target exists, trajectory, action, expression, vital sign, quantity, imaging result, weather, air quality, shape, material, and composition.
- the device further includes:
- a first receiving module configured to receive a switching command, wherein the switching command is used to instruct the first node to perform sensing as a target sensing node;
- a perception module is used to perform perception according to the switching command.
- the perception mode of the target perception node is the same as or different from the perception mode of the source perception node, and the perception mode includes: at least one of sensor-type perception and synaesthesia-type perception, and the synaesthesia-type perception includes at least one of downlink perception, uplink perception, terminal echo perception, base station echo perception, air interface perception between base stations, and air interface perception between terminals.
- the device further includes:
- the second receiving module is used to receive measurement configuration information sent by the second node, where the measurement configuration information is used to configure the first node to perform perception measurement.
- the measurement configuration information includes at least one of the following:
- Measurement events where the measurement events include at least one of the following: a sensed measurement quantity or a calculation result of a sensed measurement quantity or a sensed performance meets a preset condition; a state of a sensed target changes; or a position of a sensed node involved in the sensed event changes.
- the measurement quantity sensed by the sensor includes at least one of the following:
- Measurement quantities including at least one of the following: target presence, trajectory, movement, expression, vital signs, quantity, imaging results, weather, air quality, shape, material, and composition.
- the laser radar-related measurement includes at least one of the following: laser radar point cloud data, angle and/or distance of a target obtained based on the laser radar point cloud data, visual features of a target identified from the laser radar point cloud data, and the number of targets identified from the laser radar point cloud data;
- the vision-related measurement includes at least one of the following: a visual image, the luminosity of an image pixel, the RGB value of an image pixel, a visual feature of an object identified from the image, an angle and/or distance of an object identified from the image, and the number of objects identified from the image;
- the radar-related measurements include at least one of the following: radar point cloud, distance, speed, and/or angle of identified targets, radar imaging, and number of targets;
- the measurement quantity related to the inertial measurement unit includes at least one of the following: acceleration, velocity, and angular velocity.
- the second sending module is used to send first information to the second node, where the first information includes at least one of the following: sensing node information; sensing capability information of the sensing node; and sensing authority information.
- the first node includes a terminal, and the second node includes a source base station and/or a perception function network element, or the first node includes: a source base station and/or a candidate target base station, and the second node includes a perception function network element; or the first node includes a source terminal and/or a candidate target terminal, and the second node includes a perception function network element; or the first node includes a candidate target base station, and the second node includes a source base station; or the first node includes: an application function and/or an application server, and the second node includes a source base station and/or a perception function network element.
- the device provided in the embodiment of the present application can implement each process implemented in the method embodiment of Figure 4 and achieve the same technical effect. To avoid repetition, it will not be repeated here.
- an embodiment of the present application provides a mobility management device, which is applied to a second node.
- the device 1100 includes:
- the third receiving module 1101 is configured to receive a measurement report sent by the first node, where the measurement report includes a measurement result;
- the third sending module 1102 is configured to send first request information when the second node determines, according to the measurement result, to perform perception switching, where the first request information is used to request the third node to perform perception;
- the fourth receiving module 1103 is used to receive first response information sent by the third node, where the first response information is used to indicate that the third node agrees to perform the sensing;
- the fourth sending module 1104 is used to send a switching command to the third node, where the switching command is used to instruct the third node to perform sensing as a target sensing node;
- the measurement result includes at least one of the following: a measurement quantity perceived by a sensor; and a fused measurement result, wherein the fused measurement result is obtained by fusion processing of the measurement quantity perceived by the sensor and the measurement quantity obtained by integrated sensing and communication (ISAC)-based sensing.
- the measurement result also includes a measurement quantity of ISAC-based sensing.
- the measurement quantities of ISAC-based sensing include at least one of the following:
- the first-level measurement quantity includes: at least one of a received signal/channel response complex result, amplitude/phase, I path/Q path and a calculation result thereof;
- the second-level measurement quantity includes: at least one of delay, Doppler, angle, and intensity;
- the third-level measurement quantity includes: at least one of distance, speed, direction, spatial position, and acceleration;
- the fourth-level measurement quantity includes at least one of the following: whether the target exists, trajectory, action, expression, vital sign, quantity, imaging result, weather, air quality, shape, material, and composition.
- the perception mode of the target perception node is the same as or different from that of the source perception node; the perception mode includes at least one of sensor-based perception and ISAC-based sensing, and the ISAC-based sensing includes at least one of downlink sensing, uplink sensing, terminal echo sensing, base station echo sensing, air interface sensing between base stations, and air interface sensing between terminals.
- the device further includes:
- An acquisition module, used for acquiring first information of a sensing node;
- the first information includes at least one of the following: information of the sensing node; sensing capability information of the sensing node; and sensing authority information.
- the device further includes:
- the fifth sending module is used to send measurement configuration information to the first node, where the measurement configuration information is used to configure the first node to perform perception measurement.
- the measurement configuration information includes at least one of the following:
- Measurement events, where the measurement events include at least one of the following: a sensed measurement quantity, or a calculation result of a sensed measurement quantity, or a sensing performance meets a preset condition; a state of a sensing target changes; or a position of a sensing node participating in the sensing changes.
- the measurement quantity sensed by the sensor includes at least one of the following:
- Other measurement quantities, including at least one of the following: whether the target exists, trajectory, action, expression, vital signs, quantity, imaging results, weather, air quality, shape, material, and composition.
- the laser radar-related measurement includes at least one of the following: laser radar point cloud data, angle and/or distance of a target obtained based on the laser radar point cloud data, visual features of a target identified from the laser radar point cloud data, and the number of targets identified from the laser radar point cloud data;
- the vision-related measurement includes at least one of the following: a visual image, the luminosity of an image pixel, the RGB value of an image pixel, a visual feature of an object identified from the image, an angle and/or distance of an object identified from the image, and the number of objects identified from the image;
- the radar-related measurements include at least one of the following: radar point cloud, distance, speed, and/or angle of identified targets, radar imaging, and number of targets;
- the measurement quantity related to the inertial measurement unit includes at least one of the following: acceleration, velocity, and angular velocity.
- the first node includes a terminal, and the second node includes a source base station and/or a perception function network element, or the first node includes: a source base station and/or a candidate target base station, and the second node includes a perception function network element; or the first node includes a source terminal and/or a candidate target terminal, and the second node includes a perception function network element; or the first node includes a candidate target base station, and the second node includes a source base station; or the first node includes: an application function and/or an application server, and the second node includes a source base station and/or a perception function network element.
- the device provided in the embodiment of the present application can implement each process implemented in the method embodiment of Figure 5 and achieve the same technical effect. To avoid repetition, it will not be repeated here.
- FIG. 12 is a schematic diagram of the hardware structure of a terminal implementing the embodiment of the present application.
- the terminal 1200 includes but is not limited to at least some of the following components: a radio frequency unit 1201, a network module 1202, an audio output unit 1203, an input unit 1204, a sensor 1205, a display unit 1206, a user input unit 1207, an interface unit 1208, a memory 1209 and a processor 1210.
- the terminal 1200 may also include a power source (such as a battery) for supplying power to each component, and the power source may be logically connected to the processor 1210 through a power management system, so as to implement functions such as charging, discharging, and power consumption management through the power management system.
- the terminal structure shown in FIG12 does not constitute a limitation on the terminal, and the terminal may include more or fewer components than shown in the figure, or combine certain components, or arrange components differently, which will not be described in detail here.
- after receiving downlink data from the network side device, the RF unit 1201 can transmit the data to the processor 1210 for processing; in addition, the RF unit 1201 can send uplink data to the network side device.
- the RF unit 1201 includes but is not limited to an antenna, an amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, etc.
- the memory 1209 can be used to store software programs or instructions and various data.
- the memory 1209 may mainly include a first storage area for storing programs or instructions and a second storage area for storing data, wherein the first storage area may store an operating system, an application program or instruction required for at least one function (such as a sound playback function, an image playback function, etc.), etc.
- the memory 1209 may include a volatile memory or a non-volatile memory, or the memory 1209 may include both volatile and non-volatile memories.
- the non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory.
- the volatile memory may be a random access memory (RAM), a static random access memory (SRAM), a dynamic random access memory (DRAM), a synchronous dynamic random access memory (SDRAM), a double data rate synchronous dynamic random access memory (DDRSDRAM), an enhanced synchronous dynamic random access memory (ESDRAM), a synchronous link dynamic random access memory (SLDRAM) and a direct memory bus random access memory (DRRAM).
- the memory 1209 in the embodiment of the present application includes but is not limited to these and any other suitable types of memory.
- the processor 1210 may include one or more processing units; optionally, the processor 1210 integrates an application processor and a modem processor, wherein the application processor mainly processes operations related to an operating system, a user interface, and application programs, and the modem processor mainly processes wireless communication signals, such as a baseband processor. It is understandable that the modem processor may not be integrated into the processor 1210.
- an embodiment of the present application further provides a network side device 1300, including a processor 1301 and a memory 1302, where the memory 1302 stores a program or instruction that can be executed on the processor 1301.
- when the network side device includes a source base station, a candidate target base station, an application function and/or an application server, the program or instruction, when executed by the processor 1301, implements the various steps of the method embodiment of Figure 4 above and achieves the same technical effect; when the network side device includes a source base station and/or a perception function network element, the program or instruction, when executed by the processor 1301, implements the various steps of the method embodiment of Figure 5 above and achieves the same technical effect. To avoid repetition, details are not repeated here.
- An embodiment of the present application also provides a readable storage medium, on which a program or instruction is stored. When the program or instruction is executed by a processor, the method of Figure 4 or Figure 5 and the various processes of the above-mentioned embodiments are implemented, and the same technical effect can be achieved. To avoid repetition, details are not repeated here.
- the processor is the processor in the communication device described in the above embodiment.
- the readable storage medium includes a computer readable storage medium, such as a computer read-only memory ROM, a random access memory RAM, a magnetic disk or an optical disk.
- An embodiment of the present application further provides a chip, which includes a processor and a communication interface, wherein the communication interface is coupled to the processor, and the processor is used to run programs or instructions to implement the various processes shown in Figure 4 or Figure 5 and the various method embodiments mentioned above, and can achieve the same technical effect. To avoid repetition, it will not be repeated here.
- the chip mentioned in the embodiments of the present application can also be called a system-level chip, a system chip, a chip system or a system-on-chip chip, etc.
- An embodiment of the present application further provides a communication system, which includes a terminal and a network-side device.
- the terminal is used to execute the various processes as shown in Figure 4 and the various method embodiments described above
- the network-side device is used to execute the various processes as shown in Figure 4 or Figure 5 and the various method embodiments described above, and can achieve the same technical effect. In order to avoid repetition, it will not be repeated here.
- the technical solution of the present application can be embodied in the form of a computer software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk), and includes a number of instructions for a terminal (which can be a mobile phone, computer, server, air conditioner, or network side device, etc.) to execute the methods described in each embodiment of the present application.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
本申请公开了一种移动性管理方法、装置、通信设备及可读存储介质,该方法包括:第一节点进行感知测量,得到测量结果;第一节点向第二节点发送测量报告,测量报告包括测量结果,测量结果用于确定是否进行感知切换;其中,测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
Description
相关申请的交叉引用
本申请主张在2022年12月23日在中国提交的中国专利申请No.202211667944.5的优先权,其全部内容通过引用包含于此。
本申请属于通信技术领域,具体涉及一种移动性管理方法、装置、通信设备及可读存储介质。
未来移动通信系统例如超第五代移动通信系统(Beyond fifth-generation,B5G)系统或第六代移动通信技术(sixth-generation,6G)系统除了具备通信能力外,还将具备感知能力。感知能力,即具备感知能力的一个或多个设备,能够通过无线信号的发送和接收,来感知目标物体的方位、距离、速度等信息,或者对目标物体、事件或环境等进行检测、跟踪、识别、成像等。
在通信感知一体化中,尚无基于传感器感知的感知移动性管理流程的具体实现方法,从而无法保持感知业务的连续性。
发明内容
本申请实施例提供一种移动性管理方法、装置、通信设备及可读存储介质,解决如何保持感知业务的连续性的问题。
第一方面,提供一种移动性管理方法,包括:
第一节点进行感知测量,得到测量结果;
所述第一节点向第二节点发送测量报告,所述测量报告包括所述测量结果,所述测量结果用于确定是否进行感知切换;
其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
第二方面,提供一种移动性管理方法,包括:
第二节点接收第一节点发送的测量报告,所述测量报告包括测量结果;
所述第二节点发送第一请求信息,所述第一请求信息用于请求第三节点进行感知;
所述第二节点接收所述第三节点发送的第一应答信息,所述第一应答信息用于指示所述第三节点同意执行感知;
所述第二节点向所述第三节点发送切换命令,所述切换命令用于指示所述第三节点作为目标感知节点执行感知;
其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
第三方面,提供一种移动性管理装置,包括:
测量模块,用于第一节点进行感知测量,得到测量结果;
第一发送模块,用于向第二节点发送测量报告,所述测量报告包括所述测量结果,所述测量结果用于确定是否进行感知切换;其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
第四方面,提供一种移动性管理装置,包括:
第三接收模块,用于接收第一节点发送的测量报告,所述测量报告包括测量结果;
第三发送模块,用于向第三节点发送第一请求信息,所述第一请求信息用于请求所述第三节点进行感知;
第三接收模块,用于接收第三节点发送的第一应答信息,所述第一应答信息用于指示所述第三节点同意执行感知;
第四发送模块,用于向所述第三节点发送切换命令,所述切换命令用于指示所述第三节点作为目标感知节点执行感知;
其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
第五方面,提供了一种通信设备,包括:处理器,存储器及存储在所述存储器上并可在所述处理器上运行的程序或指令,所述程序或指令被所述处理器执行时实现如第一方面或第二方面所述的方法的步骤。
第六方面,提供了一种可读存储介质,所述可读存储介质上存储程序或指令,所述程序或指令被处理器执行时实现如第一方面或第二方面所述的方法的步骤。
第七方面,提供了一种芯片,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现如第一方面或第二方面所述法的步骤。
第八方面,提供了一种计算机程序/程序产品,所述计算机程序/程序产品被存储在非瞬态的存储介质中,所述计算机程序/程序产品被至少一个处理器执行以实现如第一方面或第二方面所述的方法的步骤。
第九方面,提供一种通信系统,所述通信系统包括终端与网络侧设备,所述终端用于执行如第一方面所述的方法的步骤,所述网络侧设备用于执行如第一方面或第二方面所述的方法的步骤。
在本申请的实施例中,第一节点进行感知测量,得到测量结果;所述第一节点向第二节点发送测量报告,所述测量报告包括所述测量结果,所述测量结果用于确定是否进行感知切换,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的,实现
基于传感器感知的感知移动性管理,可以确保感知业务的连续性,提升感知业务的用户体验。
图1是通信感知一体化的示意图;
图2是切换流程的示意图;
图3是感知分类的示意图;
图4是本申请的一种实施例提供的一种移动性管理方法的示意图;
图5是本申请的另一种实施例提供的一种移动性管理方法的示意图;
图6是源基站和UE进行下行感知,切换为目标基站和UE进行下行感知的示意图;
图7是源基站和源UE进行上行感知,切换为目标基站和目标UE进行上行感知的示意图;
图8是本申请的又一种实施例提供的一种移动性管理方法的示意图;
图9是一维图SNR计算示意图;
图10是本申请的一种实施例提供的一种移动性管理装置的结构图;
图11是本申请的另一种实施例提供的一种移动性管理装置的结构图;
图12是本申请实施例提供的终端的示意图;
图13是本申请实施例提供的网络侧设备的示意图。
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员所获得的所有其他实施例,都属于本申请保护的范围。
本申请的说明书和权利要求书中的术语“第一”、“第二”等是用于区别类似的对象,而不用于描述特定的顺序或先后次序。应该理解这样使用的术语在适当情况下可以互换,以便本申请的实施例能够以除了在这里图示或描述的那些以外的顺序实施,且“第一”、“第二”所区别的对象通常为一类,并不限定对象的个数,例如第一对象可以是一个,也可以是多个。此外,说明书以及权利要求中“和/或”表示所连接对象的至少其中之一,字符“/”一般表示前后关联对象是一种“或”的关系。
值得指出的是,本申请实施例所描述的技术不限于长期演进型(Long Term Evolution,LTE)/LTE的演进(LTE-Advanced,LTE-A)系统,还可用于其他无线通信系统,诸如码分多址(Code Division Multiple Access,CDMA)、时分多址(Time Division Multiple Access,TDMA)、频分多址(Frequency Division Multiple Access,FDMA)、正交频分多址(Orthogonal Frequency Division Multiple Access,OFDMA)、单载波频分多址(Single-carrier Frequency Division Multiple Access,SC-FDMA)和其他系统。本申请实施例中的术语“系统”和“网络”
常被可互换地使用,所描述的技术既可用于以上提及的系统和无线电技术,也可用于其他系统和无线电技术。以下描述出于示例目的描述了新空口(New Radio,NR)系统,并且在以下大部分描述中使用NR术语,但是这些技术也可应用于NR系统应用以外的应用,如第6代(6th Generation,6G)通信系统。
为了便于理解本申请实施例,下面先介绍以下技术点:
一、关于通信感知一体化。
未来移动通信系统例如超5代移动通信系统或第6代移动通信系统除了具备通信能力外,还将具备感知能力。感知能力,即具备感知能力的一个或多个设备,能够通过无线信号的发送和接收,来感知目标物体的方位、距离、速度等信息,或者对目标物体、事件或环境等进行检测、跟踪、识别、成像等。未来随着毫米波、太赫兹等具备高频段大带宽能力的小基站在6G网络的部署,感知的分辨率相比厘米波将明显提升,从而使得6G网络能够提供更精细的感知服务。典型的感知功能与应用场景如表1所示。
表1:典型的感知功能与应用场景。
通信感知一体化(简称通感一体化)即在同一系统中通过频谱共享与硬件共享,实现通信、感知功能一体化设计,系统在进行信息传递的同时,能够感知方位、距离、速度等信息,对目标设备或事件进行检测、跟踪、识别,通信系统与感知系统相辅相成,实现整体性能上的提升并带来更好的服务体验。
通信与雷达的一体化属于典型的通信感知一体化(通信感知融合)应用,在过去,雷达系统与通信系统由于研究对象与关注重点不同而被严格地区分,大部分场景下两系统被独立的研究。事实上,雷达与通信系统同样作为信息发送、获取、处理和交换的典型方式,不论工作原理还是系统架构以及频段上存在着不少相似之处。通信与雷达一体化的设计具有较大的可行性,主要体现在以下几个方面:首先,通信系统与感知系统均基于电磁波理论,利用电磁波的发射和接收来完成信息的获取和传递;其次,通信系统与感知系统均具备天线、发送端、接收端、信号处理器等结构,在硬件资源上有很大重叠;随着技术的发
展,两者在工作频段上也有越来越多的重合;另外,在信号调制与接收检测、波形设计等关键技术上存在相似性。通信与雷达系统融合能够带来许多优势,例如节约成本、减小尺寸、降低功耗、提升频谱效率、减小互干扰等,从而提升系统整体性能。
根据第一信号发送节点和接收节点的不同,分为6种基本感知方式,如图1所示,具体包括:
(1)基站回波感知。在这种感知方式下,基站A发送第一信号,并通过接收该第一信号的回波来进行感知测量。
(2)基站间空口感知。基站B接收基站A发送的第一信号,进行感知测量。
(3)上行空口感知。基站A接收终端A发送的第一信号,进行感知测量。
(4)下行空口感知。终端B接收基站B发送的第一信号,进行感知测量。
(5)终端回波感知。终端A发送第一信号,并通过接收该第一信号的回波来进行感知测量。
(6)终端间旁链路(Sidelink,SL)感知。终端B接收终端A发送的第一信号,进行感知测量。
值得注意的是,图1中每种感知方式都以一个第一信号发送节点和一个第一信号接收节点作为例子,实际系统中,可以根据不同的感知用例和感知需求选择一种或多种不同的感知方式,且每种感知方式的发送节点和接收节点可以有一个或多个。图1中的感知目标以人和车作为示例,且假设人和车均没有携带或安装信号收/发设备,实际场景的感知目标更加丰富。
本文中的第一信号包括参考信号、同步信号、数据信号和专用信号的至少一项。通过接收和/或发送第一信号可以支持感知业务,例如通过接收和/或发送第一信号可得到感知测量量或者感知结果,感知结果是指满足感知需求的结果,例如:感知目标的形状,二维(2-dimention,2D)/3D环境重构,空间位置,朝向,位移,移动速度,加速度;雷达类感知的对目标对象的测速测距测角/成像;人/物是否存在;感知目标如人的动作,手势,呼吸频率,心跳频率,睡眠质量等。
所述第一信号可以是不包含传输信息的信号,如现有的LTE/NR同步和参考信号,包括同步信号和物理广播信道(Synchronization Signal and PBCH block,SSB)信号、信道状态信息参考信号(Channel State Information-Reference Signal,CSI-RS)、解调参考信号(Demodulation Reference Signal,DMRS)、信道探测参考信号(Sounding Reference Signal,SRS)、定位参考信号(Positioning Reference Signal,PRS)、相位追踪参考信号(Phase Tracking Reference Signal,PTRS)等;也可以是雷达常用的单频连续波(Continuous Wave,CW)、调频连续波(Frequency Modulated CW,FMCW),以及超宽带高斯脉冲等;还可以是新设计的专用信号,具有良好的相关特性和低峰均功率比,或者新设计的通感一体化信号,既承载一定信息,同时具有较好的感知性能。例如,该新信号为至少一种专用第一信号/参考信号,和至少一种通信信号在时域和/或频域上拼接/组合/叠加而成。
二:关于切换。
切换是连接状态下终端的移动触发,切换的基本目标:指示终端可与比当前服务小区信道质量更好的小区通信;为终端提供连续的无中断的通信服务,有效防止由于小区的信号质量变差造成的掉话。
5G中的切换的流程包含以下几个步骤(与长期演进(long Term Evolution,LTE)基本类似):
步骤1:触发测量:在终端完成接入或切换成功后,基站会通过无线资源控制(Radio Resource Control,RRC)连接重配置(Connection Reconfiguration)向终端下发测量控制信息。此外,若测量配置信息有更新,基站也会通过RRC连接重配置消息下发更新的测量控制信息。测量控制信息中最主要的就是下发测量对象、测量报告(Measurement Report,MR)配置、测量事件等。
表2:测量事件。
表3:测量事件具体判定准则。
其中,Ms表示服务小区的测量结果;
Mn表示邻区的测量结果;
TimeToTrig表示持续满足事件进入条件的时长,即时间迟滞;
Off表示测量结果的偏置,步长0.5db;
Hys表示测量结果的幅度迟滞,步长0.5db;
Ofs表示服务小区的频率偏置;
Ofn表示邻区的频率偏置;
Ocs表示服务小区特定偏置(CellIndividualOffset,CIO);
Ocn表示系统内邻区的CIO;
Thresh即对应事件配置的门限值。
5G NR中的事件取值范围与LTE有所区别。取值(Range)即对应测量报告中上报的数值,而单位(Value)则对应其真实的数值。以RSRP为例,LTE中的Value值范围为-140~-44dBm,而NR为-156~-31dBm,NR允许更高及更低的接收电平。LTE实际值为MR上报值-140,而NR则为MR上报值-156。假设MR中上报的参考信号接收功率(Reference Signal Receiving Power,RSRP)为50,则实际值为50-140=-90dBm。而NR中RSRP的实际值则为50-156=-106dBm。
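为便于理解上述上报值与实际值之间的换算关系，下面给出一段示意性的Python代码（其中的函数名与示例取值仅为说明用途的假设，具体换算关系以相应协议规定为准）：

```python
# 示意代码：将测量报告中上报的RSRP整数值换算为实际接收电平（dBm）
def rsrp_report_to_dbm(reported: int, system: str = "NR") -> float:
    """把MR中上报的RSRP取值映射为实际接收电平（dBm）。"""
    offset = 156 if system.upper() == "NR" else 140  # NR按-156偏置，LTE按-140偏置
    return reported - offset

print(rsrp_report_to_dbm(50, "LTE"))  # -90 dBm
print(rsrp_report_to_dbm(50, "NR"))   # -106 dBm
```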
步骤2:执行测量:根据测量控制的相关配置,终端监测无线信道,但满足测量报告条件时,通过事件报告基站。测量报告数量/事件的触发可以是RSRP、参考信号接收质量(Reference Signal Received Quality,RSRQ)或信号与干扰加噪声比(Signal to Interference plus Noise Ratio,SINR)。
步骤3:目标判决:基站以测量为基础资源,按照先上报先处理的方式选择切换小区,并选择相应的切换策略(如切换和重定向)。
步骤4:切换执行:源基站向目标基站进行资源的申请与分配,而后源基站进行切换执行判决,将切换命令下发给终端,终端执行切换和数据转发。
三、关于感知的分类。
根据感知节点(指具有感知能力的设备,例如传感器等)与感知目标或感知对象是否分离,感知可以分为非接触式感知和接触式感知。接触式感知需要在感知目标上安装温度计/湿度计/气压计/陀螺仪/加速度计/重力传感器等传感器来采集特定感知信息;非接触式感知需要光、声或无线电波等作为感知传导的媒介,因此可以分为光感知(例如通过可见光、红外线等作为媒介进行感知的普通摄像头,红外摄像头,激光雷达等)、声感知(例如通过机械波、超声波进行感知的声呐)和无线电波感知(例如毫米波雷达等)。图1中示意了不同的感知类型。
另外一种分类方式是将感知分为非射频感知和射频感知,其中非射频感知包括接触式感知,光感知和声感知。非射频感知在某些方面存在一定局限性,例如基于可见光的普通摄像头系统往往依赖于光线条件或视距条件,尤其在室外场景受天气情况影响严重,且存在隐私安全问题等。其他特定传感器需要安装在感知目标上,存在部署维护与设备供电的问题,在便利性和成本方面有一定局限性。射频感知即无线电波感知。无线电波信号在传
播过程中受到周围环境的影响,信号的幅度、相位等特征发生了变化,接收端通过对无线信号特征变化的分析,不仅能够得到无线信号承载的发送端信息,还能够提取出反映传播环境特征的信息。射频感知支持感知节点与感知目标的分离,相比非射频感知,其受天气和光线条件影响较小,感知范围相对更大,灵活性更高,在隐私与安全方面相比摄像头更具优势。射频感知的典型应用是雷达,包括军事领域应用的雷达和商用的毫米波雷达等。
如图3所示,通信感知一体化中的感知属于射频感知。从面向6G的角度,本申请将各种类型的感知分为通信感知一体化类感知(下文简称,通感类感知)和传感器类感知,其中传感器类感知包括除通感类感知以外的其他所有感知方式。通感类感知与传感器类感知互为补充。6G未来将综合利用包括通感类感知和传感器类感知在内的不同类型的感知来采集各类数据,以实现对物理世界的更便捷更高效更精确的数字化构建。
本申请中的感知功能(Sensing Function)网元,也可以称为感知网元或者感知网络功能,可以处于无线接入网(Radio Access Network,RAN)侧或核心网侧,即可以是核心网和/或RAN中负责感知请求处理、感知资源调度、感知信息交互、感知数据处理等至少一项功能的网络节点,比如可以是基于5G网络中接入和移动管理功能(Access and Mobility Management Function,AMF)或位置管理功能(Location Management Function,LMF)升级,也可以是其他网络节点或新定义的网络节点,具体的,感知功能网元的功能特性可以包括以下至少一项:
(1)与无线信号发送设备和/或无线信号测量设备(包括目标终端或者目标终端的服务基站或者目标区域关联的基站)进行目标信息交互,其中,目标信息包括感知处理请求,感知能力,感知辅助数据,感知测量量类型,感知资源配置信息等,以获得无线信号测量设备发送目标感知结果或感知测量量(上行测量量或下行测量量)的值;其中,无线信号也可以称作第一信号。
(2)根据感知业务的类型、感知业务消费者信息、所需的感知服务质量(Quality of Service,QoS)要求信息、无线信号发送设备的感知能力、无线信号测量设备的感知能力等因素来决定使用的感知方法,该感知方法可以包括:基站A发基站B收,或者基站发终端收,或者基站A自发自收,或者终端发基站收,或者终端自发自收,或者终端A发终端B收等。
(3)根据感知业务的类型、感知业务消费者的信息、所需的感知QoS要求信息、无线信号发送设备的感知能力、无线信号测量设备的感知能力等因素,来决定为感知业务服务的感知设备,其中,感知设备包括无线信号发送设备和/或无线信号测量设备。
(4)管理感知业务所需资源的整体协调和调度,如对基站和/或终端的感知资源进行相应的配置;
(5)对感知测量量的值进行数据处理,或进行计算获得感知结果。进一步地,验证感知结果,估计感知精度等。
本申请中涉及的终端可以是手机、平板电脑(Tablet Personal Computer)、膝上型电脑
(Laptop Computer)或称为笔记本电脑、个人数字助理(Personal Digital Assistant,PDA)、掌上电脑、上网本、超级移动个人计算机(ultra-mobile personal computer,UMPC)、移动上网装置(Mobile Internet Device,MID)、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、机器人、可穿戴式设备(Wearable Device)、车载设备(Vehicle User Equipment,VUE)、行人终端(Pedestrian User Equipment,PUE)、智能家居(具有无线通信功能的家居设备,如冰箱、电视、洗衣机或者家具等)、游戏机、个人计算机(personal computer,PC)、柜员机或者自助机等终端侧设备,可穿戴式设备包括:智能手表、智能手环、智能耳机、智能眼镜、智能首饰(智能手镯、智能手链、智能戒指、智能项链、智能脚镯、智能脚链等)、智能腕带、智能服装、游戏机等。需要说明的是,在本申请实施例并不限定终端的具体类型。
本申请中涉及的核心网设备可以包含但不限于如下至少一项:核心网节点、核心网功能、移动管理实体(Mobility Management Entity,MME)、AMF、LMF、会话管理功能(Session Management Function,SMF)、用户平面功能(User Plane Function,UPF)、策略控制功能(Policy Control Function,PCF)、策略与计费规则功能单元(Policy and Charging Rules Function,PCRF)、边缘应用服务发现功能(Edge Application Server Discovery Function,EASDF)、统一数据管理(Unified Data Management,UDM),统一数据仓储(Unified Data Repository,UDR)、归属用户服务器(Home Subscriber Server,HSS)、集中式网络配置(Centralized network configuration,CNC)、网络存储功能(Network Repository Function,NRF),网络开放功能(Network Exposure Function,NEF)、本地NEF(Local NEF,或L-NEF)、绑定支持功能(Binding Support Function,BSF)、应用功能(Application Function,AF)等。需要说明的是,在本申请实施例中仅以NR系统中的核心网设备为例进行介绍,并不限定核心网设备的具体类型。
本申请中涉及的网络侧设备可以包括接入网设备或核心网设备,其中,接入网设备也可以称为无线接入网设备、无线接入网(Radio Access Network,RAN)、无线接入网功能或无线接入网单元。接入网设备可以包括基站、中继站、无线局域网(Wireless Local Area Network,WLAN)接入点或WiFi节点等,基站可被称为节点B(Node B,NB)、演进节点B(Evolved Node B,eNB)、接入点、基收发机站(Base Transceiver Station,BTS)、无线电基站、无线电收发机、基本服务集(Basic Service Set,BSS)、扩展服务集(Extended Service Set,ESS)、发送接收点(Transmission Reception Point,TRP)或所述领域中其他某个合适的术语,只要达到相同的技术效果,所述基站不限于特定技术词汇,需要说明的是,在本申请实施例中仅以NR系统中的基站为例进行介绍,并不限定基站的具体类型。
本申请中的第一信号也可以称为感知信号或通感一体化信号,即通过发射和/或接收该第一信号可以支持感知业务,例如通过发射和/或接收该第一信号可得到感知测量量或者感知结果。
所述第一信号可以是不包含传输信息的信号,如现有的LTE/NR同步和参考信号,或
者,第一信号可以是包括SSB信号、信道状态信息参考信号(Channel State Information-Reference Signal,CSI-RS)、解调参考信号(Demodulation Reference Signal,DMRS)、信道探测参考信号(Sounding Reference Signal,SRS)、定位参考信号(Positioning Reference Signal,PRS)、相位追踪参考信号(Phase Tracking Reference Signal,PTRS)等中的至少一种;或者,第一信号也可以是雷达常用的单频连续波(Continuous Wave,CW)、调频连续波(Frequency Modulated CW,FMCW),以及超宽带高斯脉冲等;或者,第一信号还可以是新设计的专用信号,具有良好的相关特性和低峰均功率比,或者新设计的通感一体化信号,既承载一定信息,同时具有较好的感知性能。例如,该新信号为至少一种专用感知信号/参考信号,和至少一种通信信号在时域和/或频域上拼接/组合/叠加而成。
下面结合附图,通过一些实施例及其应用场景对本申请实施例提供的一种移动性管理方法、装置、通信设备及可读存储介质进行详细地说明。
参见图4,本申请实施例提供一种移动性管理方法,具体步骤包括:步骤401和步骤402。
步骤401:第一节点进行感知测量,得到测量结果;
可选的,第一节点接收第一信号,并进行感知测量,得到测量结果,该测量结果包括感知测量量,其中感知测量也可以称为切换测量。
步骤402:所述第一节点向第二节点发送测量报告,所述测量报告包括所述测量结果,所述测量结果用于确定是否进行感知切换;其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
可选的,感知切换包括感知节点的切换,和/或,感知方式的切换。
可选的,所述传感器类感知包括激光雷达、毫米波雷达、视觉传感器(包括:单目视觉、双目视觉、红外传感器)、惯性测量单元、以及各种其他传感器(比如雨量计、温度计、湿度计)中的至少一项关联的感知。
在本申请的一种实施方式中,所述测量结果还包括通感类感知的测量量,所述通感类感知的测量量包括以下至少一项:
第一级测量量,所述第一级测量量包括:接收信号/信道响应复数结果,幅度/相位,I路/Q路及其运算结果中的至少一项;
第二级测量量,所述第二级测量量包括:时延、多普勒、角度、强度中的至少一项;
第三级测量量,所述第三级测量量包括:距离、速度、朝向、空间位置、加速度中的至少一项;
第四级测量量,所述第四级测量量包括:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分中的至少一项。
上述通感类感知是指:通过通信感知一体化系统发送无线电信号、并接收目标反射回波信号以执行感知业务的感知方式。
上述传感器类感知是指:通过除通信感知一体化系统以外的方式执行感知业务的感知方式,典型的执行传感器类感知的设备包括:激光雷达、毫米波雷达、视觉传感器(包括:单目视觉、双目视觉、红外传感器)、惯性测量单元(Inertial Measurement Unit,IMU)、以及各种其他传感器(雨量计、温度计、湿度计等)等。
本申请中的第一节点是支持传感器类感知的设备或者是具备执行传感器感知的能力的设备,并且,该第一节点还具备通信能力。所述的支持传感器类感知的设备可以是一种特殊形态的终端(例如,同时配置有通信模块和能够执行传感器类感知的传感器模块),也可以是能够提供传感器类感知信息的应用功能/应用服务器。对于支持传感器类感知的设备是应用功能/应用服务器,典型情况如下:
例如,在终端或者基站上配备的某个传感器,该传感器与终端或者基站分属于不同的硬件和处理域(例如,终端上的传感器通常是被应用处理器(Application Processor,AP)和操作系统或者应用程序(Application,APP)使用),因此从网络角度看就是应用服务器。
又例如,摄像头等支持传感器类感知的设备执行感知得到的测量量(例如,图像)通过有线传输网络固定上报给部署在某个位置的应用服务器,所述的应用服务器可对所述的测量量进行处理,并连接有一个或多个同类型的摄像头设备。
再例如,通过Wi-Fi/蓝牙或者私有通信协议与5G/6G通信终端进行感知测量量的数据传输,则数据传输也是终结在终端的某个应用上。
对于支持传感器类感知的设备是应用功能/应用服务器的情况,由感知功能网元通过网络开放功能(Network Exposure Function,NEF)向应用功能/应用服务器请求感知信息,例如,感知信息包括感知测量量/感知结果、感知执行的时间信息等。在此之前,应用功能/应用服务器向网络功能注册可提供的感知信息和与执行感知的设备相关的信息(例如,位置、天线朝向等)。
在本申请的一种实施方式中,所述方法还包括:
所述第一节点接收切换命令,所述切换命令用于指示所述第一节点作为目标感知节点执行感知;
所述第一节点根据所述切换命令执行感知。
其中,所述目标感知节点的感知方式与源感知节点的感知方式相同或不同,所述感知方式包括:传感器类感知和通感类感知中的至少一种,所述通感类感知包括下行感知、上行感知、终端回波感知、基站回波感知、基站间空口感知、终端间空口感知中的至少一种。
在本申请的一种实施方式中,所述方法还包括:
所述第一节点接收所述第二节点发送的测量配置信息,所述测量配置信息用于配置所述第一节点进行感知测量。
在本申请的一种实施方式中,所述测量配置信息包括以下至少一项:
(1)传感器标识;
(2)传感器类型;
(3)传感器感知的测量量;
(4)上报的测量量;
(5)上报的测量量的说明信息;
(6)感知条件;
(7)感知目标或感知区域先验信息;
(8)测量报告配置;
(9)测量事件,所述测量事件包括以下至少一项:感知测量量或感知测量量的运算结果或感知性能满足预设条件;感知目标的状态发生变化;参与感知的感知节点位置发生变化。
在本申请的一种实施方式中,所述传感器感知的测量量包括以下至少一项:
(1)激光雷达相关的测量量;
(2)视觉相关的测量量;
(3)雷达相关的测量量;
(4)惯性测量单元相关的测量量;
(5)其他测量量,所述其他测量量包括以下至少一项:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分。
在本申请的一种实施方式中,所述激光雷达相关的测量量包括以下至少一项:激光雷达点云数据、根据激光雷达点云数据得到的目标的角度和/或距离、从激光雷达点云数据中识别出的目标的视觉特征、从激光雷达点云数据中识别出的目标数量;
和/或,
所述视觉相关的测量量包括以下至少一项:视觉图像、图像像素的光度、图像像素的红绿蓝(Red/Green/Blue,RGB)值、从图像中识别出的目标的视觉特征、从图像中识别出的目标的角度和/或距离、从图像中识别出的目标的数量;
和/或,
所述雷达相关的测量量包括以下至少一项:雷达点云、识别出的目标的距离、速度、和/或角度、雷达成像、目标的数量;
和/或,
所述惯性测量单元相关的测量量包括以下至少一项:加速度、速度、角速度。
在本申请的一种实施方式中,所述方法还包括:
所述第一节点向所述第二节点发送第一信息,所述第一信息包括以下至少一项:感知节点信息;感知节点的感知能力信息;感知的权限信息。
在本申请的一种实施方式中,所述第一节点包括终端,所述第二节点包括源基站和/或感知功能网元,或者,所述第一节点包括:源基站和/或候选目标基站,所述第二节点包括感知功能网元;或者,所述第一节点包括源终端和/或候选目标终端,所述第二节点包括感知功能网元;或者,所述第一节点包括候选目标基站,所述第二节点包括源基站;
或者,第一节点包括:应用功能和/或应用服务器,所述第二节点包括源基站和/或感知功能网元。
在本申请的实施例中,第一节点进行感知测量,得到测量结果;所述第一节点向第二节点发送测量报告,所述测量报告包括所述测量结果,所述测量结果用于确定是否进行感知切换,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的,实现基于传感器感知的感知移动性管理,可以确保感知业务的连续性,提升感知业务的用户体验。
参见图5,本申请实施例提供一种移动性管理方法,具体步骤包括:步骤501、步骤502、步骤503和步骤504。
步骤501:第二节点接收第一节点发送的测量报告,所述测量报告包括测量结果;
步骤502：所述第二节点发送第一请求信息，所述第一请求信息用于请求第三节点进行感知；
其中,所述目标感知节点的感知方式与源感知节点的感知方式相同或不同;所述感知方式包括:传感器类感知和通感类感知中的至少一种,所述通感类感知包括下行感知、上行感知、终端回波感知、基站回波感知、基站间空口感知、终端间空口感知中的至少一种。
可选的,第三节点可以包括候选目标终端(User Equipment,UE)、候选目标基站、候选目标应用功能、候选目标应用服务器中的至少一项。
步骤503:第二节点接收第三节点发送的第一应答信息,所述第一应答信息用于指示所述第三节点同意执行感知;
步骤504:所述第二节点向所述第三节点发送切换命令,所述切换命令用于指示所述第三节点作为目标感知节点执行感知;
也就是,第二节点基于接收到的第一应答信息,从候选目标UE、候选目标基站、候选目标应用功能或候选目标应用服务器中选择目标UE、目标基站、目标应用功能或目标应用服务器发送切换命令。
其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
在本申请的一种实施方式中,所述测量结果还包括通感类感知的测量量,所述通感类感知的测量量包括以下至少一项:
第一级测量量,所述第一级测量量包括:接收信号/信道响应复数结果,幅度/相位,I路/Q路及其运算结果中的至少一项;
第二级测量量,所述第二级测量量包括:时延、多普勒、角度、强度中的至少一项;
第三级测量量,所述第三级测量量包括:距离、速度、朝向、空间位置、加速度中的至少一项;
第四级测量量,所述第四级测量量包括:目标是否存在、轨迹、动作、表情、生命体
征、数量、成像结果、天气、空气质量、形状、材质、成分中的至少一项。
在本申请的一种实施方式中,所述方法还包括:
所述第二节点获取感知节点的第一信息;
其中,所述第一信息包括以下至少一项:感知节点的信息;感知节点的感知能力信息;感知的权限信息。
在本申请的一种实施方式中,所述测量报告中的测量结果包括以下之一:
(1)传感器类感知测量的测量结果;
(2)通感类感知测量的测量结果;
(3)融合的测量结果,所述融合的测量结果是根据传感器类感知测量的测量结果和通感类感知测量的测量结果融合处理得到的。
在本申请的一种实施方式中,所述方法还包括:
所述第二节点向所述第一节点发送第一测量配置信息,所述第一测量配置信息用于配置所述第一节点进行传感器类感知的测量。
在本申请的一种实施方式中,所述第一测量配置信息包括以下至少一项:
(1)传感器标识;
(2)传感器类型;
(3)传感器感知的测量量;
(4)上报的测量量;
(5)上报的测量量的说明信息;
(6)感知条件;
(7)感知目标或感知区域先验信息;
(8)测量报告配置;
(9)测量事件,所述测量事件包括以下至少一项:感知测量量或感知测量量的运算结果或感知性能满足预设条件;感知目标的状态发生变化;参与感知的感知节点位置发生变化。
在本申请的一种实施方式中,所述传感器感知的测量量包括以下至少一项:
(1)激光雷达相关的测量量;
(2)视觉相关的测量量;
(3)雷达相关的测量量;
(4)惯性测量单元相关的测量量;
(5)其他测量量,所述其他测量量包括以下至少一项:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分。
在本申请的一种实施方式中,所述激光雷达相关的测量量包括以下至少一项:激光雷达点云数据、根据激光雷达点云数据得到的目标的角度和/或距离、从激光雷达点云数据中识别出的目标的视觉特征、从激光雷达点云数据中识别出的目标数量;
和/或,
所述视觉相关的测量量包括以下至少一项:视觉图像、图像像素的光度、图像像素的RGB值、从图像中识别出的目标的视觉特征、从图像中识别出的目标的角度和/或距离、从图像中识别出的目标的数量;
和/或,
所述雷达相关的测量量包括以下至少一项:雷达点云、识别出的目标的距离、速度、和/或角度、雷达成像、目标的数量;
和/或,
所述惯性测量单元相关的测量量包括以下至少一项:加速度、速度、角速度。
在本申请的一种实施方式中,所述第一节点包括终端,所述第二节点包括源基站和/或感知功能网元,或者,所述第一节点包括:源基站和/或候选目标基站,所述第二节点包括感知功能网元;或者,所述第一节点包括源终端和/或候选目标终端,所述第二节点包括感知功能网元;或者,所述第一节点包括候选目标基站,所述第二节点包括源基站;或者,第一节点包括:应用功能和/或应用服务器,所述第二节点包括源基站和/或感知功能网元。
在本申请实施例中,第二节点接收第一节点发送的测量报告,所述测量报告包括测量结果,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的;所述第二节点向第三节点发送第一请求信息,所述第一请求信息用于请求所述第三节点进行感知,实现基于传感器感知的感知移动性管理,可以保持感知业务的连续性,提升感知业务的用户体验。
为了便于理解,下面结合实施例一和实施例二介绍本申请的实施方式。
实施例一:基于传感器感知的感知移动性管理(感知切换)。
感知切换包括:参与感知业务例如轨迹追踪或者持续对目标进行测速的感知节点(基站或UE)可能的切换,以及感知方式(参见图1)可能的切换。切换前基站或UE已经在执行感知。将切换前的执行感知的基站称为源基站,将切换后执行感知的基站称为目标基站;或者将切换前的执行感知的UE称为源UE,将切换后执行感知的UE称为目标UE;
感知切换的几个例子如下:
a)源基站和UE进行下行感知,切换为目标基站和UE进行下行感知,参见图6;
b)源基站和源UE进行上行感知,切换为目标基站和目标UE进行上行感知,参见图7;
c)基站A进行自发自收感知,切换为基站B进行传感器感知;
d)UE进行上行感知,切换为UE进行传感器感知。
参见图8,具体步骤如下:
步骤1:基站和/或UE进行感知测量,感知测量包括传感器类感知的测量;
(A)传感器类感知的测量:
方式1:源基站向UE发送目标测量配置信息,UE收到目标测量配置信息后进行感知测量,并向源基站反馈测量报告;
方式1中的源基站相当于第二节点,UE相当于第一节点。
方式2:第一设备(比如,感知功能网元)向UE发送目标测量配置信息,UE收到目标测量配置信息后进行感知测量,并向第一设备反馈测量报告;可选地,UE或第一设备向源基站发送测量报告。
方式2中的第一设备相当于第二节点,UE相当于第一节点。
方式3:第一设备向源基站和/或目标基站发送目标测量配置信息,源基站和/或目标基站收到目标测量配置信息后进行感知测量,并向第一设备或源基站反馈切换测量报告;
方式3中的第一设备相当于第二节点,源基站和/或目标基站相当于第一节点。
其中,目标测量配置信息包括以下至少一项:
(1)使用的传感器标识(ID)或类型:例如是视觉摄像头,激光雷达或者毫米波雷达等传感器的哪一个或哪一类;
(2)上报的测量量;
可选的,上报的测量量包括传感器感知的测量量;
可选的,传感器感知的测量量,包括以下至少一项:
(a)激光雷达相关的测量量;
可选的,激光雷达相关的测量量,包括以下至少一项:
激光雷达点云数据,激光雷达点云数据中的每个点包括:X/Y/Z位置信息,和,附加信息;
根据激光雷达点云数据得到的目标的角度、距离;
从激光雷达点云数据中识别出的目标的视觉特征,例如:人、车辆等;
从激光雷达点云数据中识别出的目标数量。
可选的,激光雷达点云数据中的附加信息包括以下至少一项:
强度:生成激光雷达点的激光脉冲的回波强度;
回波数:回波数是某个给定脉冲的回波总数;
点分类:每个经过后处理的激光雷达点可拥有定义反射激光雷达脉冲的对象的类型的分类,可将激光雷达点分成很多个类别,例如:地面、裸露地表、树冠层顶部和水域等;
红绿蓝（Red Green and Blue，RGB）：可以将RGB波段作为激光雷达数据的属性，此属性通常来自在激光雷达测量时采集的影像。
全球定位系统(Global Positioning System,GPS)时间:从飞机发射激光点的GPS时间戳。
扫描角度:
扫描方向:激光扫描镜的行进方向,值1代表正扫描方向、值0代表负扫描方向。
(b)视觉相关的测量量;
可选的,视觉相关的测量量,包括以下至少一项:
视觉图像;
图像像素的光度;
图像像素的RGB值;
从图像中识别出的目标的视觉特征,例如:人、车辆等
从图像中识别出的目标的角度、距离(特别是对于双目视觉);
从图像中识别出的目标的数量。
雷达相关的测量量,包括以下至少一项:
雷达点云,点云中的每个点包括:距离/速度/方位角/俯仰角中至少一个,或者,X/Y/Z/速度中的至少一个;
识别出的目标的距离、速度、角度;
雷达成像;
目标的数量。
(c)惯性测量单元相关的测量量;
可选的,惯性测量单元相关的测量量,包括以下至少一项:
加速度:X/Y/Z三个方向至少之一;
速度:X/Y/Z三个方向至少之一;
角速度:绕X/Y/Z三个轴至少之一。
(d)其他测量量,包括以下至少一项:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分等。
(3)上报的测量量的说明信息;
例如,摄像头上报的图像的尺寸、分辨率,激光雷达上报的点云数据中距离/速度/角度的分辨率和精度要求等等;
(4)感知条件;
可选的,感知条件包括感知开始时间、感知结束时间、感知持续时间等至少一项;
(5)感知目标或感知区域先验信息;
可选的,感知目标或感知区域先验信息包括感知目标类型、感知目标所在大致位置/区域、感知目标历史状态(速度、角度、距离、加速度、空间朝向)等至少一项;
(6)测量报告配置;
测量报告配置包括以下至少一项:上报的原则,可以是周期性上报或者事件触发上报;用于测量的参考信号的类型等;测量报告格式,例如上报的小区最大数量和波束数量等;
(7)测量事件及相关的参数;
可选的,测量事件及相关的参数包括以下至少一项:
感知测量量或感知测量量的运算结果或感知性能满足预设条件;
感知目标的状态发生变化(状态包括位置、速度等);
参与感知的UE位置发生变化。
可选的,感知测量还包括通感类感知的测量。
可选的,通感类感知的测量包括以下至少一种:
(B)下行感知(基站发第一信号UE接收第一信号)的测量:
方式1:源基站向至少一个UE发送第一测量配置信息,UE收到测量配置信息后进行感知测量,并向源基站反馈测量报告;
方式1中UE是第一节点,源基站是第二节点。
方式2:第一设备(比如感知功能网元)向至少一个UE发送第一测量配置信息,UE收到第一测量配置信息后进行感知测量,并向第一设备反馈测量报告;可选地,至少一个UE或第一设备向源基站发送测量报告。
方式2中UE是第一节点，第一设备是第二节点。
其中,第一测量配置信息包括以下至少一项:
(1)测量对象;
例如UE需要测量的源基站和候选目标基站发送的一个或多个第一信号的参数信息和资源信息等;
可选的,第一信号的参数信息包括以下至少一项:
a)波形,例如正交频分复用(Orthogonal Frequency Division Multiplexing,OFDM),单载波频分多址(Single-carrier Frequency-Division Multiple Access,SC-FDMA),正交时频空调制,调频连续波,脉冲信号等;
b)子载波间隔:例如,OFDM系统的子载波间隔30KHz;
c)保护间隔:从信号结束发送时刻到该信号的最迟回波信号被接收的时刻之间的时间间隔;该参数正比于最大感知距离;例如,可以通过2dmax/c计算得到,dmax是最大感知距离(属于感知需求),例如对于自发自收的第一信号,dmax代表第一信号收发点到信号发射点的最大距离;在某些情况下,OFDM信号循环前缀可以起到最小保护间隔的作用;c是光速;
d)带宽:该参数反比于距离分辨率,可以通过c/2/delta_d得到,其中delta_d是距离分辨率(属于感知需求);
e)burst持续时间:该参数反比于速率分辨率(属于感知需求),该参数是第一信号的时间跨度,主要为了计算多普勒频偏;该参数可通过c/2/delta_v/fc计算得到;其中,delta_v是速度分辨率;fc是信号载频或者信号的中心频点;
f)时域间隔：该参数可通过c/2/fc/v_range计算得到；其中，v_range是最大速度减去最小速度（属于感知需求）；该参数是相邻的两个第一信号之间的时间间隔（上述c)至f)项参数的计算示例可参见本测量配置信息列表之后的示意代码）；
g)发送信号的功率信息包括发射功率、峰值功率、平均功率、总功率,功率谱密度,EIRP,每端口的功率等,例如发射功率从-20dBm到23dBm每隔2dBm取一个值;
h)信号格式,例如是SRS,DMRS,PRS等,或者其他预定义的信号,以及相关的序列格式(序列格式与序列内容或序列长度等相关联)等信息;
i)信号方向;例如第一信号的方向或者波束信息;
j)波束信息或者QCL关系,例如第一信号包括多个资源,每个资源与一个SSB QCL,QCL包括Type A,B,C或者D;
k)天线配置参数(适用于多天线设备对第一信号的收发),例如:发射天线正交方式(TDM/CDM/FDM/DDM等),天线端口数,天线单元数,天线单元之间的距离,接收通道数,发射通道数,发射天线数,(最大)上行或下行MIMO层数的至少一项。
可选的,第一信号的资源信息包括以下至少一项:
a)时间资源,例如第一信号所在的时隙索引或者时隙的符号索引;其中,时间资源分为两种,一种是一次性的时间资源,例如一个符号发送一个全向的第一信号/第一信号;一种是非一次性的时间资源,例如多组周期性的时间资源或者不连续的时间资源(可包含开始时间和结束时间),每一组周期性的时间资源发送同一方向的第一信号,不同组的周期性时间资源上的波束方向不同;
b)频率资源,包括第一信号的中心频点,带宽,RB或者子载波等。
(2)测量报告配置;
可选的,测量报告配置包括以下至少一项:上报的原则,可以是周期性上报或者事件触发上报;用于测量的参考信号的类型等;测量报告格式,例如上报的小区最大数量和波束数量等;
(3)测量ID;
测量ID用来关联测量对象和测量报告配置;
(4)测量事件及相关的参数;
可选的,测量事件及相关的参数,包括以下至少一项:
感知测量量或感知测量量的运算结果或感知性能满足预设条件;
感知目标的状态发生变化(状态包括位置、速度等);
参与感知的UE位置发生变化;
传统切换的事件;
UE接收的服务小区和/或邻区的一个或多个第一信号的通信相关指标满足预设条件;其中,通信相关指标包括RSRP,SINR,RSRQ,RSSI等至少一项;例如,候选目标小区的通信相关指标在预设时间段内优于源小区的通信相关指标。
(5)上报的测量量的说明信息;
例如,摄像头上报的图像的尺寸、分辨率,激光雷达上报的点云数据中距离/速度/角度的分辨率和精度要求等等;
(6)感知条件;
可选的,感知条件包括感知开始时间、感知结束时间、感知持续时间等至少一项;
(7)感知目标或感知区域先验信息;
可选的,感知目标或感知区域先验信息包括感知目标类型、感知目标所在大致位置/区域、感知目标历史状态(速度、角度、距离、加速度、空间朝向)等至少一项。
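为便于理解上文c)至f)项中保护间隔、带宽、burst持续时间和时域间隔的计算方式，下面给出一段示意性的Python代码（函数名与示例参数取值均为说明用途的假设，实际取值需结合具体感知需求与系统设计确定）：

```python
# 示意代码：根据感知需求估算第一信号的部分参数（对应上文c)至f)项）
C = 3.0e8  # 光速，m/s

def guard_interval(d_max: float) -> float:
    """保护间隔 ≈ 2*d_max/c，d_max为最大感知距离（m）"""
    return 2.0 * d_max / C

def bandwidth(delta_d: float) -> float:
    """带宽 ≈ c/(2*delta_d)，delta_d为距离分辨率（m）"""
    return C / (2.0 * delta_d)

def burst_duration(delta_v: float, fc: float) -> float:
    """burst持续时间 ≈ c/(2*delta_v*fc)，delta_v为速度分辨率（m/s），fc为载频（Hz）"""
    return C / (2.0 * delta_v * fc)

def time_interval(v_range: float, fc: float) -> float:
    """相邻两个第一信号的时域间隔 ≈ c/(2*fc*v_range)，v_range为最大速度减最小速度（m/s）"""
    return C / (2.0 * fc * v_range)

# 示例：最大感知距离300m、距离分辨率1m、速度分辨率0.5m/s、速度范围60m/s、载频26GHz
print(guard_interval(300.0), bandwidth(1.0), burst_duration(0.5, 26e9), time_interval(60.0, 26e9))
```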
需要说明的是，UE判断是否满足测量事件，可以根据不同时间的多次测量量/指标的平均值（层1滤波和/或层3滤波），避免根据单次结果判断带来的随机性/乒乓效应；
需要说明的是，多个同步信号/参考信号/第一信号可以对应多个收/发波束对（beam pair），UE可以根据一个或多个波束（beam）的测量量/指标来判断是否满足测量事件。
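为便于理解上述基于层3滤波的多次测量平均处理，下面给出一段示意性的Python代码（其中滤波系数a=1/2^(k/4)为层3滤波的一种常见定义形式，k为滤波系数配置值，此处仅作为示例假设，具体以协议配置为准）：

```python
# 示意代码：对多次测量结果进行层3（L3）滤波，避免基于单次测量判决带来的乒乓效应
def l3_filter(measurements, k: int = 4):
    """逐次计算 F_n = (1 - a) * F_(n-1) + a * M_n，返回滤波后的序列。"""
    a = 1.0 / (2 ** (k / 4.0))
    filtered, history = None, []
    for m in measurements:
        filtered = m if filtered is None else (1.0 - a) * filtered + a * m
        history.append(filtered)
    return history

# 示例：对一串RSRP测量值（dBm）滤波后再用于事件判决
print(l3_filter([-90, -95, -88, -93, -91], k=4))
```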
其中,测量报告至少包括感知测量所需的感知测量量的测量结果;
可选的,切换测量所需的感知测量量可以包括当前感知业务感知测量量。
(C)上行感知(UE发第一信号基站接收第一信号)的测量;
此时,源基站和至少一个候选目标基站进行感知测量,即源基站和至少一个候选目标基站相当于第一节点。
可选地,在源基站和候选目标基站进行感知测量之前,第一设备向源基站和候选目标基站发送第二测量配置信息,或者源基站向候选目标基站发送第二测量配置信息;
可选的,源基站向至少一个UE发送第一信号的参数信息,至少一个UE根据第一信号的参数信息发送第一信号,用于源基站和候选目标基站对所述第一信号进行测量;
其中,第二测量配置信息包括以下至少一项:
(1)测量对象:例如基站需要测量的UE发送的一个或多个第一信号的参数信息和资源信息等;
(2)测量报告配置;
可选的,测量报告配置包括以下至少一项:基站向第一设备上报或候选目标基站向源基站上报的原则,可以是周期性上报或者事件触发上报;用于测量的参考信号的类型等;测量报告格式,例如上报的小区最大数量和波束数量等;
(3)测量ID;测量ID用来关联测量对象和测量报告配置;
(4)测量事件及相关的参数;
可选的,测量事件及相关的参数包括以下至少一项:
感知测量量或感知测量量的运算结果或感知性能满足预设条件;
感知目标的状态发生变化(状态包括位置、速度等);
参与感知的UE位置发生变化;
传统切换的事件;
服务小区和/或邻区接收的一个或多个第一信号的通信相关指标满足预设条件;其中,通信相关指标包括RSRP,SINR,RSRQ,RSSI等至少一项;例如,候选目标小区的通信相关指标在预设时间段内优于源小区的通信相关指标。
(5)上报的测量量的说明信息;
例如,摄像头上报的图像的尺寸、分辨率,激光雷达上报的点云数据中距离/速度/角
度的分辨率和精度要求等等;
(6)感知条件;
可选的,感知条件包括感知开始时间、感知结束时间、感知持续时间等至少一项;
(7)感知目标或感知区域先验信息;
可选的,感知目标或感知区域先验信息包括感知目标类型、感知目标所在大致位置/区域、感知目标历史状态(速度、角度、距离、加速度、空间朝向)等至少一项。
需要说明的是,源基站或候选目标基站判断是否满足测量事件,可以根据不同时间的多次测量量/指标的平均值(层1滤波和/或层3滤波),避免根据单次结果判断带来的随机性/乒乓效应;
需要说明的是,多个第一信号可以对应多个收/发beam pair,源基站或候选目标基站可以根据一个或多个beam的测量量/指标来判断是否满足测量事件。
(D)UE自发自收感知的测量。
方式1:源基站向至少一个UE发送第三测量配置信息,至少一个UE收到测量配置信息后进行感知测量,并向源基站反馈测量报告;
方式1中UE相当第一节点,源基站相当于第二节点。
方式2:第一设备(比如,感知功能网元)向至少一个UE发送第三测量配置信息,至少一个UE收到第三测量配置信息后进行感知测量,并向第一设备反馈测量报告;可选地,至少一个UE或第一设备向源基站发送测量报告。
方式2中UE相当第一节点,第一设备或源基站相当于第二节点。
(E)基站回波感知的测量。
第一设备向源基站和/或至少一个目标基站发送第四测量配置信息,源基站和/或至少一个目标基站收到测量配置信息后进行感知测量,并向第一设备或源基站反馈切换测量报告,即源基站和/或目标基站相当于第一节点,第一设备相当于第二节点,或者,目标基站相当于第一节点,第一设备或源基站相当于第二节点。
(F)基站间空口感知的测量。
第一设备向源基站和/或至少一个目标基站发送第五测量配置信息,源基站和/或至少一个目标基站收到测量配置信息后进行感知测量,并向第一设备或源基站反馈测量报告;
即,源基站和/或至少一个目标基站相当于第一节点,第一设备或源基站相当于第二节点。
(G)UE间空口感知的测量。
第一设备向源UE和/或至少一个目标UE发送第六测量配置信息,源UE和/或至少一个目标UE收到测量配置信息后进行感知测量,并向第一设备或源基站反馈测量报告;
即,源UE和/或至少一个目标UE相当于第一节点,第一设备或源基站相当于第二节点
在步骤1之前,第一设备或源基站获取感知节点(比如,源基站和/或至少一个目标
基站,以及源UE和/或至少一个目标UE)的第一信息;其中感知节点是指:具有通信感知一体化功能或者传感器感知功能节点或者设备。
可选的,所述第一信息包括以下至少一项:
(1)感知节点信息;
可选的,感知节点信息包括以下至少一项:
感知节点的ID;
感知节点的类型;
比如，具备通感类感知能力的设备、具备传感器类感知能力的设备、具备通感类感知和传感器类感知的设备，其中具备通感类感知和传感器类感知的设备具体还应包括传感器的类型（例如：视觉传感器、激光雷达等）；
感知节点的位置;
比如,可以是感知节点在全局坐标系中的坐标,或者相对于某个参考位置的坐标,所述坐标可以是直角坐标或者极坐标;
感知节点的朝向;
比如,可以是感知节点的天线面板的朝向或者本地坐标系的朝向相对于全局坐标系的旋转角度,或者相对于某个参考坐标系的旋转角度,所述的旋转角度包括方位角、俯仰角和横滚角;
感知节点的速度;
比如,可以是感知节点在全局坐标系中的速度,或者相对于某个参考坐标系的速度,所述的速度包括速度的大小和速度的方向;对于固定位置的感知节点,此项可缺省或者简化表示。
(2)感知节点的感知能力信息;
(3)感知的权限信息:是否允许用来执行感知业务。
可选的,获取所述第一信息的方法包括以下至少之一:
访问存储有感知节点的第一信息中至少部分信息的网络节点;
向具备通信功能的感知节点发送信令请求回复的第一信息中的至少部分信息、并接收所述感知节点发送的第一信息中的至少部分信息。
需要说明的是,传感器感知和通感一体化感知的测量结果可以分别上报,也可以融合后再上报(适用于同时具有通感一体化感知和传感器类感知能力的设备)。或者,传感器感知和通感一体化感知的测量结果分别上报后再融合。
步骤2:源基站或第一设备基于步骤1的测量报告,决定是否发起切换。
即,源基站或第一设备相当于第二节点。
可选地,源基站向第一设备上报测量报告,由第一设备决定是否发起切换请求;或者,第一设备根据UE或基站发送的测量报告,决定是否发起切换请求。
若不发起切换,后续处理可以是维持或者结束当前感知。
若发起切换,第一设备或源基站决定由哪个候选目标基站和/或哪个候选目标UE进行感知,具体分为以下两种情况之一:
情况1:源基站决定切换为候选目标基站和/或候选目标UE来执行感知。
源基站向至少一个候选目标基站和/或候选目标UE发送第一请求信息,所述第一请求信息为请求所述至少一个候选目标基站和/或候选目标UE进行感知。
可选地,源基站向第一设备发送第一指示信息,所述第一指示信息为通知所述第一设备所述至少一个候选目标基站和/或候选目标UE将进行感知。
情况2:第一设备决定切换为候选目标基站和/或候选目标UE来执行感知。
第一设备向至少一个候选目标基站和/或候选目标UE发送第一请求信息。
可选地,第一设备向源基站和/或候选目标UE发送第一指示信息。
可选地,所述第一请求信息可以包括软切换请求。
可选地,候选目标基站或候选目标UE的确定基于以下信息中的至少一项:
1)基站/UE的位置信息;
2)基站/UE的天线面板朝向信息;
3)基站/UE的感知能力信息(包括基站/UE感知覆盖范围、可用于感知的最大带宽、感知业务最大可持续时间、所能支持的第一信号类型及帧格式、基站天线阵列信息(阵列类型、天线数、阵列孔径、天线极化特性、阵元增益和方向性特性等));
4)基站/UE当前可用于进行感知的资源信息(包括时间资源(符号数、时隙数、帧数等)、频率资源(资源块(Resource Block,RB)数、资源单元(Resource Element,RE)数、总带宽、可用频段位置等)、天线资源(天线/天线子阵列数)、相位调制资源(硬件移相器数)、正交码资源(正交码长度和数量)等);
5)基站/UE的信道状态信息(包括至少一个通信链路的信道传输函数/信道冲激响应、信道质量指示(Channel Quality Indicator,CQI)、预编码矩阵指示(Precoding Matrix Indicator,PMI)、CSI-RS资源指示、SSB资源指示、层指示(LI)、秩指示(RI)以及L1-RSRP等至少一项);
6)第一请求信息。
可选的,第一请求信息包括以下至少一项:
1)感知需求;
2)感知QoS;
可选的,感知QoS包括以下至少一项:感知分辨率(进一步可分为:测距分辨率、测角分辨率、测速分辨率、成像分辨率)等,感知精度(进一步可分为:测距精度、测角精度、测速精度、定位精度等),感知范围(进一步可分为:测距范围、测速范围、测角范围、成像范围等),感知时延(从第一信号发送到获得感知结果的时间间隔,或,从感知需求发起到获取感知结果的时间间隔),感知更新速率(相邻两次执行感知并获得感知结果的时间间隔),检测概率(在感知对象存在的情况下被正确检测出来的概率),虚警概
率(在感知对象不存在的情况下错误检测出感知目标的概率),感知安全性,感知隐私性);
3)感知测量量;
4)感知测量结果;
可选的,感知测量结果包括基于至少一种感知测量量直接或间接得到的感知结果;
5)感知条件;
可选的,感知条件包括感知开始时间、感知结束时间、感知持续时间等至少一项;
6)感知方式切换成功判决条件;
例如,感知方式切换成功判决条件指示至少一种感知测量量和/或通信测量量的测量结果在预设时间内/预设次数下达到预设门限;
步骤3:候选目标基站和/或候选目标UE决定是否接受切换/执行感知。分为以下两种情况:
(1)若同意,则候选目标基站和/或候选目标UE向第一请求信息发送方(源基站或第一设备)发送第一应答信息,所述第一应答信息为指示第一请求信息发送方,第一应答信息发送方同意执行感知。
可选地,候选目标基站和/或候选目标UE在第一应答信息中反馈建议的第一参数配置信息。所述第一参数配置信息,用于候选目标基站和/或候选目标UE执行感知的感知参数配置。
若第一请求信息中包括软切换请求,且候选目标基站同意并支持软切换,可选地,第一参数配置信息包括软切换参数配置信息。
(2)若不同意,则可选地,候选目标基站和/或候选目标UE向第一请求信息发送方(源基站或第一设备)发送第一拒绝信息,所述第一拒绝信息为指示第一请求信息发送方,第一拒绝信息发送方不进行感知。
后续处理可以是以下其中一项:i.源基站或第一设备重新确定候选目标基站和/或候选目标UE;ii.维持当前感知;iii.结束当前感知;
可选的，候选目标基站和/或候选目标UE根据自己的设备能力决定是否接受切换/执行感知；其中，设备能力包括感知相关设备能力，例如：基站感知覆盖范围、可用于感知的最大带宽、感知业务最大可持续时间、所能支持的第一信号类型及帧格式、基站天线阵列信息（阵列类型、天线数、阵列孔径、天线极化特性、阵元增益和方向性特性等）、支持六种感知方式的哪几种等；
步骤4:源基站或第一设备基于收到的第一应答信息,在候选目标基站中确定至少一个目标基站,和/或,在候选目标UE确定至少一个目标UE,作为切换后执行感知的基站和/或UE。
源基站或第一设备向目标基站和/或目标UE发送切换命令。所述切换命令用于通知目标感知节点执行感知;
可选地,源基站或第一设备在切换命令中反馈建议的第二参数配置信息。所述第二参
数配置信息,用于目标感知节点执行感知的感知参数配置。
所述第二参数配置信息中包括的内容可以参考第一参数配置信息的描述，即，第二参数配置信息中包括的内容可以与第一参数配置信息中包括的内容相同。
可选地,第二参数配置信息包括软切换参数配置信息。
步骤5:目标基站和/或目标UE执行感知(包括传感器感知或者6种通感一体化的感知方式的至少一种)。
需要说明的是,切换前后,除了感知节点可能发生切换外,感知方式也可能发生切换。
具体地,对于基站发生切换的感知方式,后续处理分为以下2种情况:
情况一:采用软切换方法。目标基站基于第一请求信息、第一参数配置信息、第二参数配置信息中的至少一项,进行感知参数配置,执行感知(包括上行感知或下行感知)。
在获得至少一次感知测量量测量结果和/或感知结果后,目标基站向源基站或第一设备发送切换成功消息。
进一步地,包括以下几种情况之一:
1)第一请求信息发送方为源基站,源基站和目标基站不是同一设备:
源基站收到切换成功消息后,向UE发送感知结束命令。源基站和UE结束原有感知操作,释放感知所占用的资源(包括时频资源、天线端口资源等);
2)第一请求信息发送方为第一设备,源基站和目标基站不是同一设备:
第一设备收到切换成功消息后,向源基站和UE发送感知结束命令。源基站和UE结束原有感知操作,释放感知所占用的资源(包括时频资源、天线端口资源等);
情况二:采用硬切换方法。执行步骤4的同时,源基站或第一设备无需等待切换成功消息。包括以下几种情况之一:
1)第一请求信息发送方为源基站,源基站和目标基站不是同一设备:
源基站向UE发送感知结束命令。源基站和UE结束原有感知操作,释放感知所占用的资源(包括时频资源、天线端口资源等);
2)第一请求信息发送方为第一设备,源基站和目标基站不是同一设备:
第一设备向源基站和UE发送感知结束命令。源基站和UE结束原有感知操作,释放感知所占用的资源(包括时频资源、天线端口资源等);
步骤6:可选地,源基站和/或第一设备将部分或全部历史感知测量量和/或历史感知结果、感知目标/区域先验信息发送给目标基站和/或目标UE。
实施例二:传感器感知和通感一体化感知的测量结果融合后再上报。
本实施例包括以下三个场景:
场景一:通感类感知与视觉传感器融合。
视觉传感器分为单目视觉和双目视觉,单目视觉利用单个摄像头进行感知、双目视觉利用两个摄像头进行感知。视觉传感器通过对目标或目标区域进行成像得到视觉图像。
视觉传感器相比通感的优点是能够成像,并且基于视觉图像和算法能够对目标的视觉
特征(例如,人、车辆等)进行识别。视觉传感器相比于通感的缺点是不能测速,并且测距的性能较差。
视觉传感器与通感融合后,将视觉传感器得到的图像,与通感得到的目标距离、速度等,进行结合后,能够获得具有深度信息和速度信息的三维图像。例如:在通感与视觉系统进行时间配准和空间配准后,将通感得到的三维或四维点云数据(包括:距离、速度、角度信息)投影到视觉图像平面上,从而能够将距离信息和速度信息赋给对应的图像像素或图像中的目标;最后再将融合后的图像变换回到三维空间中即可得到相应的具有深度信息和速度信息的三维图像。
这里,通感类感知的测量量是速度和距离信息(或者,包含速度和距离信息的点云数据),视觉传感器感知的测量量是图像。
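作为场景一融合处理的一个示意，下面给出一段简化的Python代码，将通感得到的点云（含速度）投影到视觉图像平面（代码中假设时间配准和空间配准已完成，K为相机内参矩阵，points为相机坐标系下的N×4数组[X,Y,Z,速度]，以上均为说明用途的假设）：

```python
import numpy as np

def project_points_to_image(points: np.ndarray, K: np.ndarray, img_h: int, img_w: int):
    """points: N×4数组，每行为相机坐标系下的[X, Y, Z, 径向速度]。"""
    depth = np.full((img_h, img_w), np.nan)   # 融合后的深度图
    speed = np.full((img_h, img_w), np.nan)   # 融合后的速度图
    for X, Y, Z, vel in points:
        if Z <= 0:
            continue                          # 只投影位于相机前方的点
        uvw = K @ np.array([X, Y, Z])
        u = int(round(uvw[0] / uvw[2]))       # 像素列坐标
        v = int(round(uvw[1] / uvw[2]))       # 像素行坐标
        if 0 <= u < img_w and 0 <= v < img_h:
            if np.isnan(depth[v, u]) or Z < depth[v, u]:   # 同一像素保留更近的点
                depth[v, u], speed[v, u] = Z, vel
    return depth, speed

# 示例：假设的相机内参与两个通感点
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
pts = np.array([[1.0, 0.2, 10.0, 3.5], [-2.0, 0.1, 20.0, -1.2]])
depth_map, speed_map = project_points_to_image(pts, K, img_h=480, img_w=640)
```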
场景二:通感类感知与激光雷达融合。
激光雷达通过发射超窄激光束、并接收反射回波,能够获得超高的角度分辨率;同时光频段具有超高的带宽,使得激光雷达具有超高的距离分辨率。然而,现有的商用激光雷达一般不具备测速功能。
通感能够测量距离、角度和速度。通常通感节点的天线数量有限，使得测角分辨率很有限；同时通感信号的带宽也远远低于光频段信号的带宽。然而，通感能够通过信号配置获得较好的速度分辨率性能。
因此,激光雷达与通感的融合,能够很好地结合二者的优势,获得较好的测距、测角、测速性能。例如,将通感测量获得的点云数据,与,激光雷达测量获得三维点云数据,进行时间配准和空间配准后,即可将通感点云数据和激光雷达点云数据的相应的点关联起来获得融合点云数据;融合点云数据中的每个点的距离和角度信息采用激光雷达点云数据中的数值,而融合点云数据中的每个点的速度信息则采用通感点云数据中的数值;从而使得融合点云数据同时具有较好的测角、测距、测速性能。
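作为场景二融合处理的一个示意，下面给出一段简化的Python代码，按最近邻方式将通感点云与激光雷达点云关联后融合（代码中假设两类点云已完成时间配准和空间配准并统一为笛卡尔坐标，关联距离门限等参数均为示例假设）：

```python
import numpy as np

def fuse_point_clouds(lidar_xyz: np.ndarray, isac_xyzv: np.ndarray, max_dist: float = 1.0):
    """lidar_xyz: N×3；isac_xyzv: M×4（x, y, z, 速度）；返回融合后的点云（每行为x, y, z, 速度）。"""
    fused = []
    for x, y, z, vel in isac_xyzv:
        d = np.linalg.norm(lidar_xyz - np.array([x, y, z]), axis=1)
        idx = int(np.argmin(d))
        if d[idx] <= max_dist:                     # 仅关联距离足够近的点对
            fused.append([*lidar_xyz[idx], vel])   # 位置（距离/角度）取激光雷达数值，速度取通感数值
    return np.array(fused)

# 示例：两组已配准的点云
lidar = np.array([[10.0, 1.0, 0.5], [20.0, -2.0, 0.4]])
isac = np.array([[10.2, 1.1, 0.5, 3.0], [35.0, 0.0, 0.0, -5.0]])
print(fuse_point_clouds(lidar, isac))   # 只保留能够成功关联的点
```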
场景三:通感类感知与毫米波雷达融合。
通感与毫米波雷达在原理上是完全相同的,通感与毫米波雷达的融合相当于多链路感知融合的情况,通过对于通感和毫米波雷达的测量结果进行加权处理或者选择SNR较高的一个,能够获得更好的测量精度。
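作为场景三融合处理的一个示意，下面给出一段简化的Python代码，对通感与毫米波雷达针对同一测量量（例如目标速度）的估计结果进行SNR加权或择优（函数名与示例数值均为说明用途的假设）：

```python
# 示意代码：按SNR（线性值）加权融合，或直接选择SNR较高的一路估计结果
def fuse_estimates(est_isac: float, snr_isac: float, est_radar: float, snr_radar: float,
                   mode: str = "weighted") -> float:
    if mode == "select":
        return est_isac if snr_isac >= snr_radar else est_radar
    w1, w2 = snr_isac, snr_radar
    return (w1 * est_isac + w2 * est_radar) / (w1 + w2)

print(fuse_estimates(10.2, 20.0, 9.8, 5.0))             # SNR加权
print(fuse_estimates(10.2, 20.0, 9.8, 5.0, "select"))   # 选择SNR较高的一路
```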
在本实施例中,感知测量量或感知测量量的运算结果或感知性能满足预设条件,包括以下至少一项:
感知目标关联信号分量的功率值满足第一门限或者功率值最大的;例如,两个接收天线/接收通道上的感知测量量进行除或共轭乘的运算结果(或其他运算结果)对应的感知目标关联信号分量的功率值满足第一门限;
感知SNR满足第二门限或者感知SNR最大的;
感知SINR满足第三门限或者感知SINR最大的;
至少检测到Y个感知目标;
基于检测所确定的感知目标对应的比特位图与网络侧设备配置的预设比特位图一致;
感知目标的雷达截面面积RCS满足第一预设条件或者RCS最大的;例如,感知目标的雷达截面面积RCS满足第一预设条件:例如第一预设条件是RCS达到X平方米,X是一个正实数;
感知目标的谱信息满足第二预设条件;例如,感知目标的谱信息满足第二预设条件:例如感知目标的距离-速率谱满足第二预设条件,此时的第二预设条件是距离-速率谱上能分辨出感知目标(距离-速率谱有一个点或者一个区域的幅度达到预设值或者幅度最大的);或者,感知目标的时延-多普勒谱满足第二预设条件,此时的第二预设条件是时延-多普勒谱上能分辨出感知目标(时延-多普勒谱有一个点或者一个区域的幅度达到预设值或者幅度最大的);
感知目标的第一参量满足第三预设条件,所述第一参量包括以下至少一项:时延、距离、多普勒、速度、角度信息;例如,感知目标的第二参量满足第三预设条件:例如感知目标的时延满足第三预设条件(例如时延满足一个区间值);再例如,感知目标的距离满足第三预设条件(例如距离满足一个区间值);再例如,感知目标的多普勒满足第三预设条件(例如多普勒满足一个区间值);再例如,感知目标的速度满足第三预设条件(例如速度满足一个区间值);再例如,感知目标的角度信息满足第三预设条件(例如角度信息满足一个区间值);
其中,Y为正整数。
上述感知测量量或感知测量量的运算结果或感知性能包括以下至少一项:
A101、感知目标关联信号分量的功率值;
例如,可以为感知径的功率值。
需要说明的是,所述感知目标关联信号分量的功率值为接收的第一信号中受感知目标影响较大的信号分量功率,可以是以下至少一项:
A1011、以接收的第一信号的频域信道响应中幅度最大的样值点对应的幅度为目标幅度计算得到的功率值,或以幅度最大的多个样值点对应的幅度为目标幅度计算得到的功率值;或以某一个指定子载波或物理资源块(Physical Resource Block,PRB)对应的样值点的幅度为目标幅度计算得到的功率值,或以多个指定子载波或PRB对应的样值点的幅度为目标幅度计算得到的功率值。
A1012、以接收的第一信号的频域信道响应的逆傅里叶变换(IFFT)结果(时延域)中幅度最大的样值点对应的幅度为目标幅度计算得到的功率值,或以幅度最大的多个样值点对应的幅度为目标幅度计算得到的功率值;
或者以特定时延范围内幅度最大的样值点对应的幅度为目标幅度计算得到的功率值,或以幅度最大的多个样值点对应的幅度为目标幅度计算得到的功率值。
A1013、以接收的第一信号的时域信道响应的傅里叶变换(FFT)结果(多普勒域)
中幅度最大的样值点对应的幅度为目标幅度计算得到的功率值,或以幅度最大的多个样值点对应的幅度为目标幅度计算得到的功率值;
或者以特定多普勒范围内幅度最大的样值点对应的幅度为目标幅度计算得到的功率值,或以幅度最大的多个样值点对应的幅度为目标幅度计算得到的功率值。
A1014、以接收的第一信号的信道响应的二维傅里叶变换结果,即时延-多普勒域结果中幅度最大的样值点对应的幅度为目标幅度计算得到的功率值,或以幅度最大的多个样值点对应的幅度为目标幅度计算得到的功率值;
或者以特定时延-多普勒范围内幅度最大的样值点对应的幅度为目标幅度计算得到的功率值,或以幅度最大的多个样值点对应的幅度为目标幅度计算得到的功率值。
需要说明的是,所述幅度最大也可以是幅度超过特定门限值,所述特定门限值可以是网络侧设备指示的,也可以是终端根据噪声和/或干扰功率计算得到的。
所述特定时延/多普勒范围与感知需求相关,可以是网络侧设备指示的,也可以是终端根据感知需求得到的。
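为便于理解上述A1012至A1014中由信道响应得到时延域/多普勒域结果、并提取幅度最大样值点对应功率的处理，下面给出一段示意性的Python代码（矩阵维度、是否加窗等均为示例假设，具体以实际实现为准）：

```python
import numpy as np

def delay_doppler_peak_power(H: np.ndarray):
    """H：K×N复数信道响应矩阵（K个子载波 × N个时间快拍）。"""
    dd = np.fft.fft(np.fft.ifft(H, axis=0), axis=1)   # 子载波维IFFT得到时延域，慢时间维FFT得到多普勒域
    mag = np.abs(dd)
    k, n = np.unravel_index(int(np.argmax(mag)), mag.shape)
    return float(mag[k, n] ** 2), (k, n)              # 幅度最大样值点对应的功率值及其时延/多普勒索引

rng = np.random.default_rng(0)
H = np.exp(1j * 2 * np.pi * rng.random((64, 32)))     # 随机示例数据，仅用于演示
power, idx = delay_doppler_peak_power(H)
```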
以雷达检测为例,所述感知目标关联信号分量的功率值为回波功率,回波信号功率的获取方法,可以是以下选项中的至少一项:
B11、基于回波信号快时间维FFT处理得到的时延一维图进行恒虚警检测(CFAR),以CFAR过门限的幅度最大样值点为目标样值点、以其幅度为目标信号幅度,如图9所示;
B12、基于回波信号慢时间维FFT处理得到的多普勒一维图进行CFAR，以CFAR过门限的幅度最大样值点为目标样值点、以其幅度为目标信号幅度，同图9所示；
B13、基于回波信号2D-FFT处理得到的时延-多普勒二维图进CFAR,以CFAR过门限的幅度最大样值点为目标样值点、以其幅度为目标信号幅度;
B14、基于回波信号3D-FFT处理得到的时延-多普勒-角度三维图进行CFAR,以CFAR过门限的幅度最大样值点为目标样值点、以其幅度为目标信号幅度;
需要说明的是,目标信号幅度的确定方法除以上的以CFAR过门限的幅度最大样值点为目标样值点以外,还可以是,以CFAR过门限的幅度最大样值点及其最邻近的若干个过门限样值点的均值作为目标信号幅度。
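为便于理解上述B11至B14中基于CFAR确定目标样值点及目标信号幅度的处理，下面给出一段对一维图进行单元平均恒虚警检测（CA-CFAR）的示意性Python代码（保护单元数、参考单元数及门限系数均为示例假设）：

```python
import numpy as np

def ca_cfar_peak(profile: np.ndarray, guard: int = 2, train: int = 8, alpha: float = 5.0):
    """profile为一维幅度序列；返回过门限且幅度最大的样值点索引及其幅度，未检出时返回None。"""
    n = len(profile)
    best_idx, best_amp = None, 0.0
    for i in range(n):
        lo, hi = max(0, i - guard - train), min(n, i + guard + train + 1)
        cells = np.concatenate([profile[lo:max(0, i - guard)], profile[min(n, i + guard + 1):hi]])
        if cells.size == 0:
            continue
        threshold = alpha * cells.mean()           # 以参考单元平均幅度乘以门限系数作为检测门限
        if profile[i] > threshold and profile[i] > best_amp:
            best_idx, best_amp = i, profile[i]
    return (best_idx, best_amp) if best_idx is not None else None

profile = np.array([1.0, 1.2, 0.9, 1.1, 9.5, 1.0, 1.3, 0.8, 1.1, 1.0, 0.9, 1.2])
print(ca_cfar_peak(profile, guard=1, train=3, alpha=4.0))   # 期望检出索引4处的峰值
```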
A102、感知信噪比(SNR);
例如,该感知SNR可以是感知目标关联信号分量的功率值与噪声功率的比值。
A103、第一信号与干扰加噪声比(SINR);
例如,该感知SINR可以是感知目标关联信号分量的功率值与噪声和干扰的功率之和的比值。
具体地,所述SNR/SINR的获取方法可以是:
B21、基于回波信号快时间维FFT处理得到的时延一维图进行恒虚警检测（CFAR），以CFAR过门限的幅度最大样值点为目标样值点、以其幅度为目标信号幅度，以一维图中距离目标样值点位置±ε个样值点以外的所有样值点为干扰/噪声样值点、并统计其平均幅度为干扰/噪声信号幅度，最后以目标信号幅度和干扰/噪声信号幅度计算SNR/SINR；
B22、基于回波信号慢时间维FFT处理得到的多普勒一维图进行CFAR,以CFAR过门限的幅度最大样值点为目标样值点、以其幅度为目标信号幅度,以一维图中距离目标样值点位置±η个样值点以外的所有样值点为干扰/噪声样值点、并统计其平均幅度为干扰/噪声信号幅度,最后以目标信号幅度和干扰/噪声信号幅度计算SNR/SINR;
B23、基于回波信号2D-FFT处理得到的时延-多普勒二维图进CFAR,以CFAR过门限的幅度最大样值点为目标样值点、以其幅度为目标信号幅度,以二维图中距离目标样值点±ε(快时间维)和±η(慢时间维)个样值点以外的所有样值点为干扰/噪声样值点、并统计其平均幅度为干扰/噪声信号幅度,最后以目标信号幅度和干扰/噪声信号幅度计算SNR/SINR;
B24、基于回波信号3D-FFT处理得到的时延-多普勒-角度三维图进行CFAR,以CFAR过门限的幅度最大样值点为目标样值点、以其幅度为目标信号幅度,以三维图中距离目标样值点±ε(快时间维)、±η(慢时间维)和±δ(角度维)个样值点以外的所有样值点为干扰/噪声样值点、并统计其平均幅度为干扰/噪声信号幅度,最后以目标信号幅度和干扰/噪声信号幅度计算SNR/SINR;
需要说明的是,目标信号幅度的确定方式除以上的以CFAR过门限的幅度最大样值点为目标样值点以外,还可以是,以CFAR过门限的幅度最大样值点及其最邻近的若干个过门限样值点的均值作为目标信号幅度;
需要说明的是，干扰/噪声样值点的确定方式还可以是根据上述确定的干扰/噪声样值点进一步筛选，筛选方式是：对于时延一维图，去除时延为0附近的若干个样值点，以剩下的干扰/噪声样值点作为噪声样值点；对于多普勒一维图，去除多普勒为0附近的若干个样值点，以剩下的干扰/噪声样值点为干扰/噪声样值点；对于时延-多普勒二维图，去除以时延为0附近若干个点、全部多普勒范围构成的条状范围的干扰/噪声样值点，以剩下的噪声样值点作为干扰/噪声样值点；对于时延-多普勒-角度三维图，去除以时延为0附近若干个点、全部多普勒范围和全部角度范围构成的切片状范围的干扰/噪声样值点，以剩下的干扰/噪声样值点作为干扰/噪声样值点。
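为便于理解上述B21至B24中干扰/噪声样值点的统计及SNR/SINR的计算方式，下面给出一段基于时延一维图的示意性Python代码（ε取值以及时延为0附近样值点的剔除个数均为示例假设）：

```python
import numpy as np

def snr_from_profile(profile: np.ndarray, target_idx: int, eps: int = 3, zero_exclude: int = 2) -> float:
    """在时延一维图上，剔除目标样值点±eps及时延为0附近的样值后，以剩余样值的平均幅度计算SNR（dB）。"""
    n = len(profile)
    mask = np.ones(n, dtype=bool)
    mask[max(0, target_idx - eps): min(n, target_idx + eps + 1)] = False  # 剔除目标附近样值
    mask[:zero_exclude] = False                                           # 剔除时延为0附近的样值
    noise_amp = profile[mask].mean()                                      # 干扰/噪声平均幅度
    return float(20.0 * np.log10(profile[target_idx] / noise_amp))        # 以幅度比换算为dB（等效于功率比）

profile = np.array([5.0, 4.0, 1.0, 1.2, 9.0, 1.1, 0.9, 1.0, 1.2, 1.1])
print(snr_from_profile(profile, target_idx=4, eps=1, zero_exclude=2))
```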
A104、感知目标是否存在;
可以包括以下至少一项:
是否存在速度或多普勒预设范围内的感知目标;
是否存在距离或时延预设范围内的感知目标。
A105、感知目标存在的目标个数;
可以包括以下至少一项:
存在速度或多普勒预设范围内的感知目标的目标个数;
存在距离或时延预设范围内的感知目标的目标个数。
需要说明的是,上述的A104和A105可以是根据感知需求由其他设备(例如,其他
终端,接入网设备或核心网设备)通知给终端的。
需要说明的是,判断是否有感知目标存在的方式可以是:例如,时延/多普勒一维或二维图中是否存在幅度超过特定门限值的样值点,若存在则认为检测到感知目标;时延/多普勒一维或二维图中幅度超过特定门限值的样值点的个数认为是感知目标的个数。
A106、感知目标的雷达截面面积(RCS)信息;
需要说明的是,该RCS信息可以是单个感知目标的RCS信息,也可以是多个感知目标的RCS信息。
A107、感知目标的谱信息;
需要说明的是,该谱信息可以包括以下至少一项:时延功率谱、多普勒功率谱、时延/距离-多普勒/速度谱、角度功率谱、时延/距离-角度谱、多普勒/速度-角度谱、时延/距离-多普勒/速度-角度谱。
A108、至少一个感知目标的时延;
A109、至少一个感知目标的距离;
A110、至少一个感知目标的多普勒;
A111、至少一个感知目标的速度;
A112、至少一个感知目标的角度信息。
在本实施例中,感知能力信息用于表示感知节点为了支持相应的感知业务所具备的硬件和软件的能力。对于任意的感知节点,所述的感知能力信息包括以下至少一项:
(1)是否支持通感类感知,以及在支持通感类感知的情况下的通感的感知能力信息;
(2)是否支持传感器类感知,以及在支持传感器类感知的情况下,包括以下至少一项:支持的传感器类型;对应各个支持的传感器类型的传感器的数量;对应各个支持的传感器的感知能力信息。
其中,通感的感知能力信息,见参考文献:ZL 202210016539.0。
其中,传感器的感知能力信息,包括以下至少一项:
(1)支持的感知业务类型,包括以下至少一项:
是否支持雷达探测业务,进一步包括:雷达测速、雷达测距、雷达测角、雷达成像;
是否支持用户定位和追踪业务;
是否支持三维重构业务,进一步包括:地形地貌重构、建筑物表面重构;
是否支持天气和/或空气质量检测业务,进一步包括:降雨检测、湿度检测、颗粒物(PM2.5/PM10)检测、降雪检测;
是否支持人流/车流检测业务;
是否支持健康监测业务,进一步包括:心跳监测、呼吸检测;
是否支持动作识别业务,进一步包括:手势识别、姿态识别、入侵检测。
支持的测量量。
(2)支持的测量量的QoS,对于任意一个支持的测量量,测量量的QoS包括以下至
少一项:
感知分辨率,包括以下至少之一:测距(或时延)分辨率、测速(或多普勒)分辨率、测角(方位角、俯仰角)分辨率、成像分辨率、加速度(X/Y/Z三个方向)分辨率、角速度(绕X/Y/Z三个轴)分辨率;
感知精度(误差),包括以下至少之一:测距(或时延)精度、测速(或多普勒)精度、测角(方位角、俯仰角)精度、加速度(X/Y/Z三个方向)精度、角速度(绕X/Y/Z三个轴)精度;
感知范围,包括以下至少之一:距离(或时延)测量范围、速度(或多普勒)测量范围、加速度(X/Y/Z三个方向)测量范围、角速度(绕X/Y/Z三个轴)测量范围、成像范围;
感知时延(从第一信号发送到获得感知结果的时间间隔,或,从感知需求发起到获取感知结果的时间间隔);
感知更新速率(相邻两次执行感知并获得感知结果的时间间隔);
检测概率(在感知对象存在的情况下被正确检测出来的概率);
虚警概率(在感知对象不存在的情况下错误检测出感知目标的概率);
目标个数;
覆盖范围:满足上述性能要求至少一项要求的感知目标/成像区域的空间范围。
本实施例中的感知测量量可以包括以下至少一项:
a)第一级测量量(接收信号/原始信道信息),第一级测量量包括:接收信号/信道响应复数结果,幅度/相位,I路/Q路及其运算结果中的至少一项;
其中,运算包括加减乘除、矩阵加减乘、矩阵转置、三角关系运算、平方根运算和幂次运算等,以及上述运算结果的门限检测结果、最大/最小值提取结果等中的至少一项;运算还包括快速傅里叶变换(Fast Fourier Transform,FFT)/快速傅里叶逆变换(Inverse Fast Fourier Transform,IFFT)、离散傅里叶变换(Discrete Fourier Transform,DFT)/离散傅里叶逆变换(Inverse Discrete Fourier Transform,IDFT)、2D-FFT、3D-FFT、匹配滤波、自相关运算、小波变换和数字滤波等,以及上述运算结果的门限检测结果、最大/最小值提取结果等中的至少一项;
b)第二级测量量(基本测量量),第二级测量量可以包括:时延、多普勒、角度、强度,及其多维组合表示中的至少一项;
c)第三级测量量(基本属性/状态),第三级测量量可以包括:距离、速度、朝向、空间位置、加速度中的至少一项;
d)第四级测量量(进阶属性/状态),第四级测量量可以包括:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分中的至少一项。
可选的,上述感知测量量还包括感知测量量对应的标签信息,标签信息可以包括以下
至少一项:
(1)第一信号的标识信息;
(2)感知测量配置标识信息;
(3)感知业务信息,例如,感知业务标识(ID)等;
(4)数据订阅ID;
(5)测量量用途,例如,通信、感知、通感等;
(6)时间信息;
(7)感知节点信息,例如,终端ID、节点位置、设备朝向等;
(8)感知链路信息,例如,感知链路序号、收发节点标识等;
可选的，感知链路信息包括：接收天线或接收通道的标识，如果是单个接收天线或接收通道的感知测量量，该标识是该接收天线或接收通道的标识；如果是两个接收天线或接收通道的除或共轭乘的结果，该标识是该两个接收天线或接收通道的标识，以及除或共轭乘运算的标识；
(9)测量量说明信息;
例如,测量量的形式,例如,幅度值、相位值、幅度和相位结合的复数值;测量量的资源类型,例如时域测量结果、频域资源测量结果;
(10)测量量指标信息,例如,SNR、感知SNR。
本实施例中的传统切换的事件的配置,参见表4。
表4
以A3事件为例,进入条件和离开条件的各参数含义如下:
Mn:邻区测量结果,不考虑任何偏移;
Ofn:邻区测量对象特定偏移量;
Ocn:邻区小区级特定偏移量;
Mp:SpCell(主服务小区)测量结果,不考虑任何偏移;
Ofp:SpCell测量对象特定偏移量;
Ocp:SpCell小区级特定偏移量;
Hys:事件的滞后参数;
Off:事件的偏移参数。
为了避免乒乓切换,基站CondTriggerConfig中针对每一事件配置timeToTrigger参数,当一个或多个候选小区在timeToTrigger时间内的L3滤波信号质量都满足事件的进入条件时,UE将满足条件的小区作为触发小区,在触发小区选择一个执行条件重配。
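为便于理解上述A3事件的判决与timeToTrigger处理，下面给出一段示意性的Python代码（其中进入条件采用常见的Mn+Ofn+Ocn-Hys>Mp+Ofp+Ocp+Off形式，输入为层3滤波后的测量结果，各参数取值均为示例假设）：

```python
def a3_entry(mn, mp, ofn=0.0, ocn=0.0, ofp=0.0, ocp=0.0, hys=1.0, off=2.0) -> bool:
    """A3进入条件：Mn + Ofn + Ocn - Hys > Mp + Ofp + Ocp + Off（单位均为dB/dBm）。"""
    return (mn + ofn + ocn - hys) > (mp + ofp + ocp + off)

def a3_triggered(mn_series, mp_series, time_to_trigger_ms, period_ms, **offsets) -> bool:
    """仅当连续time_to_trigger_ms时间内的每个测量采样都满足进入条件时，才认为事件触发。"""
    need = max(1, int(time_to_trigger_ms // period_ms))    # 需要连续满足条件的采样次数
    consecutive = 0
    for mn, mp in zip(mn_series, mp_series):               # 输入为层3滤波后的邻区/服务小区测量序列
        consecutive = consecutive + 1 if a3_entry(mn, mp, **offsets) else 0
        if consecutive >= need:
            return True
    return False

# 示例：邻区信号逐渐变好，timeToTrigger=320ms，测量周期120ms
print(a3_triggered([-95.0, -93.0, -92.0, -91.0], [-96.0, -96.0, -96.0, -96.0], 320, 120, hys=1.0, off=2.0))
```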
参见图10,本申请实施例提供一种移动性管理装置,应用于第一节点,装置1000包括:
测量模块1001,用于第一节点进行感知测量,得到测量结果;
第一发送模块1002,用于向第二节点发送测量报告,所述测量报告包括所述测量结果,所述测量结果用于确定是否进行感知切换;其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
在本申请的一种实施方式中,所述测量结果还包括通感类感知的测量量,
所述通感类感知的测量量包括以下至少一项:
第一级测量量,所述第一级测量量包括:接收信号/信道响应复数结果,幅度/相位,I
路/Q路及其运算结果中的至少一项;
第二级测量量,所述第二级测量量包括:时延、多普勒、角度、强度中的至少一项;
第三级测量量,所述第三级测量量包括:距离、速度、朝向、空间位置、加速度中的至少一项;
第四级测量量,所述第四级测量量包括:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分中的至少一项。
在本申请的一种实施方式中,所述装置还包括:
第一接收模块,用于接收切换命令,所述切换命令用于指示所述第一节点作为目标感知节点执行感知;
感知模块,用于根据所述切换命令执行感知。
其中,所述目标感知节点的感知方式与源感知节点的感知方式相同或不同,所述感知方式包括:传感器类感知和通感类感知中的至少一种,所述通感类感知包括下行感知、上行感知、终端回波感知、基站回波感知、基站间空口感知、终端间空口感知中的至少一种。
在本申请的一种实施方式中,所述装置还包括:
第二接收模块,用于接收所述第二节点发送的测量配置信息,所述测量配置信息用于配置所述第一节点进行感知测量。
在本申请的一种实施方式中,所述测量配置信息包括以下至少一项:
(1)传感器标识;
(2)传感器类型;
(3)传感器感知的测量量;
(4)上报的测量量;
(5)上报的测量量的说明信息;
(6)感知条件;
(7)感知目标或感知区域先验信息;
(8)测量报告配置;
(9)测量事件,所述测量事件包括以下至少一项:感知测量量或感知测量量的运算结果或感知性能满足预设条件;感知目标的状态发生变化;参与感知的感知节点位置发生变化。
在本申请的一种实施方式中,所述传感器感知的测量量包括以下至少一项:
(1)激光雷达相关的测量量;
(2)视觉相关的测量量;
(3)雷达相关的测量量;
(4)惯性测量单元相关的测量量;
(5)其他测量量,所述其他测量量包括以下至少一项:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分。
在本申请的一种实施方式中,所述激光雷达相关的测量量包括以下至少一项:激光雷达点云数据、根据激光雷达点云数据得到的目标的角度和/或距离、从激光雷达点云数据中识别出的目标的视觉特征、从激光雷达点云数据中识别出的目标数量;
和/或,
所述视觉相关的测量量包括以下至少一项:视觉图像、图像像素的光度、图像像素的RGB值、从图像中识别出的目标的视觉特征、从图像中识别出的目标的角度和/或距离、从图像中识别出的目标的数量;
和/或,
所述雷达相关的测量量包括以下至少一项:雷达点云、识别出的目标的距离、速度、和/或角度、雷达成像、目标的数量;
和/或,
所述惯性测量单元相关的测量量包括以下至少一项:加速度、速度、角速度。
在本申请的一种实施方式中,
第二发送模块,用于向所述第二节点发送第一信息,所述第一信息包括以下至少一项:感知节点信息;感知节点的感知能力信息;感知的权限信息。
在本申请的一种实施方式中,所述第一节点包括终端,所述第二节点包括源基站和/或感知功能网元,或者,所述第一节点包括:源基站和/或候选目标基站,所述第二节点包括感知功能网元;或者,所述第一节点包括源终端和/或候选目标终端,所述第二节点包括感知功能网元;或者,所述第一节点包括候选目标基站,所述第二节点包括源基站;或者,第一节点包括:应用功能和/或应用服务器,所述第二节点包括源基站和/或感知功能网元。
本申请实施例提供的装置能够实现图4方法实施例实现的各个过程,并达到相同的技术效果,为避免重复,这里不再赘述。
参见图11,本申请实施例提供一种移动性管理装置,应用于第二节点,装置1100包括:
第三接收模块1101,用于接收第一节点发送的测量报告,所述测量报告包括测量结果;
第三发送模块1102,用于在第二节点根据所述测量结果确定进行感知切换的情况下,所述第二节点发送第一请求信息,所述第一请求信息用于请求所述第三节点进行感知;
第三接收模块1103,用于接收第三节点发送的第一应答信息,所述第一应答信息用于指示所述第三节点同意执行感知;
第四发送模块1104,用于向所述第三节点发送切换命令,所述切换命令用于指示所述第三节点作为目标感知节点执行感知;
其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
在本申请的一种实施方式中,所述测量结果还包括通感类感知的测量量,所述通感类
感知的测量量包括以下至少一项:
第一级测量量,所述第一级测量量包括:接收信号/信道响应复数结果,幅度/相位,I路/Q路及其运算结果中的至少一项;
第二级测量量,所述第二级测量量包括:时延、多普勒、角度、强度中的至少一项;
第三级测量量,所述第三级测量量包括:距离、速度、朝向、空间位置、加速度中的至少一项;
第四级测量量,所述第四级测量量包括:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分中的至少一项。
在本申请的一种实施方式中,所述目标感知节点的感知方式与源感知节点的感知方式相同或不同;所述感知方式包括:传感器类感知和通感类感知中的至少一种,所述通感类感知包括下行感知、上行感知、终端回波感知、基站回波感知、基站间空口感知、终端间空口感知中的至少一种。
在本申请的一种实施方式中,所述装置还包括:
获取模块,用于获取感知节点的第一信息;
其中,所述第一信息包括以下至少一项:感知节点的信息;感知节点的感知能力信息;感知的权限信息。
在本申请的一种实施方式中,所述装置还包括:
第五发送模块,用于向所述第一节点发送测量配置信息,所述测量配置信息用于配置所述第一节点进行感知测量。
在本申请的一种实施方式中,所述测量配置信息包括以下至少一项:
(1)传感器标识;
(2)传感器类型;
(3)传感器感知的测量量;
(4)上报的测量量;
(5)上报的测量量的说明信息;
(6)感知条件;
(7)感知目标或感知区域先验信息;
(8)测量报告配置;
(9)测量事件,所述测量事件包括以下至少一项:感知测量量或感知测量量的运算结果或感知性能满足预设条件;感知目标的状态发生变化;参与感知的感知节点位置发生变化。
在本申请的一种实施方式中,所述传感器感知的测量量包括以下至少一项:
(1)激光雷达相关的测量量;
(2)视觉相关的测量量;
(3)雷达相关的测量量;
(4)惯性测量单元相关的测量量;
(5)其他测量量,所述其他测量量包括以下至少一项:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分。
在本申请的一种实施方式中,所述激光雷达相关的测量量包括以下至少一项:激光雷达点云数据、根据激光雷达点云数据得到的目标的角度和/或距离、从激光雷达点云数据中识别出的目标的视觉特征、从激光雷达点云数据中识别出的目标数量;
和/或,
所述视觉相关的测量量包括以下至少一项:视觉图像、图像像素的光度、图像像素的RGB值、从图像中识别出的目标的视觉特征、从图像中识别出的目标的角度和/或距离、从图像中识别出的目标的数量;
和/或,
所述雷达相关的测量量包括以下至少一项:雷达点云、识别出的目标的距离、速度、和/或角度、雷达成像、目标的数量;
和/或,
所述惯性测量单元相关的测量量包括以下至少一项:加速度、速度、角速度。
在本申请的一种实施方式中,所述第一节点包括终端,所述第二节点包括源基站和/或感知功能网元,或者,所述第一节点包括:源基站和/或候选目标基站,所述第二节点包括感知功能网元;或者,所述第一节点包括源终端和/或候选目标终端,所述第二节点包括感知功能网元;或者,所述第一节点包括候选目标基站,所述第二节点包括源基站;或者,第一节点包括:应用功能和/或应用服务器,所述第二节点包括源基站和/或感知功能网元。
本申请实施例提供的装置能够实现图5方法实施例实现的各个过程,并达到相同的技术效果,为避免重复,这里不再赘述。
本申请实施例还提供一种终端,具体地,图12为实现本申请实施例的一种终端的硬件结构示意图。
该终端1200包括但不限于:射频单元1201、网络模块1202、音频输出单元1203、输入单元1204、传感器1205、显示单元1206、用户输入单元1207、接口单元1208、存储器1209以及处理器1210等中的至少部分部件。
本领域技术人员可以理解,终端1200还可以包括给各个部件供电的电源(比如电池),电源可以通过电源管理系统与处理器1210逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。图12中示出的终端结构并不构成对终端的限定,终端可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置,在此不再赘述。
应理解的是,本申请实施例中,输入单元1204可以包括图形处理器(Graphics Processing Unit,GPU)12041和麦克风12042,图形处理器12041对在视频捕获模式或图
像捕获模式中由图像捕获装置(如摄像头)获得的静态图片或视频的图像数据进行处理。显示单元1206可包括显示面板12061,可以采用液晶显示器、有机发光二极管等形式来配置显示面板12061。用户输入单元1207包括触控面板12071以及其他输入设备12072中的至少一种。触控面板12071,也称为触摸屏。触控面板12071可包括触摸检测装置和触摸控制器两个部分。其他输入设备12072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。
本申请实施例中,射频单元1201接收来自网络侧设备的下行数据后,可以传输给处理器1210进行处理;另外,射频单元1201可以向网络侧设备发送上行数据。通常,射频单元1201包括但不限于天线、放大器、收发信机、耦合器、低噪声放大器、双工器等。
存储器1209可用于存储软件程序或指令以及各种数据。存储器1209可主要包括存储程序或指令的第一存储区和存储数据的第二存储区,其中,第一存储区可存储操作系统、至少一个功能所需的应用程序或指令(比如声音播放功能、图像播放功能等)等。此外,存储器1209可以包括易失性存储器或非易失性存储器,或者,存储器1209可以包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(Read-Only Memory,ROM)、可编程只读存储器(Programmable ROM,PROM)、可擦除可编程只读存储器(Erasable PROM,EPROM)、电可擦除可编程只读存储器(Electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(Random Access Memory,RAM),静态随机存取存储器(Static RAM,SRAM)、动态随机存取存储器(Dynamic RAM,DRAM)、同步动态随机存取存储器(Synchronous DRAM,SDRAM)、双倍数据速率同步动态随机存取存储器(Double Data Rate SDRAM,DDRSDRAM)、增强型同步动态随机存取存储器(Enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(Synch link DRAM,SLDRAM)和直接内存总线随机存取存储器(Direct Rambus RAM,DRRAM)。本申请实施例中的存储器1209包括但不限于这些和任意其它适合类型的存储器。
处理器1210可包括一个或多个处理单元;可选的,处理器1210集成应用处理器和调制解调处理器,其中,应用处理器主要处理涉及操作系统、用户界面和应用程序等的操作,调制解调处理器主要处理无线通信信号,如基带处理器。可以理解的是,上述调制解调处理器也可以不集成到处理器1210中。
本申请实施例提供的终端能够实现图4中第一节点为终端、源终端或候选目标终端时所示方法实施例的各个步骤,并达到相同的技术效果,为避免重复,这里不再赘述。
可选的，如图13所示，本申请实施例还提供一种网络侧设备1300，包括处理器1301和存储器1302，存储器1302上存储有可在所述处理器1301上运行的程序或指令，在该网络侧设备包括源基站、候选目标基站、应用功能和/或应用服务器的情况下，该程序或指令被处理器1301执行时可以实现上述图4方法实施例的各个步骤，且能达到相同的技术效果；在该网络侧设备包括源基站和/或感知功能网元的情况下，该程序或指令被处理器1301执行时可以实现上述图5方法实施例的各个步骤，且能达到相同的技术效果，为避
免重复,这里不再赘述。
本申请实施例还提供一种可读存储介质,所述可读存储介质上存储有程序或指令,该程序或指令被处理器执行时实现图4或图5方法及上述各个实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
其中,所述处理器为上述实施例中所述的通信设备中的处理器。所述可读存储介质,包括计算机可读存储介质,如计算机只读存储器ROM、随机存取存储器RAM、磁碟或者光盘等。
本申请实施例另提供了一种芯片,所述芯片包括处理器和通信接口,所述通信接口和所述处理器耦合,所述处理器用于运行程序或指令,实现图4或图5所示及上述各个方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
应理解,本申请实施例提到的芯片还可以称为系统级芯片,系统芯片,芯片系统或片上系统芯片等。
本申请实施例另提供了一种计算机程序/程序产品,所述计算机程序/程序产品被存储在存储介质中,所述计算机程序/程序产品被至少一个处理器执行以实现图4或图5所示及上述各个方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
本申请实施例另提供一种通信系统,所述通信系统包括终端与网络侧设备,所述终端用于执行如图4及上述各个方法实施例的各个过程,所述网络侧设备用于执行如图4或图5及上述各个方法实施例的各个过程,且能达到相同的技术效果,为避免重复,这里不再赘述。
需要说明的是,在本文中,术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者装置不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者装置所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括该要素的过程、方法、物品或者装置中还存在另外的相同要素。此外,需要指出的是,本申请实施方式中的方法和装置的范围不限按示出或讨论的顺序来执行功能,还可包括根据所涉及的功能按基本同时的方式或按相反的顺序来执行功能,例如,可以按不同于所描述的次序来执行所描述的方法,并且还可以添加、省去、或组合各种步骤。另外,参照某些示例所描述的特征可在其他示例中被组合。
通过以上的实施方式的描述,本领域的技术人员可以清楚地了解到上述实施例方法可借助软件加必需的通用硬件平台的方式来实现,当然也可以通过硬件,但很多情况下前者是更佳的实施方式。基于这样的理解,本申请的技术方案本质上或者说对现有技术做出贡献的部分可以以计算机软件产品的形式体现出来,该计算机软件产品存储在一个存储介质(如ROM/RAM、磁碟、光盘)中,包括若干指令用以使得一台终端(可以是手机,计算机,服务器,空调器,或者网络侧设备等)执行本申请各个实施例所述的方法。
上面结合附图对本申请的实施例进行了描述,但是本申请并不局限于上述的具体实施方式,上述的具体实施方式仅仅是示意性的,而不是限制性的,本领域的普通技术人员在本申请的启示下,在不脱离本申请宗旨和权利要求所保护的范围情况下,还可做出很多形式,均属于本申请的保护之内。
Claims (21)
- 一种移动性管理方法,包括:第一节点进行感知测量,得到测量结果;所述第一节点向第二节点发送测量报告,所述测量报告包括所述测量结果,所述测量结果用于确定是否进行感知切换;其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
- 根据权利要求1所述的方法,其中,所述测量结果还包括通感类感知的测量量,所述通感类感知的测量量包括以下至少一项:第一级测量量,所述第一级测量量包括:接收信号/信道响应复数结果,幅度/相位,I路/Q路及其运算结果中的至少一项;第二级测量量,所述第二级测量量包括:时延、多普勒、角度、强度中的至少一项;第三级测量量,所述第三级测量量包括:距离、速度、朝向、空间位置、加速度中的至少一项;第四级测量量,所述第四级测量量包括:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分中的至少一项。
- 根据权利要求1或2所述的方法,所述方法还包括:所述第一节点接收切换命令,所述切换命令用于指示所述第一节点作为目标感知节点执行感知;所述第一节点根据所述切换命令执行感知。
- 根据权利要求1所述的方法,所述方法还包括:所述第一节点接收所述第二节点发送的测量配置信息,所述测量配置信息用于配置所述第一节点进行感知测量。
- 根据权利要求4所述的方法,其中,所述测量配置信息包括以下至少一项:传感器标识;传感器类型;传感器感知的测量量;上报的测量量;上报的测量量的说明信息;感知条件;感知目标或感知区域先验信息;测量报告配置;测量事件。
- 根据权利要求1或5所述的方法,其中,所述传感器感知的测量量包括以下至少一项:激光雷达相关的测量量;视觉相关的测量量;雷达相关的测量量;惯性测量单元相关的测量量;其他测量量。
- 根据权利要求6所述的方法,其中,所述激光雷达相关的测量量包括以下至少一项:激光雷达点云数据、根据激光雷达点云数据得到的目标的角度和/或距离、从激光雷达点云数据中识别出的目标的视觉特征、从激光雷达点云数据中识别出的目标数量;和/或,所述视觉相关的测量量包括以下至少一项:视觉图像、图像像素的光度、图像像素的RGB值、从图像中识别出的目标的视觉特征、从图像中识别出的目标的角度和/或距离、从图像中识别出的目标的数量;和/或,所述雷达相关的测量量包括以下至少一项:雷达点云、识别出的目标的距离、速度、和/或角度、雷达成像、目标的数量;和/或,所述惯性测量单元相关的测量量包括以下至少一项:加速度、速度、角速度。
- 根据权利要求1所述的方法,所述方法还包括:所述第一节点向所述第二节点发送第一信息,所述第一信息包括以下至少一项:感知节点信息;感知节点的感知能力信息;感知的权限信息。
- 根据权利要求1所述的方法,其中,所述第一节点包括终端,所述第二节点包括源基站和/或感知功能网元,或者,所述第一节点包括:源基站和/或候选目标基站,所述第二节点包括感知功能网元;或者,所述第一节点包括源终端和/或候选目标终端,所述第二节点包括感知功能网元;或者,所述第一节点包括候选目标基站,所述第二节点包括源基站;或者,所述第一节点包括:应用功能和/或应用服务器,所述第二节点包括源基站和/或感知功能网元。
- 一种移动性管理方法,包括:第二节点接收第一节点发送的测量报告,所述测量报告包括测量结果;所述第二节点发送第一请求信息,所述第一请求信息用于请求第三节点进行感知;所述第二节点接收所述第三节点发送的第一应答信息,所述第一应答信息用于指示所述第三节点同意执行感知;所述第二节点向所述第三节点发送切换命令,所述切换命令用于指示所述第三节点作为目标感知节点执行感知;其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
- 根据权利要求10所述的方法,其中,所述测量结果还包括通感类感知的测量量,所述通感类感知的测量量包括以下至少一项:第一级测量量,所述第一级测量量包括:接收信号/信道响应复数结果,幅度/相位,I路/Q路及其运算结果中的至少一项;第二级测量量,所述第二级测量量包括:时延、多普勒、角度、强度中的至少一项;第三级测量量,所述第三级测量量包括:距离、速度、朝向、空间位置、加速度中的至少一项;第四级测量量,所述第四级测量量包括:目标是否存在、轨迹、动作、表情、生命体征、数量、成像结果、天气、空气质量、形状、材质、成分中的至少一项。
- 根据权利要求10所述的方法,所述方法还包括:所述第二节点获取感知节点的第一信息;其中,所述第一信息包括以下至少一项:感知节点的信息;感知节点的感知能力信息;感知的权限信息。
- 根据权利要求10所述的方法,所述方法还包括:所述第二节点向所述第一节点发送测量配置信息,所述测量配置信息用于配置所述第一节点进行感知测量。
- 根据权利要求13所述的方法,其中,所述测量配置信息包括以下至少一项:传感器标识;传感器类型;传感器感知的测量量;上报的测量量;上报的测量量的说明信息;感知条件;感知目标或感知区域先验信息;测量报告配置;测量事件。
- 根据权利要求10或14所述的方法,其中,所述传感器感知的测量量包括以下至少一项:激光雷达相关的测量量;视觉相关的测量量;雷达相关的测量量;惯性测量单元相关的测量量;其他测量量。
- 根据权利要求15所述的方法,其中,所述激光雷达相关的测量量包括以下至少一项:激光雷达点云数据、根据激光雷达点云数据得到的目标的角度和/或距离、从激光雷达点云数据中识别出的目标的视觉特征、从激光雷达点云数据中识别出的目标数量;和/或,所述视觉相关的测量量包括以下至少一项:视觉图像、图像像素的光度、图像像素的RGB值、从图像中识别出的目标的视觉特征、从图像中识别出的目标的角度和/或距离、从图像中识别出的目标的数量;和/或,所述雷达相关的测量量包括以下至少一项:雷达点云、识别出的目标的距离、速度、和/或角度、雷达成像、目标的数量;和/或,所述惯性测量单元相关的测量量包括以下至少一项:加速度、速度、角速度。
- 根据权利要求10所述的方法,其中,所述第一节点包括终端,所述第二节点包括源基站和/或感知功能网元,或者,所述第一节点包括:源基站和/或候选目标基站,所述第二节点包括感知功能网元;或者,所述第一节点包括源终端和/或候选目标终端,所述第二节点包括感知功能网元;或者,所述第一节点包括候选目标基站,所述第二节点包括源基站;或者,所述第一节点包括:应用功能和/或应用服务器,所述第二节点包括源基站和/或感知功能网元。
- 一种移动性管理装置,包括:测量模块,用于第一节点进行感知测量,得到测量结果;第一发送模块,用于向第二节点发送测量报告,所述测量报告包括所述测量结果,所述测量结果用于确定是否进行感知切换;其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
- 一种移动性管理装置,包括:第三接收模块,用于接收第一节点发送的测量报告,所述测量报告包括测量结果;第三发送模块,用于向第三节点发送第一请求信息,所述第一请求信息用于请求所述第三节点进行感知;第三接收模块,用于接收第三节点发送的第一应答信息,所述第一应答信息用于指示所述第三节点同意执行感知;第四发送模块,用于向所述第三节点发送切换命令,所述切换命令用于指示所述第三节点作为目标感知节点执行感知;其中,所述测量结果包括以下至少一项:传感器感知的测量量;融合的测量结果,所述融合的测量结果是根据传感器感知的测量量和通感类感知的测量量融合处理得到的。
- 一种通信设备,包括处理器,存储器及存储在所述存储器上并可在所述处理器上运行的程序或指令,所述程序或指令被所述处理器执行时实现如权利要求1至17中任一项所述的方法的步骤。
- 一种可读存储介质,所述可读存储介质上存储程序或指令,所述程序或指令被处理器执行时实现如权利要求1至17中任一项所述的方法的步骤。
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211667944.5 | 2022-12-23 | ||
CN202211667944.5A CN118250755A (zh) | 2022-12-23 | 2022-12-23 | 移动性管理方法、装置、通信设备及可读存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024131760A1 true WO2024131760A1 (zh) | 2024-06-27 |
Family
ID=91561401
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2023/139736 WO2024131760A1 (zh) | 2022-12-23 | 2023-12-19 | 移动性管理方法、装置、通信设备及可读存储介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN118250755A (zh) |
WO (1) | WO2024131760A1 (zh) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105744540A (zh) * | 2014-12-10 | 2016-07-06 | 电信科学技术研究院 | 一种基于网络感知的服务策略配置方法及网络设备 |
US20210298046A1 (en) * | 2016-09-27 | 2021-09-23 | Zte Corporation | Data processing method, node and terminal |
CN114222363A (zh) * | 2021-12-17 | 2022-03-22 | 京信网络系统股份有限公司 | 终端位置识别方法、装置及定位系统 |
CN115118867A (zh) * | 2021-03-23 | 2022-09-27 | 成都极米科技股份有限公司 | 交互方法、装置、显示设备及存储介质 |
- 2022-12-23 CN CN202211667944.5A patent/CN118250755A/zh active Pending
- 2023-12-19 WO PCT/CN2023/139736 patent/WO2024131760A1/zh unknown
Also Published As
Publication number | Publication date |
---|---|
CN118250755A (zh) | 2024-06-25 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23905923; Country of ref document: EP; Kind code of ref document: A1 |