Dear Authors,

Apologies if I'm misreading, but the paper mentions injecting noise with a standard deviation of 150 m into the coordinates associated with augmented images, and noise with a standard deviation of 1000 m into the locations in the GPS queue during training. As I understand it, this is meant to augment the location coordinates in addition to the images.
However, I'm not entirely sure how the noise injection is accomplished. Is the noise added to the original lon-lat coordinates, which are then embedded by the location embedder? If so, how is the noise translated from meters into GPS coordinates? Or is the noise somehow added after the location has been embedded, and if so, how would that work? My assumption is that the meter offsets are converted into lon-lat degrees and added to the original coordinates before embedding, roughly along the lines of the sketch below, but I couldn't tell for certain from the paper.
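For concreteness, here is roughly what I had in mind; this is purely my own guess, and `location_embedder` is a hypothetical placeholder rather than a name from your code:

```python
import math
import random

# Purely a guess at the ordering: perturb the raw lat/lon in degree space first,
# then feed the noisy coordinates to the location embedder.
# `location_embedder` is a hypothetical placeholder, not a name from the paper's code.
def perturb_then_embed(lat_deg, lon_deg, location_embedder, std_m=150.0):
    m_per_deg_lat = 110_574.0                                    # approx. meters per degree of latitude
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat_deg))  # shrinks toward the poles
    noisy_lat = lat_deg + random.gauss(0.0, std_m / m_per_deg_lat)
    noisy_lon = lon_deg + random.gauss(0.0, std_m / m_per_deg_lon)
    return location_embedder(noisy_lat, noisy_lon)
```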
Additionally, is noise with a standard deviation of 1000 m re-injected into every entry of the GPS queue with each new batch? Or does the injection happen only once, when a batch is pushed into the queue, and that batch is never perturbed again (the sketch after this paragraph is what I mean by the latter)? For this one, I suspect it doesn't matter much, since the queue entries serve as negatives anyway and just need to differ from the correct location.
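For clarity, a minimal sketch of the "inject once, at enqueue time" reading; the queue class and the `add_gps_noise` argument are my own placeholders, not your implementation:

```python
import torch

# My own sketch of the "noised once, at enqueue time" interpretation -- not the paper's code.
# `add_gps_noise` stands in for whatever noise routine is actually used.
class GPSQueue:
    def __init__(self, max_size=4096):
        self.max_size = max_size
        self.entries = torch.empty(0, 2)  # [N, 2] lat/lon negatives

    def enqueue(self, gps_batch, add_gps_noise, std_dev_m=1000):
        noisy = add_gps_noise(gps_batch, std_dev_m)               # noise applied exactly once here
        self.entries = torch.cat([noisy, self.entries])[: self.max_size]
        # entries already in the queue are reused as-is for later batches
        return self.entries
```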
Is it possible to get some clarifications or code related to this?
Thank you for publishing this paper though, it was really interesting!
```python
# for augmentation training
import math

import torch


def add_gps_noise_simple_batch(gps, std_dev_m=150):
    """
    Adds Gaussian noise with a specified standard deviation in meters to a batch of GPS coordinates.

    Parameters:
    - gps (torch.Tensor): Tensor of shape [batch_size, 2], where gps[:, 0] = lat, gps[:, 1] = lon.
    - std_dev_m (float): Standard deviation of the noise in meters (default is 150 meters).

    Returns:
    - torch.Tensor: Noisy GPS coordinates of shape [batch_size, 2].
    """
    lat = gps[:, 0]
    lon = gps[:, 1]

    # Meters-per-degree conversion factors
    meters_per_deg_lat = 110574  # approximate meters per degree of latitude
    # Meters per degree of longitude shrinks with latitude; degrees -> radians done manually
    meters_per_deg_lon = 40075000 * torch.cos(lat * math.pi / 180.0) / 360

    # Generate Gaussian noise in degrees for latitude
    delta_lat = torch.normal(
        mean=0.0,
        std=std_dev_m / meters_per_deg_lat,
        size=lat.size(),
        device=gps.device,
    )

    # Compute the per-sample standard deviation for longitude, with a fallback near the poles
    std_lon = torch.where(
        meters_per_deg_lon == 0.0,
        torch.tensor(1e-8, device=gps.device),  # fallback small value to avoid division by zero
        std_dev_m / meters_per_deg_lon,
    )

    # Generate Gaussian noise in degrees for longitude
    delta_lon = torch.normal(
        mean=torch.zeros_like(std_lon),  # tensor of zeros with the same shape as std_lon
        std=std_lon,
    )

    # Add the noise to the original coordinates
    noisy_lat = lat + delta_lat
    noisy_lon = lon + delta_lon

    return torch.stack([noisy_lat, noisy_lon], dim=-1)
```
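For reference, a minimal usage sketch of the function above, using the two standard deviations the paper mentions (150 m for coordinates paired with augmented images, 1000 m for entries going into the GPS queue); the example coordinates are arbitrary:

```python
import torch

# Arbitrary example coordinates (lat, lon)
gps = torch.tensor([[40.7128, -74.0060],   # New York
                    [48.8566,   2.3522]])  # Paris

# Noise for coordinates paired with augmented images (SD = 150 m)
augmented_gps = add_gps_noise_simple_batch(gps, std_dev_m=150)

# Noise for entries pushed into the GPS queue (SD = 1000 m)
queue_gps = add_gps_noise_simple_batch(gps, std_dev_m=1000)

print(augmented_gps)
print(queue_gps)
```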