I can see why other visualization tools are reluctant to embrace nanoseconds: it explodes the number of cases that need to be handled:
Not all databases send nanoseconds relative to the same epoch: some count from 1970, others from 2000.
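For example, kdb+ counts nanoseconds from 2000-01-01 while most Unix-derived systems count from 1970-01-01, so values have to be normalized on ingest. A minimal sketch (the function name and offset constant are illustrative, not from this codebase):

```typescript
// 946,684,800 seconds between 1970-01-01 and 2000-01-01 (UTC), as nanoseconds.
const EPOCH_2000_OFFSET_NS = 946_684_800_000_000_000n;

// Normalize a nanosecond timestamp to the Unix (1970) epoch.
function toUnixEpochNs(value: bigint, epoch: "1970" | "2000"): bigint {
  return epoch === "2000" ? value + EPOCH_2000_OFFSET_NS : value;
}

toUnixEpochNs(0n, "2000"); // 946684800000000000n, i.e. 2000-01-01T00:00:00Z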
You can't send nanos via JSON as a number; the precision gets lost. Not all IPC supports BigInt, so you need to send something inefficient (e.g. strings).
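Concretely, a JavaScript Number is an IEEE-754 double, so integers past 2^53 - 1 round silently, and current-era epoch nanoseconds sit around 1.7 × 10^18, well past that. A quick illustration of the loss and the usual string workaround:

```typescript
const ns = 1700000000123456789n;               // a current-era nanosecond timestamp

Number(ns);                                    // 1700000000123456800 -- rounded
JSON.parse('{"ts": 1700000000123456789}').ts;  // 1700000000123456800 -- same loss

// Workaround: put the value on the wire as a string, revive it as a BigInt.
const wire = JSON.stringify({ ts: ns.toString() });
BigInt(JSON.parse(wire).ts);                   // 1700000000123456789n -- intact
```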
On the receiving end, you can't use JS Dates, since they only resolve to milliseconds. That means the "table" representation in JSON for dates/times vs. timestamps/timespans needs to be an entirely different type.
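One plausible shape for such a type (a sketch, not the project's actual representation): keep a millisecond part that existing Date-based code can still consume, plus the sub-millisecond remainder a Date cannot hold.

```typescript
interface NanoTimestamp {
  epochMs: number; // ~1.7e12 today, safely inside a double
  nanos: number;   // 0..999_999 sub-millisecond remainder
}

// Split a Unix-epoch nanosecond value; assumes non-negative (post-1970) input.
function fromEpochNs(ns: bigint): NanoTimestamp {
  return {
    epochMs: Number(ns / 1_000_000n),
    nanos: Number(ns % 1_000_000n),
  };
}
```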
That in turn means both timestamp and timespan would need custom date/time formatters in parallel with the current 5-6 date formatting options.
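Such a formatter would have to sit alongside the existing Date-based ones. A self-contained sketch that leans on Date for everything down to milliseconds and appends the remaining six digits (names are illustrative):

```typescript
// Format a Unix-epoch nanosecond timestamp as ISO-8601 with nine fractional digits.
function formatEpochNs(ns: bigint): string {
  const isoMs = new Date(Number(ns / 1_000_000n)).toISOString(); // "...T22:13:20.123Z"
  const subMs = (ns % 1_000_000n).toString().padStart(6, "0");   // remaining digits
  return isoMs.replace("Z", subMs + "Z");
}

formatEpochNs(1700000000123456789n); // "2023-11-14T22:13:20.123456789Z"
```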
Once the value reaches the graph, charts don't support numbers that large with differences that small: float64 axis math collapses adjacent nanosecond values. So that would require a different rendering pipeline.
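A common fix is to rebase against the visible window's origin before handing values to the chart: do the subtraction in BigInt, and the small delta then fits a double exactly. A sketch under that assumption:

```typescript
// Rebase epoch-nanosecond points against a window origin so doubles stay exact.
function rebaseNs(points: bigint[], origin: bigint): number[] {
  // Deltas stay exact below 2**53 ns, i.e. for any window shorter than ~104 days.
  return points.map((p) => Number(p - origin));
}

Number(1700000000123456789n) === Number(1700000000123456790n); // true -- collapsed

rebaseNs([1700000000123456789n, 1700000000123456790n], 1700000000000000000n);
// [123456789, 123456790] -- distinguishable again
```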
We already have a date-optimized time-series pipeline alongside the standard charts, so this would add a third. Both the nanosecond and non-nanosecond paths would need to stay optimized.
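Hypothetically, the dispatch that implies looks something like this (a sketch of the architecture described above, not the actual code):

```typescript
type Series =
  | { kind: "standard"; x: number[] }                  // plain numeric pipeline
  | { kind: "dateMs"; x: number[] }                    // existing date-optimized pipeline
  | { kind: "nano"; x: bigint[]; origin: bigint };     // new rebased-BigInt pipeline

function render(series: Series): void {
  switch (series.kind) {
    case "standard": /* existing numeric path */ break;
    case "dateMs":   /* existing Date-optimized path */ break;
    case "nano":     /* rebase against origin, then draw */ break;
  }
}
```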
Given all of this, I'm going to pause here until there's commitment from customers to push it forward.