IPLD Support · Issue #200 · orbitdb-archive/ipfs-log · GitHub
This repository was archived by the owner on Sep 30, 2023. It is now read-only.

IPLD Support #200

Closed
aphelionz opened this issue Dec 13, 2018 · 18 comments · Fixed by #213

Comments

@aphelionz
Contributor

From: ipfs-inactive/dynamic-data-and-capabilities#50

Support IPLD now that js-ipfs has it via the dag.put and dag.get functions. This allows us to use the IPLD query language and use explorer.ipld.io to debug.
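
For context, a minimal sketch of what basic dag API usage looks like, written against the js-ipfs dag API of the time; the `entry` object here is illustrative, not the actual ipfs-log entry format:

```js
// Sketch only: store a plain JS object as dag-cbor and read it back.
// `ipfs` is an already-initialized js-ipfs node.
const entry = { payload: 'hello', next: [] }

const cid = await ipfs.dag.put(entry, { format: 'dag-cbor', hashAlg: 'sha2-256' })
console.log(cid.toBaseEncodedString()) // a CID you can paste into explorer.ipld.io

const { value } = await ipfs.dag.get(cid)
console.log(value.payload) // 'hello'

// IPLD path queries traverse links, e.g. ipfs.dag.get(cid, 'next/0')
```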

@aphelionz
Contributor Author

cc @satazor

@satazor
Contributor
satazor commented Dec 13, 2018

Also related to #106

@satazor
Contributor
satazor commented Dec 15, 2018

@aphelionz @haadcode it would be awesome if we could upgrade to the latest js-ipfs version, because the dag API has changed since the version that is currently installed, see: https://github.com/ipfs/js-ipfs/blob/master/CHANGELOG.md#breaking-changes-1

Do you see any problem with upgrading first?

@aphelionz
Contributor Author

No problem. In fact, if you look at this PR, I've already upgraded with no issues: https://github.com/orbitdb/ipfs-log/pull/189/files#diff-b9cfc7f2cdf78a7f4b91a753d10865a2R29

@satazor
Contributor
satazor commented Dec 15, 2018

Got the thing almost ready. It’s difficult when you need to access documentation but you don’t have internet on a plane :(. Well, one more flight to go, then I’m finally home.

@satazor
Contributor
satazor commented Dec 17, 2018

An update: I got all tests passing, but I had to increase the timeout to 60 seconds. It seems that by using CBOR, things got a lot slower! I'm still investigating.

@satazor
Contributor
satazor commented Dec 17, 2018

e.g.:

These are the ipfs.object.get durations:

get0.8680925440123695: 0.003ms
get0.05914218311229735: 0.004ms
get0.8980442040689645: 0.003ms
get0.7928380906629751: 0.003ms
get0.29532819056810555: 0.003ms
get0.982201991275975: 0.003ms
get0.6871858848757975: 0.004ms
get0.7882878992144131: 0.002ms
get0.21936588684517888: 0.002ms
get0.08061932663511051: 0.003ms
get0.04032113690632366: 0.002ms
get0.8311022697713366: 0.003ms
get0.45769189600923266: 0.003ms
get0.19011790185669564: 0.003ms
get0.43031950638495764: 0.003ms
get0.336247208160362: 0.013ms
get0.03922621786243807: 0.003ms

vs ipfs.dag.get:

get0.033568382253189366: 125.828ms
get0.11738735943129086: 124.878ms
get0.8056916343416749: 123.779ms
get0.9619766611151739: 123.146ms
get0.8896502548868206: 122.545ms
get0.9369903801095469: 122.092ms
get0.04327821966495504: 121.405ms
get0.42971773048294604: 120.747ms
get0.24380508364928022: 118.174ms
get0.12392763385470262: 116.792ms
get0.7405273519923308: 115.561ms
get0.11536795042400794: 114.486ms
get0.04598372191645783: 112.277ms
get0.7646575609750215: 110.135ms
get0.033475819043584654: 109.260ms
get0.72246736048451: 108.503ms

This is orders of magnitude slower. cc @alanshaw, could you please advise? 🙏
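
(The labels above look like console.time()/console.timeEnd() with a random suffix; assuming that's how the numbers were gathered, the measurement pattern is roughly:)

```js
// Assumed measurement pattern, not the actual benchmark code:
// time a single get with a randomly suffixed label so overlapping
// calls don't clobber each other's timer.
async function timedGet (ipfs, cid) {
  const label = 'get' + Math.random()
  console.time(label)
  const result = await ipfs.dag.get(cid) // or ipfs.object.get(cid)
  console.timeEnd(label)                 // prints e.g. "get0.123…: 125.828ms"
  return result
}
```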

@satazor
Contributor
satazor commented Dec 17, 2018

I've switched to dag-pb with dag.put and dag.get and it's still slow. This means it's not an issue with dag-cbor. Still investigating.

@haadcode
Member

There's quite a big difference in these numbers. Could it be something to do with fetching from disk (dag.get) vs. fetching from cache (object.get)?

@shamb0t mentioned this issue Dec 19, 2018
@shamb0t
Contributor
shamb0t commented Dec 19, 2018

Hey @satazor, I looked into this today and made changes to use the ipfs.dag API. I haven't experienced this slowdown. Which version of Node are you using?

@satazor
Contributor
satazor commented Dec 19, 2018

@shamb0t I've just checked out your branch and I get the exact same slowdown. Basically, the replication tests time out because they take so long. It seems to be something related to my machine.

➜  ipfs-log git:(feat/ipfs-dag) ✗ node --version
v10.14.2
➜  ipfs-log git:(feat/ipfs-dag) ✗ npm --version
6.5.0

I will ask @vasco-santos to test on his machine.

@vasco-santos

I have tried it out with @shamb0t's PR orbitdb/ipfs-log#210, with a fresh clone and install, but the replication tests fail due to a timeout 😕

[screenshot: replication tests failing with timeout errors]

@satazor
Contributor
satazor commented Dec 19, 2018

So, it seems that @vasco-santos is having the same problem I'm experiencing. Raising the timeout to 60s works around it, but that specific test passes in under 4s when using the object API vs. 40+ seconds when using the dag API.
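
(For reference, in a mocha suite like ipfs-log's, bumping a test timeout to 60s is typically done like this; exactly where it was changed here is an assumption:)

```js
describe('replication', function () {
  // Raise the per-suite timeout from the default to 60 seconds so the
  // dag-API-backed replication tests have time to finish.
  this.timeout(60000)

  it('replicates a log between two nodes', async () => {
    // ...
  })
})
```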

@shamb0t
Contributor
shamb0t commented Dec 20, 2018

@satazor @vasco-santos thanks, guys! Looks like this is indeed an issue; I'll look into it further.

@haadcode
Member

A quick clarification on:

Raising the timeout to 60s works around it, but that specific test passes in under 4s when using the object API vs. 40+ seconds when using the dag API.

In the replication tests, we use mem-store, which is an in-memory "mock" for ipfs.object.get/put. It makes the tests run a lot faster. We use it by monkey-patching object.get/put in the test setup (before()). @shamb0t fixed this by changing the setup to use mem-store for ipfs.dag.get/put, and I believe the replication test again completes in 3-4 seconds, as before.
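
To make the mechanism concrete, here is a rough sketch of what such a monkey-patch could look like in the test setup; the names and CID handling are illustrative, not the actual mem-store code:

```js
// Illustrative only: back ipfs.dag.put/get with an in-memory Map inside
// before() so the replication tests never touch the on-disk block store.
const crypto = require('crypto')

const memStore = new Map()

before(() => {
  ipfs.dag.put = async (value) => {
    // The real mem-store computes a proper CID; a plain hash stands in here.
    const key = crypto.createHash('sha256')
      .update(JSON.stringify(value))
      .digest('hex')
    memStore.set(key, value)
    return key
  }
  ipfs.dag.get = async (key) => ({ value: memStore.get(key) })
})
```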

@shamb0t
Contributor
shamb0t commented Dec 20, 2018

I didn't realize we were monkey-patching to avoid hitting disk 😅, so that explains the slowdown without it. As @haadcode mentioned, this is fixed now and the replication tests should complete in the usual 3-4s.

@satazor
Contributor
satazor commented Dec 28, 2018

PR at #213
This supersedes #210, but thanks @shamb0t for tackling this as well.

Sorry for the delay, I'm currently on vacation but I wanted to finish this! Happy Xmas and New Year, everyone 🎄🎉

@shamb0t
Contributor
shamb0t commented Dec 28, 2018

Happy holidays everyone and thank you @satazor for the PR! ❤️ we'll continue from there
