
dramatic performance boost for consecutive rospack calls [backport to indigo] #49

Closed
wants to merge 11 commits

Conversation

rhaschke
Contributor

Consecutive calls to the rospack library, e.g. ros::package::getPath(), from within the same process re-crawled the ROS_PACKAGE_PATH every time, because a new rospack instance was created for each call.
This PR (in line with a similar PR on roslib) improves things in two respects:

  • a static rospack::Rospack instance is used in ROSPack::run(), thus re-using the same instance for all consecutive calls (a sketch follows below)
  • in Rosstackage::crawl(), the cache is loaded only once at the beginning, while search_paths_ is still empty
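
A minimal sketch of the first point, written as a free function instead of the real ROSPack::run() member; crawl() is taken from the diff below, while getSearchPathFromEnv() and find() are assumed helper APIs:

#include <string>
#include <vector>
#include "rospack/rospack.h"

// Sketch of the static-instance idea: the Rospack object (and with it the
// crawl results) is constructed once and re-used by all consecutive calls.
std::string get_path_cached(const std::string& pkg) {
    static rospack::Rospack rp;     // constructed on the first call only
    std::vector<std::string> sp;
    rp.getSearchPathFromEnv(sp);    // assumed helper reading ROS_PACKAGE_PATH
    rp.crawl(sp, false);            // cheap after the first call (second point)
    std::string path;
    rp.find(pkg, path);             // assumed name -> path lookup
    return path;
}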

The following simple test program (doing 3000 calls to getPath()) runs about 160 times faster. With this patch:

time ./foo
real    0m0.264s
user    0m0.251s
sys     0m0.012s

Without it (re-crawling on every call):

time ./foo
real    0m49.456s
user    0m42.500s
sys     0m6.840s
#include <ros/package.h>

int main(int argc, char *argv[]) {
    // 3000 consecutive lookups; without the patch, each call re-crawls ROS_PACKAGE_PATH
    for (int i = 0; i < 3000; ++i) {
        ros::package::getPath("roscpp");
    }
    return 0;
}

@rhaschke
Contributor Author

This goes along with roslib PR #97.

@@ -331,16 +331,13 @@ Rosstackage::crawl(std::vector<std::string> search_path,
 {
   if(!force)
   {
-    if(readCache())
+    // read the cache only once (when search_paths_ is still empty)
+    if(search_paths_.empty() && readCache())
Member


Why do you want to prevent using the existing cache here? This will result in worse performance when this function is called multiple times.

Contributor Author


Indeed, if the current search_path has changed, we should read the corresponding cache. I adapted the code such that Rosstackage::crawl() now has the following code paths (see the sketch after this list):

  • crawling is forced -> just (re)crawl
  • current search_path matches the previous one and rospack has already crawled: nothing to do [new]
  • current search_path diverges from the previous one: read the corresponding cache
    • on success: nothing else to do
    • on failure (no cache or too old): re-crawl
  • on re-crawling, set search_paths_ accordingly
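
A self-contained sketch of that control flow (not the real rospack code; crawled_, readCache() and crawlDirectories() are hypothetical stand-ins):

#include <string>
#include <vector>

class CrawlerSketch {
    std::vector<std::string> search_paths_;
    bool crawled_;

    // stand-in: load the cache matching the given path, false if missing/stale
    bool readCache(const std::vector<std::string>&) { return false; }
    // stand-in for the actual filesystem walk
    void crawlDirectories(const std::vector<std::string>&) {}

public:
    CrawlerSketch() : crawled_(false) {}

    void crawl(std::vector<std::string> search_path, bool force) {
        if (!force) {
            if (search_paths_ == search_path && crawled_)
                return;                       // same path, already crawled: nothing to do
            if (readCache(search_path)) {
                search_paths_ = search_path;  // cache covered the new path: done
                return;
            }
            // cache missing or too old: fall through and re-crawl
        }
        search_paths_ = search_path;          // remember the path we are (re-)crawling
        crawlDirectories(search_path);
        crawled_ = true;
    }
};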

@rhaschke
Contributor Author

@dirk-thomas Did you already see my answers to your remarks?

return false;
}
return true;
}
Member


How is this function different from the comparison operator of a vector (http://en.cppreference.com/w/cpp/container/vector/operator_cmp)?

Contributor Author


Valid point. I didn't check for the availability of vector::operator==; I had simply moved the existing verbose code into its own function. The function is now replaced by std::vector::operator==.
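
For illustration, the element-wise comparison that replaces the hand-written helper (the paths are hypothetical):

#include <cassert>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> a(1, "/opt/ros/indigo/share");
    std::vector<std::string> b(1, "/opt/ros/indigo/share");
    // std::vector::operator== compares sizes first, then elements in order --
    // exactly what the removed helper did by hand.
    assert(a == b);
    return 0;
}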

@gerkey
Contributor
gerkey commented Dec 9, 2015

I've started working on this in two branches:

https://github.com/ros/ros/tree/reuse_cache
https://github.com/ros/rospack/tree/reuse_cache

I applied @rhaschke's changes and added a test to rospack that exercises the effect of in-process changes to ROS_PACKAGE_PATH: https://github.com/ros/rospack/blob/reuse_cache/test/test/utest.cpp#L129

Sadly, that test passes on the first run, but fails on the second run and all subsequent runs. I'm still looking into it.
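
The gist of such a test, sketched with hypothetical package trees (the real utest.cpp differs; find() is an assumed name-to-path lookup, and gtest_main provides main()):

#include <cstdlib>
#include <string>
#include <vector>
#include <gtest/gtest.h>
#include "rospack/rospack.h"

// Sketch only: change ROS_PACKAGE_PATH within one process and expect the
// second crawl to reflect the new setting instead of stale cached results.
TEST(rospack, env_change_sketch) {
    rospack::Rospack rp;
    std::string path;

    setenv("ROS_PACKAGE_PATH", "/tmp/tree_a", 1);                // hypothetical tree
    rp.crawl(std::vector<std::string>(1, "/tmp/tree_a"), false);
    EXPECT_TRUE(rp.find("pkg_only_in_a", path));                 // hypothetical package

    setenv("ROS_PACKAGE_PATH", "/tmp/tree_b", 1);                // switch trees in-process
    rp.crawl(std::vector<std::string>(1, "/tmp/tree_b"), false);
    EXPECT_FALSE(rp.find("pkg_only_in_a", path));                // stale results must be gone
}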

@rhaschke
Contributor Author
rhaschke commented Dec 9, 2015

Hm. I only tested that manually. I will have a look at your branch, too.

@rhaschke
Contributor Author
rhaschke commented Dec 9, 2015

@gerkey Created pull request #52 to fix the unit test.

@rhaschke
Contributor Author
rhaschke commented Dec 9, 2015

I had some issues compiling the unit test. In my catkin tools environment, the binary rospack-utest wasn't created by default; I had to run make rospack-utest manually. Is that normal behavior?

@gerkey
Contributor
gerkey commented Dec 9, 2015

Thanks for the fixes in #52.

But I still have failures elsewhere after the initial run, apparently related to whether the cache is available:

gerkey@rossum:~/code/rospack-caching/build/rospack/test$ rm -rf ~/.ros/rospack_cache_*
gerkey@rossum:~/code/rospack-caching/build/rospack/test$ ../../../devel/lib/rospack/rospack-utest 
[==========] Running 4 tests from 1 test case.
[----------] Global test environment set-up.
[----------] 4 tests from rospack
[ RUN      ] rospack.reentrant
[       OK ] rospack.reentrant (4 ms)
[ RUN      ] rospack.multiple_rospack_objects
[       OK ] rospack.multiple_rospack_objects (6 ms)
[ RUN      ] rospack.deduplicate_tokens
[       OK ] rospack.deduplicate_tokens (0 ms)
[ RUN      ] rospack.env_change
[       OK ] rospack.env_change (2 ms)
[----------] 4 tests from rospack (12 ms total)

[----------] Global test environment tear-down
[==========] 4 tests from 1 test case ran. (12 ms total)
[  PASSED  ] 4 tests.
gerkey@rossum:~/code/rospack-caching/build/rospack/test$ ../../../devel/lib/rospack/rospack-utest 
[==========] Running 4 tests from 1 test case.
[----------] Global test environment set-up.
[----------] 4 tests from rospack
[ RUN      ] rospack.reentrant
[       OK ] rospack.reentrant (4 ms)
[ RUN      ] rospack.multiple_rospack_objects
[       OK ] rospack.multiple_rospack_objects (5 ms)
[ RUN      ] rospack.deduplicate_tokens
[       OK ] rospack.deduplicate_tokens (0 ms)
[ RUN      ] rospack.env_change
/home/gerkey/code/rospack-caching/src/rospack/test/test/utest.cpp:188: Failure
Value of: test_pkgs.size()
  Actual: 3
Expected: output_list.size()
Which is: 4
/home/gerkey/code/rospack-caching/src/rospack/test/test/utest.cpp:204: Failure
Value of: found
  Actual: false
Expected: true
/home/gerkey/code/rospack-caching/src/rospack/test/test/utest.cpp:214: Failure
Value of: std::string()
  Actual: ""
Expected: output
Which is: "precedence1\nprecedence2\nprecedence3\nroslang"
[  FAILED  ] rospack.env_change (2 ms)
[----------] 4 tests from rospack (11 ms total)

[----------] Global test environment tear-down
[==========] 4 tests from 1 test case ran. (11 ms total)
[  PASSED  ] 3 tests.
[  FAILED  ] 1 test, listed below:
[  FAILED  ] rospack.env_change

 1 FAILED TEST

@gerkey
Contributor
gerkey commented Dec 9, 2015

To build and run the test (assuming you have a workspace ws with ros and rospack):

cd ws
catkin_make tests
# The test is sensitive to CWD
cd build/rospack/test
../../../devel/lib/rospack/rospack-utest

You can also run all the tests via catkin_make run_tests, but you have to do some scrolling to see the output of this particular test.

@gerkey
Contributor
gerkey commented Dec 9, 2015

Here's more detail, with some instrumentation added to the test:

[ RUN      ] rospack.env_change
/home/gerkey/code/rospack-caching/src/rospack/test/test/utest.cpp:188: Failure
Value of: test_pkgs.size()
  Actual: 3
Expected: output_list.size()
Which is: 4
output_list: 
  precedence1
  precedence2
  precedence3
  roslang
test_pkgs: 
  precedence1
  precedence2
  precedence3

Looks like the result from the first run, with one setting for ROS_PACKAGE_PATH, is persisting into the second run, where that var has been changed.

@rhaschke, can you look into this and get back when it's working?

@rhaschke
Contributor Author
rhaschke commented Dec 9, 2015

Yes, of course. I will do it.

@rhaschke
Contributor Author
rhaschke commented Dec 9, 2015

readCache() simply adds more stackages without clearing the stackages_ list beforehand...
Fixed in #53.

@gerkey
Contributor
gerkey commented Dec 10, 2015

Thanks for the fixes in #53. I'll open a new PR now.

@rhaschke
Contributor Author

The cache itself was valid. However, reading the cache simply added new stackages without wiping out existing entries first.
This worked for the first run, because the internal list was still empty anyway. Additionally, the cache invalidates after 60s, thus enforcing a re-crawl, which also wipes out the internal storage beforehand.
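
A minimal sketch of the fix as it would sit in Rosstackage::readCache(); validateCache() and clearStackages() are assumed/hypothetical helpers, and the parsing is elided:

bool Rosstackage::readCache() {
    FILE* cache = validateCache();   // assumed helper: NULL if cache missing or stale
    if (!cache)
        return false;
    clearStackages();                // the fix: wipe old entries before re-reading
    // ... parse cache lines and add stackages, as before ...
    fclose(cache);
    return true;
}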

@rhaschke
Contributor Author

I rebased onto the same indigo-devel commit from which #54 started. Subsequently, I rebased all additional commits from #54 onto this original PR to keep a proper history (see the comments here).
#54 and #49 are identical again (except for a permission change that was lost in #54 for some reason).
Hence, I suggest closing #54 and continuing here.

@rhaschke
Contributor Author

@dirk-thomas, @gerkey
friendly ping on this PR (and its counterpart ros/ros#97)

@gerkey
Contributor
gerkey commented Feb 1, 2016

We discussed this today and agreed that it should be merged into Jade; then, depending on whether there are any problems, we'll consider backporting it to Indigo.

@dirk-thomas
Member

@ros-pull-request-builder test this please

dirk-thomas pushed a commit that referenced this pull request Mar 1, 2016
@dirk-thomas
Member

This patch has been merged into the newly created jade-devel branch: 2dbf9d8

If there are no regressions, it will be considered for backporting into indigo-devel.

@dirk-thomas dirk-thomas changed the title dramatic performance boost for consecutive rospack calls dramatic performance boost for consecutive rospack calls [backport to indigo] Mar 1, 2016
@dirk-thomas
Member

I have updated the issue title to indicate that only the backport is pending.

dirk-thomas pushed a commit that referenced this pull request Jun 27, 2016
@dirk-thomas
Member

Since this has been released into Kinetic as well as Jade for a while without any regressions, I cherry-picked the change to the indigo-devel branch as well: 2ce1584

It will be part of the next patch release for Indigo.
