
Automated testing for kernel performance regressions

Posted Aug 4, 2012 10:52 UTC (Sat) by copsewood (subscriber, #199)
Parent article: Testing for kernel performance regressions

I've developed about 400 regression tests for my own web application, because upgrading between production versions with live data has taught me well enough why automated testing is needed. But the mind boggles trying to get my head around what's likely to be required in kernel space, which must be three orders of magnitude more complex.

In one sense, the current problem resulting from the lack of a coherent test facility might be equivalent to the earlier practice of trying to manage the kernel patch queue from an email spool, without source code revision control. BitKeeper and then Git were introduced not that long ago, and their absence must have constrained what Linus could realistically do. I suspect the test problem is also inherently likely to get worse until some kind of standardisation and incentive mechanism enables a major distributed community effort to come together into a coherent test facility. Even if it can't cover all automated test requirements, covering enough of them seems likely (if it's feasible at all) to greatly improve the quality of released software.

Those with the compute resources likely to be needed to crunch the test software (to the extent hardware emulation is possible) are not always the same people as those with the incentive to write the test cases, when it comes to generic as opposed to hardware-specific kernel features. The incentive to contribute hardware emulations would be to get hardware onto a 'platinum level' support list so that more purchasers buy it. The incentive to contribute test cases would be that your tests are automatically run on schedule, and regressions caused by newer software are automatically reported.

Could the Linux Foundation, working with other interested parties, attract the resources to fund and develop a cloud-type test facility? This probably wouldn't work for drivers unless either software emulators exist for the hardware in question, or the physical hardware could be installed within the test farm using various standardised protocols allowing for timed tests, automated output comparisons, resets and so on. I guess the funding would mainly come from hardware manufacturers who want the highest possible level of kernel support.
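To make "timed tests and automated output comparisons" concrete, here's a minimal sketch of what one node in such a rig might run: time a benchmark, compare it against a stored baseline, and report anything noticeably slower. The benchmark command, file names and 5% threshold are all hypothetical placeholders, not a real facility's interface:

    #!/usr/bin/env python3
    # Minimal sketch of a timed regression check. The baseline file,
    # test table and 5% threshold are hypothetical placeholders.
    import json
    import subprocess
    import sys
    import time

    BASELINE_FILE = "baseline.json"   # e.g. {"dbench_4_clients": 61.2}
    THRESHOLD = 0.05                  # flag runs more than 5% slower

    # A real rig would load contributed test definitions; one example here.
    TESTS = {"dbench_4_clients": ["dbench", "-t", "60", "4"]}

    def run_timed(cmd):
        # Wall-clock timing of the whole run, purely for illustration;
        # a real harness would parse the benchmark's own metrics.
        start = time.monotonic()
        subprocess.run(cmd, check=True, capture_output=True)
        return time.monotonic() - start

    def main():
        with open(BASELINE_FILE) as f:
            baseline = json.load(f)
        regressions = []
        for name, cmd in TESTS.items():
            elapsed = run_timed(cmd)
            old = baseline.get(name)
            if old is not None and elapsed > old * (1 + THRESHOLD):
                regressions.append(f"{name}: {old:.1f}s -> {elapsed:.1f}s")
        for line in regressions:
            print("regression:", line)
        sys.exit(1 if regressions else 0)

    if __name__ == "__main__":
        main()

The hard part isn't this loop, of course; it's keeping baselines stable across hardware and kernel configurations, which is exactly what a central, standardised facility would have to solve.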

So if this thought experiment ever leads to feasible development, there seem likely to be three main kinds of contributor:

a. A vendor-independent, trusted and funded community body which runs the main test rig, e.g. the Linux Foundation.

b. Contributors of generic test cases for kernel features intended to run on many different types of hardware.

c. Contributors of physical hardware requiring dedicated device drivers, software emulators for that hardware, and tests which can run on scaffolded or emulated hardware; a minimal sketch of such an emulated test follows this list. (Eventually, manufacturers could contractually commit to their customers to support physical and/or emulated hardware within this facility for a stated period after it ceases to be in production.)
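For item (c), the emulation side could be as simple as booting the freshly built kernel under QEMU and scanning the serial console for a pass marker printed by an in-guest test. A rough sketch, assuming qemu-system-x86_64 is installed and that a test initramfs exists whose /init exercises the driver and prints a marker; the paths and marker string are invented for illustration:

    #!/usr/bin/env python3
    # Sketch of an emulated-hardware smoke test: boot a kernel under
    # QEMU, capture the serial console, look for a pass marker.
    # KERNEL, INITRD and MARKER are hypothetical placeholders.
    import subprocess
    import sys

    KERNEL = "arch/x86/boot/bzImage"
    INITRD = "test-initramfs.img"
    MARKER = "DRIVER-TEST: PASS"

    def boot_and_check(timeout=120):
        cmd = [
            "qemu-system-x86_64",
            "-kernel", KERNEL,
            "-initrd", INITRD,
            "-append", "console=ttyS0 panic=1",
            "-nographic", "-no-reboot",
            "-m", "512",
        ]
        try:
            proc = subprocess.run(cmd, capture_output=True, text=True,
                                  timeout=timeout)
        except subprocess.TimeoutExpired:
            return False          # a hung boot counts as a failure
        return MARKER in proc.stdout

    if __name__ == "__main__":
        ok = boot_and_check()
        print("PASS" if ok else "FAIL")
        sys.exit(0 if ok else 1)

The resets mentioned above come free with emulation, since every run starts from a fresh virtual machine; physical hardware in the farm would need remote power control to get the same guarantee.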

There seem likely to be a few reasons why this isn't feasible, but I can't think of any quite yet.



