Testing is still a key challenge for OPNFV.
All the projects must manage their own test strategy (unit, functional, security, performance).
Several dedicated test projects have been approved by the TSC and already deal with:
- Defining test cases
- Performing tests not covered by a single project
- Creating tooling
- Studying end-to-end performance
OPNFV Test ecosystem
There are several projects dealing with integration and testing (see https://wiki.opnfv.org/).
A global overview can be described as follows:
See the Testing Ecosystem page for details on the elaboration of the figure.
We consider the projects referenced on the wiki main page:
|Project||Description|
|Functest||VIM and NFVI functional testing; umbrella project for functional testing|
|Yardstick||Verification of the infrastructure compliance when running VNF applications; umbrella project for performance testing|
|StorPerf||Storage performance testing|
|VSPERF||Data-plane performance testing|
|CPerf||Controller performance testing|
|Bottlenecks||Detect bottlenecks in the OPNFV solution|
|QTIP||Platform Performance Benchmarking|
|Dovetail||Test OPNFV validation criteria for use of OPNFV trademarks|
All the test projects are closely connected to additional projects:
- Pharos: The Pharos project deals with the creation of a federated NFV test capability that is geographically and technically diverse and hosted by different companies in the OPNFV community. This requires a baseline specification for an OPNFV "compliant" test environment, along with the tools, processes and documentation needed to support integration, testing and collaborative development projects.
- Releng: release engineering, which deals with Git/Gerrit and Jenkins management
The procedure to get support from the test projects is as follows:
- Contact the testing group (weekly meeting / mail / #opnfv-testperf / ...)
- Declare your project in the test DB: http://testresults.opnfv.org/testapi/test_projects
- Declare your test cases in the DB: http://testresults.opnfv.org/testapi/test_projects/doctor/cases (see the sketch after this list)
- Provide your constraints (scenarios/installers), e.g. Doctor => Apex only
- Provide your test success criteria, e.g. Doctor final status should be PASS
- Develop the test code: https://git.opnfv.org/cgit/doctor/tree/tests
- Create a JIRA ticket in Functest/Yardstick/xPerf for integration
- Work with the test team on integration (CI pipeline, dashboard, …)
- Troubleshoot the different scenarios
- Document your tests
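As an illustration of the project and test case declaration steps, here is a minimal sketch using Python's requests library; the payload field names, the descriptions and the /cases path under /api/v1 are assumptions, so check the Swagger spec referenced below for the authoritative schema:

```python
# Minimal sketch: declaring a project and one of its test cases in the test DB.
# The payload fields ("name", "description") and the /cases path under /api/v1
# are assumptions; the Swagger spec is authoritative.
import requests

API = "http://testresults.opnfv.org/test/api/v1"

# Declare the project.
resp = requests.post(API + "/projects",
                     json={"name": "doctor",
                           "description": "fault management and maintenance"})
resp.raise_for_status()

# Declare a test case under that project.
resp = requests.post(API + "/projects/doctor/cases",
                     json={"name": "doctor-notification",
                           "description": "fault notification test"})
resp.raise_for_status()
```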
As OPNFV hosts many testing projects, it makes sense to share some definitions across them.
A test case is a script performing one or several actions on the System Under Test and collecting the results of these actions to provide a global result or per-action results.
A test case must be declared in the test database:
Note that feature projects may also declare their own cases, e.g.:
- to get the list of the projects that have declared test cases: http://testresults.opnfv.org/test/api/v1/projects
- it is possible to use the Swagger interface to add projects and cases: http://testresults.opnfv.org/test/swagger/spec.html (see http://artifacts.opnfv.org/functest/colorado/docs/devguide/index.html for details)
Since Danube, the test API is automatically documented: http://artifacts.opnfv.org/releng/docs/testapi.html
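For example, a minimal sketch of querying that API for the declared projects; the layout of the JSON response is an assumption, so refer to the generated documentation above:

```python
# Minimal sketch: listing the projects that have declared test cases.
# The response layout ("projects" list with "name" fields) is an assumption.
import requests

resp = requests.get("http://testresults.opnfv.org/test/api/v1/projects")
resp.raise_for_status()
for project in resp.json().get("projects", []):
    print(project.get("name"))
```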
Agreed during the testing meeting of 1 December 2016: http://ircbot.wl.linuxfoundation.org/meetings/opnfv-testperf/2016/opnfv-testperf.2016-12-01-15.00.html
|Frequency||Test cases|
|weekly||Generic test full: Tempest and Rally full (OpenStack); Yardstick Tier 2 (should be daily); VSPERF and StorPerf lite test cases (after they integrate with Yardstick)|
|daily||Doctor, Promise, BGPVPN, security_scan, ...; Yardstick Tier 1 (HA, IPv6, SFC, KVM, ...)|
|daily||vPing, Tempest smoke, Rally sanity; Yardstick Tier 0 (lite generic test)|
A dedicated page has been created. The goal is to be able to say, for a given test case, whether the test passed or failed.
Test criteria may differ from one test case to another: the Tempest smoke criterion is that 100% of the tests are OK, while performance tests may define one or several thresholds.
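As an illustration, a minimal sketch of what such per-case criteria can look like in code; the function names and the threshold value are hypothetical:

```python
# Minimal sketch of per-case success criteria; names and thresholds are
# hypothetical illustrations, not the actual project implementations.
def tempest_smoke_passed(passed_tests, total_tests):
    # Tempest smoke criterion: 100% of the tests must be OK.
    return total_tests > 0 and passed_tests == total_tests

def latency_test_passed(measured_ms, threshold_ms=10.0):
    # A performance test may instead compare a metric against a threshold.
    return measured_ms <= threshold_ms
```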
Test cases can be developed by the test projects and/or by the feature projects.
Feature projects are responsible for their own tests as well as the associated troubleshooting.
Test projects are in charge of running their own test cases and of helping feature projects integrate into CI and meet the test criteria for the release.
Test coverage is not an easy task. Test projects must focus on the NFVI; however, in order to test the NFVI efficiently, it may be useful to test VNFs (which are out of OPNFV's scope).
We may suggest several views:
- per domain
- per component
- per ETSI domain
See the dedicated testing coverage page.
Test dashboards shall give a good overview of the different tests on the different PODs.
For Brahmaputra, 3 dashboards were created:
- a Linux Foundation dashboard using the test DB dataset through the test API
- a Functest home-made JavaScript dashboard using the test DB dataset through the test API
- a Grafana portal using the dataset pushed into InfluxDB for Yardstick
In Colorado, the target dashboards are:
- ELK (Elasticsearch/Logstash/Kibana) as an evolution of the Functest home-made JavaScript dashboard
- Grafana portal using dataset pushed into InfluxDB
Basically, if you are graphing test status, the first option is recommended. If you need to graph results = f(time) for longer-duration tests and visualize the evolution of bandwidth, latency, etc., InfluxDB/Grafana is recommended.
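For the second option, here is a minimal sketch of pushing one time-series datapoint into InfluxDB with the influxdb Python client, so that Grafana can graph its evolution; the host, database, tags and field names are hypothetical placeholders:

```python
# Minimal sketch: pushing a latency measurement into InfluxDB so that Grafana
# can graph results = f(time). Host, database, tags and field names are
# hypothetical placeholders.
from influxdb import InfluxDBClient

client = InfluxDBClient(host="influxdb.example.org", port=8086,
                        database="yardstick")
client.write_points([{
    "measurement": "latency",
    "tags": {"pod": "pod-example", "scenario": "os-nosdn-nofeature-ha"},
    "fields": {"value_ms": 4.2},
}])
```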
Contact the test working group for any questions.