

Introduction

Testing is still a key challenge for OPNFV.

All the projects must manage their own test strategy (unit, functional, security, performance).

Several dedicated test projects have been approved by the TSC and already cover:

  • Defining test cases
  • Performing tests not covered by a single project
  • Creating tooling
  • Studying end-to-end performance

OPNFV Test ecosystem

There are several projects dealing with integration and testing (see https://wiki.opnfv.org/).

Overview

A global overview can be described as follows:

See the Testing Ecosystem details page for the elaboration of the figure.

Project details

We consider the projects referenced on the wiki main page:

Project name: scope

  • Functest: VIM and NFVI functional testing; umbrella project for functional testing
  • Yardstick: verification of infrastructure compliance when running VNF applications; umbrella project for performance testing
  • Storperf: storage performance testing
  • VSperf: data-plane performance testing
  • CPerf: controller performance testing
  • Bottlenecks: detect bottlenecks in the OPNFV solution
  • QTIP: platform performance benchmarking
  • Dovetail: test OPNFV validation criteria for use of OPNFV trademarks

All the test projects are closely connected to additional projects:

  • Pharos: deals with the creation of a federated NFV test capability that is geographically and technically diverse and hosted by different companies in the OPNFV community. This requires a baseline specification for an OPNFV "compliant" test environment, along with the tools, processes and documentation needed to support integration, testing and collaborative development projects.
  • Releng: release engineering, which deals with git/gerrit and Jenkins management.

The procedure to get support from the test projects is as follows:

  1.  Contact the testing group (weekly meeting / mail / #opnfv-testperf / ...)
  2.  Declare your project in the test DB: http://testresults.opnfv.org/testapi/test_projects
  3.  Declare your test cases in the DB: http://testresults.opnfv.org/testapi/test_projects/doctor/cases
  4.  Provide your constraints (scenarios/installers), e.g. doctor => Apex only
  5.  Provide your test success criteria, e.g. the doctor final status should be PASS
  6.  Develop the code of the tests https://git.opnfv.org/cgit/doctor/tree/tests
  7.  Create JIRA in Functest/Yardstick/xPerf for integration
  8.  Work with test team on integration (CI pipeline, dashboard, …)
  9.  Troubleshoot on the different scenarios
  10.  Document your tests
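Steps 2 and 3 above can be scripted against the TestAPI. Here is a minimal sketch; the payload fields ("name", "description") and the helper names are assumptions made for illustration, so check the Swagger spec on testresults.opnfv.org for the authoritative schema before using it:

```python
# Sketch: declare a project and one of its test cases in the OPNFV test DB.
# The payload fields ("name", "description") are assumptions; verify them
# against the TestAPI Swagger spec before relying on this.
import json
import urllib.request

TESTAPI = "http://testresults.opnfv.org/test/api/v1"

def build_payload(name, description):
    """Build the JSON body for a project or case declaration."""
    return {"name": name, "description": description}

def declare(endpoint, payload):
    """POST a declaration to the TestAPI (performs a network call)."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

if __name__ == "__main__":
    # Step 2: declare the project, then step 3: declare one of its cases.
    declare(TESTAPI + "/projects",
            build_payload("doctor", "Fault management"))
    declare(TESTAPI + "/projects/doctor/cases",
            build_payload("doctor-notification", "Fault notification time"))
```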

 

Definitions

As OPNFV hosts many testing projects, it makes sense to share some definitions across them.

Testcase definition

A test case is a script performing one or several actions on the System Under Test and collecting the results of these actions, in order to provide a global or per-action result.

A test case must be declared in the test database.

Note that feature projects may also declare their own cases.

  • To get the list of projects that declared test cases: http://testresults.opnfv.org/test/api/v1/projects
  • It is possible to use the Swagger interface to add projects/cases: http://testresults.opnfv.org/test/swagger/spec.html (see http://artifacts.opnfv.org/functest/colorado/docs/devguide/index.html for details)

The test API is automatically documented (since Danube): http://artifacts.opnfv.org/releng/docs/testapi.html
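The list of declaring projects can be retrieved programmatically from the endpoint above. A small sketch, assuming the response is JSON with a top-level "projects" list (verify against the auto-generated TestAPI documentation):

```python
# Sketch: list the projects that declared test cases via the TestAPI.
# The response shape ({"projects": [{"name": ...}, ...]}) is an
# assumption; check the TestAPI documentation.
import json
import urllib.request

def project_names(raw_json):
    """Extract project names from a TestAPI /projects response body."""
    return [p["name"] for p in json.loads(raw_json).get("projects", [])]

if __name__ == "__main__":
    with urllib.request.urlopen(
            "http://testresults.opnfv.org/test/api/v1/projects") as resp:
        print(project_names(resp.read().decode("utf-8")))
```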

 

Tiers

Agreed during the testing meeting of 1/12: http://ircbot.wl.linuxfoundation.org/meetings/opnfv-testperf/2016/opnfv-testperf.2016-12-01-15.00.html

Level / Category / per-project coverage (Functest, Yardstick, VSPerf, Storperf, Bottlenecks, QTIP):

  8 - Other: Bottlenecks (X); QTIP (X)
  7 - In Service: (none)
  6 - Stress: (none)
  5 - VNF: Functest (weekly: vIMS)
  4 - Performance: Functest (N.R.); Yardstick (weekly: generic test full); VSPerf (X); Storperf (X)
  3 - Components: Functest (weekly: Tempest and Rally full, OpenStack); Yardstick (daily: Yardstick Tier 2, should be daily); VSperf and Storperf Lite test cases (after they integrate with Yardstick)
  2 - Features: Functest (daily: Doctor, Promise, bgpvpn, security_scan, ...); Yardstick (daily: Yardstick Tier 1: HA, IPv6, SFC, KVM, ...)
  1 - Smoke: Functest (daily: vPing, Tempest smoke, Rally Sanity); Yardstick (daily: Yardstick Tier 0, Lite generic test)
  0 - Healthcheck: Functest (gating); Yardstick (N.R.)

 

Test criteria

A dedicated page has been created. The goal is to be able to say, for a given test case, whether the test passed or failed.

Test criteria may differ according to the test case: the Tempest smoke criterion is that 100% of the tests are OK, while performance tests may define one or several thresholds.
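The two criterion styles above can be expressed as small predicate functions. A sketch with illustrative names (this is not an OPNFV API):

```python
# Sketch of the two criterion styles described above (names are
# illustrative, not an OPNFV API).

def tempest_smoke_passed(n_tests, n_ok):
    """Tempest smoke criterion: 100% of the tests must be OK."""
    return n_tests > 0 and n_ok == n_tests

def performance_passed(measured, thresholds):
    """Performance criterion: every measured metric must stay within
    its (low, high) threshold, e.g. {"latency_ms": (0, 50)}."""
    return all(low <= measured[metric] <= high
               for metric, (low, high) in thresholds.items())
```

For example, `performance_passed({"latency_ms": 30}, {"latency_ms": (0, 50)})` declares the run a PASS, while a measured latency of 80 ms would fail it.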

Troubleshooting

Test cases can be developed by the test projects and/or by the feature projects. 

Feature projects are responsible for their own tests as well as the associated troubleshooting.

Test projects are in charge of running their own test cases and of helping feature projects integrate into CI to meet the test criteria for the release.

Test coverage

Achieving good test coverage is not an easy task. Test projects must focus on the NFVI; however, in order to test the NFVI efficiently, it may be useful to test VNFs (which are out of OPNFV scope).
We may suggest several views:

  • per domain
  • per component
  • per ETSI domain

See the dedicated testing coverage page.

Test Dashboards

Test dashboards shall give a good overview of the different tests on the different PODs.

For Brahmaputra, three dashboards were created.

In Colorado, the target dashboards are:

  • ELK (Elasticsearch/Logstash/Kibana) as an evolution of the Functest home-made JS dashboard
  • Grafana portal using dataset pushed into InfluxDB

Basically, if you are graphing test status, the first option is recommended. If you need to graph results as a function of time for longer-duration tests and visualize the evolution of bandwidth, latency, etc., InfluxDB/Grafana is recommended.
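Pushing a measurement into InfluxDB goes through its HTTP write endpoint using the line protocol. A minimal sketch; the database name, host URL and tags are illustrative assumptions, not the project's actual configuration:

```python
# Sketch: push one measurement point to an InfluxDB HTTP write endpoint
# using the line protocol (measurement,tag=... field=value).  The host
# URL, the database name "yardstick" and the tags are illustrative.
import urllib.request

def to_line(measurement, tags, field, value):
    """Format one point in InfluxDB line protocol."""
    tag_str = ",".join("%s=%s" % (k, v) for k, v in sorted(tags.items()))
    return "%s,%s %s=%s" % (measurement, tag_str, field, value)

if __name__ == "__main__":
    line = to_line("latency",
                   {"pod": "pod1", "scenario": "os-nosdn-nofeature"},
                   "value", 12.3)
    req = urllib.request.Request(
        "http://influxdb.example.org:8086/write?db=yardstick",  # illustrative
        data=line.encode("utf-8"))
    urllib.request.urlopen(req)  # network call
```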

Contact the test working group for any question. 

References

Brahmaputra Testing (deprecated)

Test Dashboard wiki

Traffic profiles page


6 Comments

  1. Morgan Richomme where can I find an editable version of the overview picture? I want to propose a modification to update QTIP status.

    1. https://git.opnfv.org/functest/plain/docs/com/img/OPNFV_testing_group.png

      but it is not editable (sad) 

      I was not able to find it somewhere in a ppt, it will be quicker to redo it ...or you can edit it in any drawing tool

      1. Alright. Let me try to reproduce it with an editable format.

      2. Replaced with an editable diagram. Please check.

  2. I suggest the "Perf&Benchmark" part of the testperf ecosystem should be replaced with "Score&QPI".

    1. I thought we should not change a project's scope without TSC's approval.
