Anuket Project


 

This page is intended as a working draft of how to execute test cases and generate reports for the Dovetail project.

 

Script architecture

Test cases from upstream projects will be validated and work in conjunction with each other as an organic whole. To achieve this goal, we need a set of scripts to communicate with these test cases: they execute the test cases under the appropriate circumstances, then analyze the various results to produce a human-friendly report. The components involved are listed in the table below; a sketch of how they could fit together follows it.

 Figure: class diagram

 

 

| Component | Description | Status |
| --- | --- | --- |
| config | basically the same environment config as in Functest/Yardstick, used to describe the SUT | Ongoing |
| Testcase file | test case definition files, each mapping to the Functest/Yardstick test case being executed; together these test cases comprise the certification | Ongoing, one by one |
| certification | certification definition files, determining the scenario of the SUT and the collection of test cases | Done, basic definition |
| parser | parses, loads, and checks whether the input files (config, certification, test cases) are valid | Done, basic function |
| Testcase keeper | test case management, execution status, etc. | Ongoing |
| Container manager | everything around the container: pulling images, starting containers, and executing commands inside them | Ongoing, needs Yardstick support |
| runner | main entry point for running through the whole validation procedure | Ongoing |
| report | manages test results and generates reports from them | Ongoing, work just started |
| dashboard | presentation of the test report, showing the details of each test case | Optional |
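To make the division of labor concrete, here is a minimal sketch of how these components could fit together. Every function and field name below is an illustrative assumption, not the actual Dovetail code:

```python
"""Minimal sketch of the component flow; all names are assumptions."""

import yaml


def parse(path):
    # parser: load an input file and check that it is valid YAML
    with open(path) as f:
        data = yaml.safe_load(f)
    if not data:
        raise ValueError("invalid or empty input file: %s" % path)
    return data


def execute(testcase):
    # container manager: in the real design this pulls the Functest or
    # Yardstick image, starts the container, and runs the command
    # inside it; stubbed out here
    return {"testcase": testcase["name"], "criteria": "PASS"}


def run(config_path, cert_path, testcases):
    # runner: walk the certification's test case list, execute each
    # test case, and collect the results for the report component
    parse(config_path)                    # SUT information
    cert = parse(cert_path)               # scenario + test case list
    results = {}
    for name in cert["testcase_list"]:
        results[name] = execute(testcases[name])  # testcase keeper lookup
    return results
```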

 


Definitions

certification scenario definition

  • certification_basic: the scenario name
  • testcase_list: a list of test cases, each referencing a test case defined in Dovetail (a sketch of such a file follows this list)
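As a concrete illustration, a certification definition file with these two fields might look like the following; the exact layout and file format in Dovetail may differ, and the test case names here are made up:

```python
import yaml

# Illustrative certification definition; only the certification_basic
# scenario name and the testcase_list field come from the description
# above, the concrete entries are assumed.
CERT_YAML = """
certification_basic:
  testcase_list:
    - dovetail.feature.tc001
    - dovetail.feature.tc002
    - dovetail.feature.tc003
"""

cert = yaml.safe_load(CERT_YAML)["certification_basic"]
print(cert["testcase_list"])
```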

testcase definition

  • dovetail.feature.tc003: the name of the test case, made up of three parts: the dovetail project, the feature it belongs to, and a serial number
  • objective: a brief description of the test case, explaining what it verifies
  • scripts: scripts and configuration for the execution of the test case (a sketch of a complete definition follows this list)
    • type: the upstream testing project
    • testcase: the input parameter for the upstream testing scripts
    • sub_testcase_list: a list of the real working tests, indicating what is inside this test case
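Put together, a test case definition with these fields might look like this sketch; the field names follow the description above, but the values and exact schema are assumptions for illustration:

```python
import yaml

# Illustrative test case definition; concrete values are made up.
TESTCASE_YAML = """
dovetail.feature.tc003:
  objective: verify VM-to-VM connectivity over SSH on the SUT
  scripts:
    type: functest
    testcase: vping_ssh
    sub_testcase_list:
      - vping_ssh
"""

tc = yaml.safe_load(TESTCASE_YAML)["dovetail.feature.tc003"]
print(tc["scripts"]["type"], tc["scripts"]["testcase"])
```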

Testcase report analysis

As the certification report is built from the reports of the individual test cases, we have to check those test cases one by one. The most important data is the test case details: execution duration, test case names, and so on. The tables below show which details each test case result provides, according to the following criteria (a sketch of the per-result check follows this list):

  • duration: whether the result includes the execution duration
  • sub-testcase: whether the result contains any sub-test cases
  • details on pass: whether the result contains details of the sub-test cases (test case name, parent test case, etc.) when the test passes
  • details on fail: whether the result contains details of the sub-test cases when the test fails
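A per-result check along these lines could be a small function like the sketch below; the field names ("duration", "details", "criteria") are assumptions about the upstream result format, not a confirmed schema:

```python
def analyse(result):
    # result: one parsed result dict from an upstream project
    details = result.get("details") or {}
    has_details = bool(details)
    passed = result.get("criteria") == "PASS"
    return {
        "duration": "duration" in result,
        "sub-testcase": has_details,
        "details on pass": passed and has_details,
        "details on fail": not passed and has_details,
    }
```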

functest report analysis

| Test case | duration | sub-testcase | details on pass | details on fail |
| --- | --- | --- | --- | --- |
| healthcheck | | | | |
| tempest_smoke_serial | Y | Y | N | Y |
| vping_ssh | Y | N | N | N |
| vping_userdata | Y | N | N | N |
| rally_sanity | Y | Y | N | N |
| Tempest_full_parallel | | | | |
| rally_full | | | | |

yardstick report analysis

Yardstick is much simpler than Functest: it uses the same result format for every test case. The bulk of a Yardstick result is the per-sequence metrics, which we do not need at all; we only need to know whether the test case passed or failed, as in the sketch below.
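A pass/fail check could then be as small as the following; the "criteria" field name is an assumption about the Yardstick result format:

```python
def yardstick_passed(result):
    # ignore the per-sequence metrics entirely; only the overall
    # verdict matters for the certification report
    return result.get("criteria") == "PASS"


print(yardstick_passed({"criteria": "PASS"}))  # True
```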

Certification report

We now have a simple result list as the final report. It is not cool and fancy yet, but we will get there.
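For illustration, printing such a plain result list could look like this sketch (the format is assumed, not the actual Dovetail output):

```python
def print_report(results):
    # results: mapping of test case name -> pass/fail boolean
    for name, passed in sorted(results.items()):
        print("%-32s %s" % (name, "PASS" if passed else "FAIL"))


print_report({"dovetail.feature.tc003": True,
              "dovetail.feature.tc004": False})
```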

 
