Verify Feature Is Installed
[Documentation] Succeeds if the given ${feature_name} is found in the output of "feature:list -i"
[Arguments] ${feature_name} ${controller}=${CONTROLLER} ${karaf_port}=${karaf_shell_port}
- ${output}= Issue Command On Karaf Console feature:list -i ${controller} ${karaf_port}
+ ${output}= Issue Command On Karaf Console feature:list -i | grep ${feature_name} ${controller} ${karaf_port}
Should Contain ${output} ${feature_name}
+ [Return] ${output}
Verify Feature Is Not Installed
[Documentation] Succeeds if the given ${feature_name} is NOT found in the output of "feature:list -i"
[Arguments] ${feature_name} ${controller}=${CONTROLLER} ${karaf_port}=${karaf_shell_port}
- ${output}= Issue Command On Karaf Console feature:list -i ${controller} ${karaf_port}
+ ${output}= Issue Command On Karaf Console feature:list -i | grep ${feature_name} ${controller} ${karaf_port}
Should Not Contain ${output} ${feature_name}
+ [Return] ${output}
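The two keywords above now pipe `feature:list -i` through `grep`, so only lines mentioning the feature are inspected before the contains/not-contains assertion. A minimal Python sketch of the same filter-then-check logic (the listing text and feature names below are illustrative; the suite runs this over an ssh session to the Karaf console):

```python
def feature_installed(feature_list_output: str, feature_name: str) -> bool:
    """Mimic `feature:list -i | grep <name>`: keep only the matching
    lines, then check whether the feature name appears in them."""
    matching = "\n".join(
        line for line in feature_list_output.splitlines() if feature_name in line
    )
    return feature_name in matching

# Illustrative listing; real output comes from the Karaf console.
listing = (
    "odl-openflowplugin-drop-test | 0.1.0 | x | Started\n"
    "odl-restconf | 1.1.0 | x | Started"
)
print(feature_installed(listing, "odl-openflowplugin-drop-test"))  # True
print(feature_installed(listing, "odl-missing-feature"))           # False
```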
Issue Command On Karaf Console
[Documentation] Executes the given ${cmd} by ssh-ing to the Karaf console running on ${CONTROLLER}
*** Variables ***
${num_of_unique_macs} 10000
${cbench_system} ${MININET}
${cbench_executable} /usr/local/bin/cbench
+${throughput_results_file} throughput.csv
+${latency_results_file} latency.csv
*** Test Cases ***
Cbench Throughput Test
[Documentation] cbench executed in throughput mode (-t). Test parameters have defaults, but can be overridden
[Tags] throughput
[Timeout] ${test_timeout}
Log Cbench test: ${loops} iterations of ${duration_in_secs}-second runs. Switch count: ${switch_count}. Unique MACs to cycle: ${num_of_unique_macs}
- Run Cbench And Log Results -t -m ${duration_in_ms} -M ${num_of_unique_macs} -s ${switch_count} -l ${loops} ${throughput_threshold}
+ Run Cbench And Log Results -t -m ${duration_in_ms} -M ${num_of_unique_macs} -s ${switch_count} -l ${loops} ${throughput_threshold} ${throughput_results_file}
Cbench Latency Test
[Documentation] cbench executed in default latency mode. Test parameters have defaults, but can be overridden
[Tags] latency
[Timeout] ${test_timeout}
Log Cbench test: ${loops} iterations of ${duration_in_secs}-second runs. Switch count: ${switch_count}. Unique MACs to cycle: ${num_of_unique_macs}
- Run Cbench And Log Results -m ${duration_in_ms} -M ${num_of_unique_macs} -s ${switch_count} -l ${loops} ${latency_threshold}
+ Run Cbench And Log Results -m ${duration_in_ms} -M ${num_of_unique_macs} -s ${switch_count} -l ${loops} ${latency_threshold} ${latency_results_file}
*** Keywords ***
Run Cbench And Log Results
- [Arguments] ${cbench_args} ${average_threshold}
+ [Arguments] ${cbench_args} ${average_threshold} ${output_filename}=results.csv
${output}= Run Command On Remote System ${cbench_system} ${cbench_executable} -c ${CONTROLLER} ${cbench_args} prompt_timeout=${test_timeout}
Log ${output}
Should Contain ${output} RESULT
${result_line}= Get Lines Containing String ${output} RESULT
${result_list}= Split String ${result_line} ${SPACE}
${result_value_list}= Split String ${result_list[7]} /
${min}= Set Variable ${result_value_list[0]}
${max}= Set Variable ${result_value_list[1]}
${average}= Set Variable ${result_value_list[2]}
${stdev}= Set Variable ${result_value_list[3]}
${date}= Get Time d,m,s
Log CBench Result: ${date},${cbench_args},${min},${max},${average},${stdev}
- Should Be True ${average} > ${average_threshold} Flow mod per/sec threshold was not met
+ Append To File ${output_filename} ${min},${max},${average}\n
+ Should Be True ${average} > ${average_threshold} ${average} flow mods/sec did not exceed the threshold of ${average_threshold}
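`Run Cbench And Log Results` pulls min/max/average/stdev out of the cbench `RESULT` line before logging and thresholding. A Python sketch of that parsing, assuming the usual cbench output shape `RESULT: <n> switches <m> tests min/max/avg/stdev = A/B/C/D responses/s` (adjust the splitting if your cbench build prints a different line):

```python
def parse_cbench_result(output: str):
    """Find the RESULT line and split the A/B/C/D value group into
    (min, max, average, stdev) floats."""
    line = next(l for l in output.splitlines() if "RESULT" in l)
    values = line.split("=")[1].split()[0]  # "A/B/C/D"
    lo, hi, avg, stdev = (float(v) for v in values.split("/"))
    return lo, hi, avg, stdev

# Illustrative sample; real output comes from running cbench remotely.
sample = "RESULT: 32 switches 2 tests min/max/avg/stdev = 18.67/19.89/19.28/0.61 responses/s"
print(parse_cbench_result(sample))  # (18.67, 19.89, 19.28, 0.61)
```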
Cbench Suite Setup
+ Append To File ${latency_results_file} LATENCY_MIN,LATENCY_MAX,LATENCY_AVERAGE\n
+ Append To File ${throughput_results_file} THROUGHPUT_MIN,THROUGHPUT_MAX,THROUGHPUT_AVERAGE\n
${duration_in_ms} Evaluate ${duration_in_secs} * 1000
Set Suite Variable ${duration_in_ms}
# Setting the test timeout dynamically in case larger values given on the command line override the defaults
${test_timeout} Evaluate (${loops} * ${duration_in_secs}) * 1.5
Set Suite Variable ${test_timeout}
- #Verify File Exists On Remote System ${cbench_system} ${cbench_executable}
+ Verify File Exists On Remote System ${cbench_system} ${cbench_executable}
Should Be True ${loops} >= 2 If number of loops is less than 2, cbench will not run
Verify Feature Is Installed odl-openflowplugin-drop-test
Issue Command On Karaf Console dropallpacketsrpc on
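`Cbench Suite Setup` derives `${duration_in_ms}` and `${test_timeout}` from the loop count and per-test duration. The same arithmetic in Python, with assumed example values (real runs take these from the suite variables or the robot command line):

```python
# Example values only; assumptions for illustration.
loops = 100
duration_in_secs = 12

duration_in_ms = duration_in_secs * 1000         # cbench -m takes milliseconds
test_timeout = (loops * duration_in_secs) * 1.5  # total run time plus 50% head-room

print(duration_in_ms, test_timeout)  # 12000 1800.0
```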