ci: add performance script to PR jobs #3274
Conversation
- currently requires running tests to set up the db (waiting to avoid actually running tests and spinning down tests/env) - loses ability to run k6 against rafiki w/ telemetry
This reverts commit 1a6c0e6.
- in preparation for switching the performance test to target testenv - ensured integration tests pass
- supports reuse in the setup script for performance tests, which will allow running the performance test against testenv - also bumps the graphql package version across the monorepo; adding it to test-lib caused some type errors in backend and auth due to a version mismatch
Minor requests / updates. Rest looks good. 👍
test/performance/config/local.env

```
C9_OPEN_PAYMENTS_PORT="3000"
C9_GRAPHQL_PORT="3001"
HLB_OPEN_PAYMENTS_PORT="4000"
DOCKER_NETWORK="rafiki_rafiki"
```
Missing new line.
test/performance/config/test.env

```
C9_OPEN_PAYMENTS_PORT="3100"
C9_GRAPHQL_PORT="3101"
HLB_OPEN_PAYMENTS_PORT="4100"
DOCKER_NETWORK="rafiki-test_rafiki-test"
```
Missing new line.
```js
export function handleSummary(data) {
  const requestsPerSecond = data.metrics.http_reqs.values.rate
  const iterationsPerSecond = data.metrics.iterations.values.rate
  const failedRequests = data.metrics.http_req_failed.values.passes
  const failureRate = data.metrics.http_req_failed.values.rate
  const requests = data.metrics.http_reqs.values.count

  const summaryText = `
**Test Configuration**:
- VUs: ${options.vus}
- Duration: ${options.duration}

**Test Metrics**:
- Requests/s: ${requestsPerSecond.toFixed(2)}
- Iterations/s: ${iterationsPerSecond.toFixed(2)}
- Failed Requests: ${failureRate.toFixed(2)}% (${failedRequests} of ${requests})
`

  return {
    // Preserve standard output w/ textSummary
    stdout: textSummary(data, { enableColors: false }),
    'k6-test-summary.txt': summaryText // saves to file
  }
}
```
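The string-building part of that handler can be exercised outside of k6. Below is a minimal sketch: `buildSummaryText` is a hypothetical helper that takes plain objects rather than the real k6 `data` structure, mirroring the template above. (Note: k6's `http_req_failed` rate is a 0–1 fraction, so the template above may understate the percentage unless it is multiplied by 100 first.)

```javascript
// Sketch only: builds the same summary string from plain values.
// `metrics` and `opts` are illustrative objects, not the k6 `data` structure.
function buildSummaryText(metrics, opts) {
  return [
    '**Test Configuration**:',
    `- VUs: ${opts.vus}`,
    `- Duration: ${opts.duration}`,
    '',
    '**Test Metrics**:',
    `- Requests/s: ${metrics.requestsPerSecond.toFixed(2)}`,
    `- Iterations/s: ${metrics.iterationsPerSecond.toFixed(2)}`,
    `- Failed Requests: ${metrics.failureRate.toFixed(2)}% ` +
      `(${metrics.failedRequests} of ${metrics.requests})`
  ].join('\n')
}

console.log(
  buildSummaryText(
    {
      requestsPerSecond: 12.5,
      iterationsPerSecond: 6.25,
      failureRate: 0.5,
      failedRequests: 3,
      requests: 250
    },
    { vus: 10, duration: '60s' }
  )
)
```

Keeping the formatting in a small pure function like this makes the summary text unit-testable without spinning up a k6 run.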
We could also save a summary HTML report via:

```js
import { htmlReport } from 'https://raw.githubusercontent.com/benc-uk/k6-reporter/main/dist/bundle.js';

// Report handling
/**
 * Report Generation
 * Handles the generation of test result reports
 * @param {Object} data - Test result data from K6
 * @returns {Object} Report configuration object
 */
export function handleSummary(data) {
  return {
    'reports/summary.html': htmlReport(data)
  };
}
```
I like it!
@koekiebox Tried out the html report. Looks nice. I don't see a straightforward way to get it in the PR. Perhaps it could be used locally, although I think in that case the cli has the same info and you don't have to go hunt it down.
LGTM
Changes proposed in this pull request
- MockASE handles seeding and hosting the integration endpoints required for rafiki to function properly (rates, webhooks). It lives in the integration tests on `main`, so I moved it out for re-use when running performance tests.
- On `main`, the performance tests run against our localenv. I changed this to be configurable by parameterizing the run script and test and reading from a config to get the correct environment details (network name, urls). So now you can run performance tests locally against the testenv without interfering with your localenv, and can still run against localenv if you want to use the grafana dashboard.

The PR comment got buried here (shouldn't happen normally; it will be one of the first comments w/ netlify): #3274 (comment)
Context
fixes: #3240
Checklist
- fixes #number
- user-docs label (if necessary)