How do you deal with test coverage in your project? E2E, functional, unit, and so on.
Hello dear community!
Let's have a test-automation-related discussion and share our best practices.
As everyone knows, every feature should be tested. In Spryker we follow the concepts of TDD.
To balance full test coverage against development speed, I usually prefer to align the implementation process with the agile testing pyramid: many fast unit tests at the base, fewer functional/integration tests in the middle, and a small number of E2E tests at the top (a unit-level sketch of the base layer follows).
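To make the base of the pyramid concrete, here is a minimal sketch of a fast, isolated unit test. It uses Node's built-in test runner, and `cartTotal()` is a hypothetical function written only for this example, not part of any Spryker module.

```typescript
// Base of the pyramid: a fast, isolated unit test with no I/O or browser.
import { test } from 'node:test';
import assert from 'node:assert/strict';

// Hypothetical pure function under test.
function cartTotal(prices: number[], discount = 0): number {
  const sum = prices.reduce((acc, price) => acc + price, 0);
  return Math.round(sum * (1 - discount) * 100) / 100;
}

test('applies a percentage discount to the cart total', () => {
  assert.equal(cartTotal([10, 20], 0.1), 27);
});

// The middle and top layers (functional and E2E tests) would exercise the same
// behaviour through the HTTP API or the browser, so far fewer of them are needed.
```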
Let's discuss which approach is used in your project. Here are some example questions to get started, along with how we handle them:
How do you deal with outdated test checks for features in your project?
- The developer is responsible for fixing them as part of the new MR. QA automation is responsible for aligning the test changes with the other teams.
Do you work with E2E tests, and how do you maintain them?
- Yes, but we have a dedicated team for that.
Do you have manual QA in your project?
- Yes, it is useful when we need regression/UI tests.
What happens if an acceptance test fails after a new implementation?
- The QA developer is responsible for fixing it.
How do you know how well a feature is currently covered?
- One approach I like: keep test automation checks in a task-tracking system like JIRA, with rules that link the test code to those tasks. Task-tracking plugins (e.g. JIRA plugins) can then report test automation coverage per feature between releases (see the sketch below).
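As an illustration of linking test code to tasks, here is a minimal sketch. It assumes each automated check carries its JIRA issue key in its title (e.g. `it('PROJ-123: guest can add an item to the cart', ...)`); the `tests` directory, the key format, and the hard-coded release ticket list are assumptions for this example, not a real JIRA integration.

```typescript
// Scan test files for JIRA issue keys and compare them with the tickets planned
// for a release, giving a rough "which tickets have automated checks" report.
import { readdirSync, readFileSync } from 'node:fs';
import { join } from 'node:path';

const JIRA_KEY = /\b[A-Z][A-Z0-9]+-\d+\b/g; // e.g. PROJ-123

function collectCoveredTickets(dir: string, covered = new Set<string>()): Set<string> {
  for (const entry of readdirSync(dir, { withFileTypes: true })) {
    const path = join(dir, entry.name);
    if (entry.isDirectory()) {
      collectCoveredTickets(path, covered);
    } else if (/\.(spec|test)\.ts$/.test(entry.name)) {
      for (const key of readFileSync(path, 'utf8').match(JIRA_KEY) ?? []) {
        covered.add(key);
      }
    }
  }
  return covered;
}

// In practice this list would come from the JIRA API; it is hard-coded here.
const plannedForRelease = ['PROJ-101', 'PROJ-102', 'PROJ-103'];
const covered = collectCoveredTickets('./tests');
for (const ticket of plannedForRelease) {
  console.log(`${ticket}: ${covered.has(ticket) ? 'has automated checks' : 'NOT covered'}`);
}
```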
What happens if failing tests block continuous delivery, e.g. the pipeline fails in the development branch (per gitflow)?
- In an ideal world we would have an automated process that gives us a dashboard with a healthcheck of all tests and the ability to assign a responsible person to each failing one. That way we know, without any extra interaction, what happened in the test, whether the issue is related to my changes (it happens only in my branch), who the contact person is, and even the progress/status of the fix (a sketch of such an ownership mapping follows).
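Here is a minimal sketch of the ownership part of such a dashboard. The `FailedTest` shape, the `owners` map, and the team names are assumptions made for illustration; in a real pipeline the failure list would be read from the CI test report rather than hard-coded.

```typescript
// Map each failing test to the team responsible for fixing it, based on a
// path-prefix ownership map (similar in spirit to a CODEOWNERS file).
interface FailedTest {
  file: string;         // e.g. 'tests/checkout/payment.spec.ts'
  name: string;
  failingSince: string; // first branch/commit where the failure appeared
}

const owners: Record<string, string> = {
  'tests/checkout': 'team-checkout',
  'tests/catalog': 'team-catalog',
};

function assignOwner(test: FailedTest): string {
  const prefix = Object.keys(owners).find((p) => test.file.startsWith(p));
  return prefix ? owners[prefix] : 'unassigned';
}

// Hard-coded example input; in CI this would come from the test report.
const failures: FailedTest[] = [
  { file: 'tests/checkout/payment.spec.ts', name: 'pays with credit card', failingSince: 'feature/new-psp' },
];

// Each dashboard entry answers: what failed, where it started failing, and who fixes it.
for (const failure of failures) {
  console.log(`${failure.name} (${failure.file}) | failing since ${failure.failingSince} | owner: ${assignOwner(failure)}`);
}
```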
Comments
Hi @Serhii Doroshenkov! In our company it's mainly the developers who are in charge of creating E2E tests. We also write unit tests for our backend. I like the concept of a 100% code coverage check in the pipeline, so no 'unnoticed' code can slip by (developers have to explicitly whitelist what doesn't need a unit test); a config sketch of that idea is shown after this comment.
We recently brought back a dedicated QA engineer, but we still have to polish the final flow.
I think periodic manual QA is beneficial as a sanity check for the project, say at checkpoints or in cycles.
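To make the "100% coverage or explicitly whitelisted" rule concrete, here is a minimal Jest config sketch. It assumes a Jest-based JS/TS pipeline (a PHP backend would express the same rule in its PHPUnit/Codeception coverage settings), and the ignored paths are placeholders.

```typescript
// jest.config.ts: fail the pipeline if coverage drops below 100%, except for
// paths that developers have explicitly whitelisted as not needing unit tests.
import type { Config } from 'jest';

const config: Config = {
  collectCoverage: true,
  // The build fails if any metric drops below 100%.
  coverageThreshold: {
    global: { branches: 100, functions: 100, lines: 100, statements: 100 },
  },
  // The explicit whitelist: code deliberately exempt from unit tests.
  coveragePathIgnorePatterns: ['/node_modules/', '<rootDir>/src/generated/'],
};

export default config;
```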