2014-01-16 Developers Forum
How to Join
Agenda
Quickly review previous meeting minutes (5 min)
Release Process
Testing/Continuous Delivery
After-action review & next week's agenda (5 min)
Minutes
Developers Forum 2014-01-16
Recording:
Attendees
Lauren Stanisic
Michael Downey
Roger Friedman
Burke Mamlin
Jeremy Keiper
Ryan Yates
Daniel Kayiwa
Mhawila Mhawila
Elliott Williams
Rafal Korytkowski
Wyclif Luyima
Paul Biondich
Chris Power
Darius Jazayeri
Suranga K
Nyoman Ribeka
Steve Githens
Agenda & Minutes
Review Last Week's TODOs
TODO: Need some clear boundaries/goals/stakes in the ground & timelines for the Order Entry Sprint
Reschedule topics: Road Map & Implementers Wish List (with Chris Power available)
TODO: Is there a way to run SonarQube against pull requests (a la Travis CI) rather than commits?
TODO: Create a JIRA issue related to dbunit.dataset.NoSuchTableException errors from Bamboo/SonarQube (Darius)
Order Entry Sprint Showcase (Wyclif & Burke)
Release Process
?
Testing/Continuous Delivery
Follow up on http://notes.openmrs.org/Developers-Forum-2013-05-09
QA Recommendations for 2013 were:
Emphasize the importance of automated testing at all levels (unit, service, integration, performance). Create a robust testing pyramid
Concern that current testing approach is somewhat cumbersome and focused on component/data interactions
Include acceptance criteria review/creation in JIRA curation activities; acceptance criteria and test case scenarios should be mandatory
Explore ways to formalize the core<-->module contract using Integration Contract Tests.
Publish/Subscribe contract tests in component pipelines
Execute module tests against all supported versions of core; publish test results for community consumption
Adopt a scenario-based testing approach to capture high-value paths through the system [especially from implementations]
Consider adopting BDD testing tooling; leverage community expertise to increase scenario coverage?
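The "Integration Contract Tests" idea above can be sketched in Java. This is a hypothetical illustration only (PatientLookup, CorePatientLookup, and findPatientName are invented names, not OpenMRS APIs): the module declares the slice of the core API it depends on as a contract, and the same assertions are then run against every supported core version.

```java
// Hypothetical sketch of a core<-->module integration contract test.
// The interface captures the behavior the module relies on; any core
// implementation claiming compatibility must pass satisfiesContract().
interface PatientLookup {                 // contract the module depends on
    String findPatientName(int id);
}

class CorePatientLookup implements PatientLookup {  // stand-in for one core version
    public String findPatientName(int id) {
        return id == 1 ? "John Doe" : null;
    }
}

public class ContractTestSketch {
    // The contract check itself, reusable against each supported core version.
    static boolean satisfiesContract(PatientLookup lookup) {
        boolean knownPatient = "John Doe".equals(lookup.findPatientName(1));
        boolean unknownPatient = lookup.findPatientName(999) == null;
        return knownPatient && unknownPatient;
    }

    public static void main(String[] args) {
        System.out.println(satisfiesContract(new CorePatientLookup())); // prints "true"
    }
}
```

In a real pipeline the contract check would live in a shared test artifact so both core and module builds can execute it, which is what makes breakage visible on either side of the contract.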
QA Recommendations for 2014 were:
Manage cycle times to keep feedback loops optimized
<5m from commit for unit & component tests
<60m from commit for service & integration tests
Additional deployment targets & test clients may be necessary [parallel execution]
Consider automatic inclusion of community modules in pipeline
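The cycle-time targets above (&lt;5m for unit/component, &lt;60m for service/integration) are what motivate the note about parallel execution: fanning independent suites out across workers makes wall-clock feedback time approach the slowest single suite rather than the sum of all suites. A minimal Java sketch of that fan-out, with made-up suite durations standing in for real test runs:

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelSuites {
    // Run independent test suites concurrently; total feedback time is
    // dominated by the slowest suite, not the sum of all suites.
    static long runAll(List<Callable<Long>> suites) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(suites.size());
        long slowest = 0;
        for (Future<Long> f : pool.invokeAll(suites)) {  // blocks until all finish
            slowest = Math.max(slowest, f.get());
        }
        pool.shutdown();
        return slowest;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical suites that report their own duration in milliseconds.
        List<Callable<Long>> suites = List.of(
            () -> { Thread.sleep(50); return 50L; },   // "unit" suite
            () -> { Thread.sleep(80); return 80L; }    // "integration" suite
        );
        System.out.println(runAll(suites)); // prints "80"
    }
}
```

The same principle applies at the CI level: Bamboo/Travis can achieve the fan-out with parallel build stages or agents instead of threads, which is where the extra deployment targets and test clients come in.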
Goals for CI/CD
(stated 2013 CD goals were improvements in Configuration Management, Quality Assurance, and Environments & Deployments)
A clear list of community-supported modules and some clearly defined criteria for why modules are or are not in the list
Is it just what's in the reference application?
TODO: define this process and vet with community (Burke/Darius) along with what gets tested
What are the intended targets for CD/CI?
Scope: Reference application (API + platform + refapp modules + community supported modules included in OpenMRS 2.0+)
What are community supported modules?
Improving the development process (quicker feedback to devs)
Improving the code (reducing defects, increasing quality & reliability)
More automation (steadily increasing the amount of automation)
Making "nightly" (current) builds available