2015-04-23 Developers Forum
Agenda
Quickly review previous meeting minutes (5 min)
Review next meeting agenda
Minutes
OpenMRS Developers Forum 2015-04-23
Recording: http://goo.gl/Sz2ILH (Audio)
Attendees
Burke Mamlin
Michael Downey
Tim Nicholson
Rafal Korytkowski
Tharunya _
Tomasz Mueller
Darius Jazayeri
Willa Mhawila
Saptarshi Purkayastha
Ada
Daniel Kayiwa
Manika Praveenkumar Maheshwari
Ryan Yates
Maurya
Karl Wurst
Agenda & Notes
Review last week TODOs
TODO: Burke to notify presenters about their dev forum topics
TODO: Someone to get MariaDB support on the roadmap <http://om.rs/roadmap>
TODO: Someone to enumerate which of our distributed modules (those in the 2.2 refapp <https://openmrs.atlassian.net/wiki/x/rg1IAQ>) are not using Hibernate and/or Liquibase
TODO: Someone to set up CI builds for MariaDB, PostgreSQL, SQL Server, and Oracle (we expect these builds will not be green at first)
TODO: Someone to create tickets to address problems that occur for alternate DBs
TODO: Someone to schedule sprint(s) to address DB tickets
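For context on the MariaDB TODO above, switching databases largely comes down to the JDBC driver and connection URL in the runtime properties. A hypothetical fragment (property names assumed per a standard OpenMRS setup; all values are placeholders):

```properties
# Hypothetical runtime properties for a MariaDB-backed instance
connection.url=jdbc:mariadb://localhost:3306/openmrs?autoReconnect=true
connection.username=openmrs_user
connection.password=changeme
```

Whether the bundled Hibernate mappings and Liquibase changesets behave identically on MariaDB is exactly what the proposed CI builds would tell us.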
QA Systems (load testing and performance testing)
Follow up from the last wave of this topic: IU no longer allows non-IU systems to use NeoLoad & Dynatrace. :-(
Functional Testing
JBehave
Need to be "smart" about functional testing: these tests can be brittle, so focus on high-value tests and write them to be as robust as possible
Example of a JIRA issue to spec out a functional test: https://mifosforge.jira.com/browse/MIFOSTEST-1195
In the past we used Selenium/WebDriver: http://jbehave.org/reference/web/stable/using-selenium.html
We have incoming volunteers who aren't devs but are interested & willing to do manual functional testing ... unfortunately we don't usually have much for them to do
Acceptance Testing
Nice guide for ensuring the UI is ready for automating acceptance tests: https://mifosforge.jira.com/wiki/display/MIFOS/Writing+Acceptance+Tests
Performance Testing
OpenMRS 1.8 focused on performance improvements and, for that release, we set up a CI build on a spare (non-VM) server to generate a report of performance metrics (e.g., patient search, dashboard load time, etc.).
Is this a priority for us? Would we like to have build tests that check performance? Break builds if a performance metric doesn't meet its goal?
Implementations notice quickly when patient search takes 2 minutes, but we may not need a fancy, automated system to address these issues (in terms of priorities)
There are likely higher priorities – e.g., functional testing, testing against other databases, testing deployments (large db, upgrades, etc.)
Cloud-based services that use tools like JMeter, e.g.: http://blazemeter.com/
What did SolDevelo and the Mifos team establish for their app? Sequence: Functional Testing, (Automated) Acceptance Testing, Performance Testing, Release Testing
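The "break the build if a metric misses its goal" idea can be sketched as a plain exit-code check, independent of any particular CI plugin. The timed operation and the 200 ms budget below are made-up stand-ins, not OpenMRS measurements:

```java
public class PerfGate {
    // Hypothetical stand-in for an expensive operation like patient search.
    static long simulatedPatientSearch() {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) sum += i;
        return sum;
    }

    public static void main(String[] args) {
        final long budgetMillis = 200; // assumed goal, not a real OpenMRS number
        long start = System.nanoTime();
        simulatedPatientSearch();
        long elapsedMillis = (System.nanoTime() - start) / 1_000_000;

        // A CI job can fail the build on a non-zero exit code.
        if (elapsedMillis > budgetMillis) {
            System.out.println("FAIL: " + elapsedMillis + " ms > " + budgetMillis + " ms budget");
            System.exit(1);
        }
        System.out.println("PASS: within " + budgetMillis + " ms budget");
    }
}
```

The same pattern scales up: a nightly CI plan runs the real scenarios against a fixed-spec server and turns red when a budget is exceeded.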
Load Testing
How many concurrent users does OpenMRS support (on specific server specs)?
Darius: Lower in the priority list
Corollary: How many concurrent users do most large implementations have, and what type of work are those users doing?
Unit Testing
We feel like we're doing reasonably well
Should probably use more mocks, so tests are more reliable & run faster
How long should it take to run unit tests? How long is "too long" (when devs stop using skipTests by default)?
"If you have tests that are useful, but take longer than you want the commit suite to run, then you should build a DeploymentPipeline and put the slower tests in a later stage of the pipeline." - Martin Fowler http://martinfowler.com/bliki/UnitTest.html
"Ideally, you're just running the unit tests specific to the changes you are making while developing, then only need to run the full suite once before committing code." -Darius Jazayeri (http://maven.apache.org/surefire/maven-surefire-plugin/examples/single-test.html)
Expect devs to run their tests using IDE as they develop.
Expect devs to run full suite of tests before committing code.
TODO: push for mocks over integration tests
TODO: document how to test within IDE in dev pages in wiki
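A minimal illustration of the mocks-over-integration-tests point, using a hand-rolled in-memory stub rather than a mocking library; the `PatientDao` and `PatientService` names here are hypothetical, not OpenMRS's real interfaces:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical data-access interface; a real test would stub the actual DAO.
interface PatientDao {
    String findNameById(int id);
}

// Service under test: logic we want to verify without touching a database.
class PatientService {
    private final PatientDao dao;
    PatientService(PatientDao dao) { this.dao = dao; }

    String greeting(int id) {
        String name = dao.findNameById(id);
        return name == null ? "Unknown patient" : "Patient: " + name;
    }
}

public class PatientServiceTest {
    public static void main(String[] args) {
        // An in-memory map replaces the Hibernate-backed DAO, so the test
        // runs in milliseconds and cannot fail due to database state.
        Map<Integer, String> rows = new HashMap<>();
        rows.put(7, "Ada");
        PatientDao stub = id -> rows.get(id);

        PatientService service = new PatientService(stub);
        System.out.println(service.greeting(7));   // Patient: Ada
        System.out.println(service.greeting(99));  // Unknown patient
    }
}
```

Tests shaped like this keep the full suite fast enough that devs have no reason to reach for skipTests.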
Infrastructure update