Improving OMRS to DHIS2 Integration

Background

We heard from many implementing partners that they have developed short-term solutions to DHIS2 integration, but are very keen to try a more robust approach. Reporting requirements continue to increase for HIV and other diseases, while indicator definitions change regularly. Meanwhile, COVID reporting standards are starting to emerge, with the ADX standard publishing COVID reporting guidelines in the coming weeks (based on the weekly WHO reporting criteria).

To help implementers prepare for and respond to the need for quality data in this pandemic and future ones, we are prioritizing DHIS2 integration, both for HIV and for COVID purposes. 

Goal: improve the OMRS<-->DHIS2 pipeline in an extensible way that solves the top OMRS/DHIS2 pain-points for implementers, especially:

  • Mapping period indicator reports

  • Import/export workflows (e.g. import concepts)

  • Concept creation and mapping

  • Mapping cohorts

We are working to bring previous improvement work together, with more support for a meaningfully encoded data structure (e.g. using ADX) between the two systems, and to get the solution to a point where we can show implementers how to use this and how to start sharing data.

Notes: https://notes.openmrs.org/covidsquad2020

Work Underway

A. Compare existing Modules handling DHIS2 integration

B. Reconcile DHIS2 Reporting module branches

C. Consolidated summary for implementers to share & give feedback on

Technical focus:

  • MVP Working Product - DHIS2 MVP Timeline here  

    • Use CODE values instead of UIDs (reconciling/merging)

    • Use DHIS2 friendly terms (at dataset level)

      • Define a standard set of concepts/terms to use

    • Import datasets using a connected DHIS2 instance (avoids having to write import files manually)

    • Support multiple period types

    • Map OMRS locations with DHIS2 Organisation Unit codes

  • Next:

    • Develop the UI using Micro Frontends - tech & updated UI

    • Add custom SQL Queries support 

      • Caveat: Validation not available

      • James and Jayasanka to follow up on workflow

    • Add Export functionality (export SQL-mapped dataset from the module)

      • Caveat: Issues arise if the concept dictionaries used are not the same, so this only works where concept dictionaries are standardized

    • Import datasets using the exported file from the module

  • Future:

    • Push data to DHIS2 automatically

    • Store multiple DHIS2 connections





Detailed Requirements

  1. Use CODE values instead of UIDs

    Using CODE values helps make the module completely independent of specific DHIS2 instances, and makes it possible to send a single dataset to multiple DHIS2 instances. Moreover, it is helpful when creating an export file that can be exchanged across OpenMRS instances. This format should be used when sending data to DHIS2: https://docs.dhis2.org/master/en/developer/html/webapi_adx_data_format.html

    1. The adx branch already has this implemented
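
    For illustration, a minimal ADX payload using codes rather than UIDs might look like the sketch below (the codes and values are hypothetical placeholders; see the linked documentation for the full format):

      <adx xmlns="urn:ihe:qrph:adx:2015" exported="2020-04-30T10:00:00Z">
        <!-- orgUnit, dataSet and dataElement are referenced by CODE, not UID -->
        <group orgUnit="OU_FACILITY_01" period="2020-03-01/P1M" dataSet="HIV_MONTHLY_SUMMARY">
          <dataValue dataElement="HIV_TESTS_PERFORMED" value="120"/>
          <dataValue dataElement="HIV_TESTS_POSITIVE" value="7"/>
        </group>
      </adx>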

  2. Use DHIS2 friendly terms

    To make this familiar to users as well as developers, it’s better to use DHIS2-friendly terms, e.g. DataSet and DataElement.
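
    As a rough sketch, the module’s own domain classes could mirror that terminology (the class and field names below are hypothetical, not the current module’s API):

      import java.util.List;

      // Hypothetical domain classes named after DHIS2 terms rather than OMRS report terms.
      class DataSet {
          String code;                    // DHIS2 dataset CODE, e.g. "HIV_MONTHLY_SUMMARY"
          String periodType;              // e.g. "Monthly"
          List<DataElement> dataElements; // the data value templates in this dataset
      }

      class DataElement {
          String code; // DHIS2 data element CODE
          String name; // human-readable name shown in the mapping UI
      }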

  3. Add custom SQL Queries support

    Custom SQL gives the flexibility to be independent of the OMRS Reporting module when needed, which is helpful for the export functionality (read the “Add Export functionality” section). The queries should be easy to add through the module’s UI; however, they should be validated before being saved to the database.

    When mapping a dataset, the admin should first choose the mapping type: 1. Period Indicator Report, or 2. SQL queries. In other words, one dataset can have only one mapping type, not both. Since period indicator reports already support custom SQL queries, there’s no point in mixing the two. Only SQL-mapped datasets can be exported using the export functionality; the custom SQL queries exist to support distributing the mappings among multiple OMRS instances (read the “Add Export functionality” section).
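
    As a hypothetical example, a data value template such as “HIV tests performed” could be mapped to a query like the one below (the concept ids and the :startDate/:endDate parameter convention are illustrative assumptions, not the module’s actual contract):

      -- Count of patients with a matching observation in the reporting period.
      -- 1234 and 5678 are placeholder concept ids; voided rows are excluded.
      SELECT COUNT(DISTINCT o.person_id)
      FROM obs o
      WHERE o.concept_id = 1234
        AND o.value_coded = 5678
        AND o.obs_datetime >= :startDate
        AND o.obs_datetime <  :endDate
        AND o.voided = 0;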

  4. Add Export functionality

    A SQL-mapped dataset should be exportable in a specific format that can be re-imported into another OpenMRS instance. The export file should contain the following:

    1. available data value templates of the dataset

    2. mapped SQL queries for data value templates

    3. metadata of the dataset

    In my humble opinion, exporting the mappings between data value templates and report indicators doesn’t make sense, because report creation is a manual process.
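
    One possible (purely hypothetical) shape for the export payload, expressed as a sketch of the objects that would be serialized into the file:

      import java.util.List;
      import java.util.Map;

      // Illustrative structure for the export file; names are not an agreed format.
      class DataSetExport {
          DataSetMetadata metadata;        // 3. metadata of the dataset (code, name, period type, ...)
          List<String> dataValueTemplates; // 1. data value template codes available in the dataset
          Map<String, String> sqlMappings; // 2. data value template code -> mapped SQL query
      }

      class DataSetMetadata {
          String code;
          String name;
          String periodType;
      }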

  5. Import datasets using the exported file from the module

    This feature is extremely helpful in situations such as the COVID-19 pandemic. A central point of OMRS can provide the following for implementers:

    1. An import file for the module (generated using the module) that contains SQL mappings

    2. A standard dataset metadata import file that can be imported into a DHIS2 instance

    Implementers can easily import these files into the module and the DHIS2 instance; they are ready to send data with zero configuration in less than a minute.

  6. Import datasets using a connected DHIS2 instance

    DHIS2 has a friendly API that provides a wide range of facilities, including an API to import datasets directly. This feature helps avoid writing manual import files; it saves time and avoids the mistakes that can happen when writing an import file for the module by hand. If a central point of OMRS needs to provide an import file for multiple instances, all they have to do is the following:

    1. Create a DHIS2 dataset with standard codes

    2. Import it directly from the module using a single click within the UI

    3. Add SQL mappings

    4. Generate the export file for the dataset
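
    A rough sketch of step 2, fetching a dataset definition over the DHIS2 Web API (the server URL, credentials and dataset code are placeholders; the filter and fields parameters follow the documented DHIS2 metadata API):

      import java.net.URI;
      import java.net.http.HttpClient;
      import java.net.http.HttpRequest;
      import java.net.http.HttpResponse;
      import java.util.Base64;

      public class DataSetFetchSketch {
          public static void main(String[] args) throws Exception {
              // Placeholder connection details for a DHIS2 instance.
              String baseUrl = "https://dhis2.example.org";
              String auth = Base64.getEncoder().encodeToString("admin:district".getBytes());

              // Ask DHIS2 for the dataset with a given CODE, including its data elements.
              String url = baseUrl + "/api/dataSets.json"
                      + "?filter=code:eq:HIV_MONTHLY_SUMMARY"
                      + "&fields=id,code,name,periodType,dataSetElements[dataElement[id,code,name]]";

              HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                      .header("Authorization", "Basic " + auth)
                      .GET()
                      .build();

              HttpResponse<String> response = HttpClient.newHttpClient()
                      .send(request, HttpResponse.BodyHandlers.ofString());

              // The JSON body would then be parsed into the module's DataSet/DataElement mappings.
              System.out.println(response.body());
          }
      }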

  7. Map OMRS locations with DHIS2 Organisation Unit codes

    There should be a UI to save the DHIS2 Org Unit code for the available OMRS locations. This can be achieved either with an input box to enter the code manually or by using the DHIS2 API to fetch Org Units. Since there can be multiple DHIS2 connections, using an input box would be ideal.

    1. When we send data to DHIS2, we need to provide the specific code (or UID) of the DHIS2 Organisation Unit. Therefore, we have to store these OrgUnit mappings somewhere.

      This is achieved in different ways in the existing implementations:

      1. DHIS2 connector module
        You have to select both the OpenMRS location and the corresponding DHIS2 OrgUnit each time you post data.

      2. DHIS2 reporting module
        You can map locations using the Map Locations view. The module loads the available DHIS2 org units using the DHIS2 API. The respective code value is stored on the OMRS location as a location attribute. (The adx branch uses codes; the GSoC branch uses UIDs.)

      3. UgandaEMR

        The UIDs of the OrgUnits are hardcoded.

      Considering the above,

      • Asking the user to select the org unit each time seems redundant.

      • If we deal with multiple DHIS2 instances, using the DHIS2 API to fetch locations might not be practical.
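
    Whichever storage approach is chosen, at send time the module needs a simple lookup from the OMRS location to the stored DHIS2 OrgUnit code, along the lines of this sketch (the helper interface is hypothetical; the value itself would come from the location attribute described above):

      // Hypothetical helper: resolve the DHIS2 OrgUnit CODE stored against an OMRS location.
      interface OrgUnitMapping {
          /** @return the DHIS2 OrgUnit code for the given OMRS location uuid, or null if unmapped */
          String getOrgUnitCode(String omrsLocationUuid);
      }

      class AdxGroupBuilder {
          private final OrgUnitMapping mapping;

          AdxGroupBuilder(OrgUnitMapping mapping) {
              this.mapping = mapping;
          }

          String orgUnitFor(String omrsLocationUuid) {
              String code = mapping.getOrgUnitCode(omrsLocationUuid);
              if (code == null) {
                  // Unmapped locations should surface as an error rather than being sent with a blank code.
                  throw new IllegalStateException("No DHIS2 OrgUnit code mapped for location " + omrsLocationUuid);
              }
              return code;
          }
      }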

  8. Support multiple periods types

    There are 18 period types available in DHIS2 (including 4 alternative weekly types); refer to the DHIS2 documentation. The module should support all the available period types. In addition, the period selector in the post-dataset view should change according to the period type of the dataset. For example, if the period type of the selected dataset is Monthly, it should display a month picker. Furthermore, period values should be validated in both the frontend and the backend.
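
    A sketch of how the module could turn a period start date into the ADX-style ISO period strings used in the payload sketched earlier (only a subset of period types is shown; the type names and formats should be checked against the DHIS2 documentation):

      import java.time.LocalDate;

      class PeriodFormatter {
          // Builds an ISO 8601 "start/duration" period string as used by ADX,
          // e.g. 2020-03-01/P1M for March 2020. periodStart is assumed to be
          // the first day of the period.
          static String toAdxPeriod(LocalDate periodStart, String periodType) {
              switch (periodType) {
                  case "Daily":      return periodStart + "/P1D";
                  case "Weekly":     return periodStart + "/P7D";
                  case "Monthly":    return periodStart + "/P1M";
                  case "Quarterly":  return periodStart + "/P3M";
                  case "SixMonthly": return periodStart + "/P6M";
                  case "Yearly":     return periodStart + "/P1Y";
                  default:
                      throw new IllegalArgumentException("Unsupported period type: " + periodType);
              }
          }
      }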

  9. Push data to DHIS2 automatically

    The module should support automatic pushes as well as manual pushes. Setting this up should be as simple as adding a repeating event in Google Calendar.
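
    A minimal sketch of the automatic push, assuming it is wired into the standard OpenMRS task scheduler (the Dhis2Service name and its pushAllMappedDataSets() method are assumptions, not an existing API):

      import org.openmrs.scheduler.tasks.AbstractTask;

      // Scheduled task registered with the OpenMRS scheduler, e.g. to run after each period closes.
      public class Dhis2AutoPushTask extends AbstractTask {

          @Override
          public void execute() {
              // Hypothetical service call: push every mapped dataset for the most
              // recently completed period to its configured DHIS2 connection, e.g.
              // Context.getService(Dhis2Service.class).pushAllMappedDataSets();
          }
      }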

  10. Store multiple DHIS2 connections

    The module should be able to store multiple DHIS2 connections. (Assuming this is a functional requirement, based on the feedback received in previous calls.) All the connections should be stored in a separate table in the database; a rough sketch of such a record is shown after this list. This is essential for the auto-push option.

    1. Can be addressed with an interoperability layer / mediator such as OpenHIM

      1. How would we handle auto-pushing data in that case?
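
    A sketch of the kind of record such a table would hold (the field names are illustrative, and credentials would need to be stored securely):

      // Illustrative shape of a stored DHIS2 connection; one row per connection.
      class Dhis2Connection {
          Integer id;
          String name;      // label shown in the UI, e.g. "National HMIS"
          String baseUrl;   // e.g. https://dhis2.example.org
          String username;
          String password;  // should be stored encrypted, not in plain text
          boolean autoPush; // whether this connection participates in scheduled pushes
      }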

  11. Develop the UI using Microfrontends

    This is not a functional requirement at the moment. If we need to deliver this fast, we can still leverage the old JSP UI because it is almost complete. But in the future, I think it’s better to rewrite the UI using micro frontends to provide a better UI for the users.

    1. Data would need to be exposed through REST



Helpful Reading