March 2015 HIPNet Meeting Notes
In summary, HIPNet invited USAID Chief Data Officer Brandon Pustejovsky, Global Health Informatics Advisor Donna Medeiros, Digital Knowledge Asset Advisor John Liebhardt, USAID Public Affairs Advisor Jane Silcock, and USAID Evaluation Technical Advisor Amani Selim to attend the March 20 HIPNet meeting.
We covered the entire data cycle: data collection and submission under USAID's Open Data Policy (ADS 579), organizational data management and the growing opportunities for analysis, presenting findings based on evidence, and selecting the right USAID communication channel so that those findings get used. There was lively discussion, many questions, and solutions offered at the HIPNet table.
USAID Data Policy Panel
· USAID Data Policy, ADS 579, focuses on making data freely available and structured for analysis.
· HIPNet encouraged organizations to ask, “How do you want to leverage data as an asset?”
· ADS 579 is in line with President Barack Obama’s Executive Order – Making Open and Machine Readable the New Default for Government Information. By default, information paid for by USG is to be machine readable.
· Collaboration, adaptation, and learning cannot be done without data.
· We can explore how we leverage data together to benefit development efforts.
· ADS 579 requires data to be collected and submitted to the Development Data Library by USAID-funded mechanisms in place as of October 1, 2014. It does not apply to USAID mechanisms closed before October 1, 2014. The AOR/COR will determine allowable costs for data collection and submission.
· Answers to many questions can be found in USAID’s Open Data Policy Frequently Asked Questions (FAQ).
· Capture experiences implementing the data policy to share with the community.
· Take an organizational approach to data management.
· Increasingly connect people to data.
· Implementing the policy will be a challenge, though there are major opportunities for analysis and informed decision making down the road.
· IntraHealth first engaged with USAID’s Program Cycle upon being granted a new award.
· At project start-up, the contracts, digital assets, and monitoring and evaluation teams met with the program team to discuss data quality standards and how to handle mixed-method evaluations to develop a data management approach.
· The organization developed a process for assigning datasets a description, metadata, tags, licenses, data quality notes, machine readable formats, data types and supporting documentation.
· The organization is now defining its data governance approach, preparing for post-2015, and creating organizational policies.
· John’s background as a librarian prepared him well for data management.
· John found the USAID Thesaurus was extremely helpful for classifying datasets.
USAID Communication Channels and Advice on Presenting Findings
· USAID now accepts a pitch rather than a full story. The communication team will determine the best channel for the story – Impact blog, Frontlines, Transforming Lives, internal comms, social media, etc.
· If you would like to share your story through a specific channel, your Bureau of Global Health Communication Point of Contact can work with you to develop the story for that channel. A pitch must be sent first.
· Jane shared a new Impact Blog submission form. Blog posts should be written in the first person, express an opinion, and be authored by a subject-matter authority.
· Unique headlines are attractive, such as those starting with “Top 5…”
· Frontlines magazine is no longer in print; it is now electronic only. Feature stories continue to have a human-centered focus.
Amani Selim, USAID Evaluation Technical Advisor – Quality of Evaluation Reports (presentation)
· The 2011 USAID Evaluation Policy addresses the decline in the quality and quantity of evaluation at USAID.
· Quality of evaluation is important for Agency credibility and use of findings.
· The evaluation process is rigorous and should objectively evaluate what worked in the project, what did not and why.
· Information on all components to be included in evaluations was shared, including: the scope of work, an explanation of the evaluation methodology, and evaluation tools.
· Evaluation criteria can be followed more closely. For example: evaluation findings should be presented as analyzed facts, evidence, and data, not anecdotes; recommendations should be supported by a specific set of findings; and recommendations should be action-oriented, practical, and specific, with well-defined responsibility for the action.
· Amani opened the floor for discussion by asking, “How do we present the data?” HIPNet members shared their best practices, including:
o Involving data stewards by presenting the data to them and capturing their interpretation; and
o Data visualization.
Our next HIPNet meeting will be held in June. Stay tuned for the date. Please let us know if you have specific topics that you would like to hear about or would like to present!