How to Use Community-Level Data to Benchmark and Report Progress

July 2015

Educational attainment is a community-wide mission. Entire metropolitan areas and regions can enjoy both economic and civic benefits when more residents gain postsecondary education that prepares them to be valuable members of the workforce and society. Students engage with a variety of organizations and service providers as they prepare for college and career success, and no single provider can be responsible for the entire community’s education outcomes.

Community stakeholders are encouraged to share data about the outcomes of their services, programs, and initiatives with each other and with students and families. Open data promote accountability and action. As community stakeholders report data, everyone can study both the big picture and the details to learn how best to plug leaks in the education pipeline, use resources efficiently, and invest in underserved populations. With access to various data, like the retention and graduation rates of each local K-12 and postsecondary institution, and by using data from peer communities as benchmarks, communities can assess how they are doing, how far they have come, and how much further they need to go. Backbone organizations—the lead coordinating partners of local education initiatives—can take charge of convening partners to provide these data. Below are a few examples of tools that can be used to share community-level data:

Baseline Reports: Does your community want to let important stakeholders know where the community and its institutions currently stand in terms of moving various student populations through the education pipeline? These reports are often produced and disseminated at the start of a community-wide education initiative to share goals and timelines publicly (e.g., 60% of residents will have a postsecondary credential by 2020). Developing these reports can be an essential exercise in bringing community partners to the table to determine shared goals, metrics, and definitions. These reports also enable partners to agree on a path forward to address the areas for improvement highlighted in the report.

Progress Reports: Does your community want to share progress made on key community indicators? These annual reports provide stakeholders the opportunity to transparently identify areas that are doing well or need new approaches or resources. As tools to both celebrate progress and motivate partners to drive toward greater improvement, they help maintain momentum in the community work.

Dashboards: Does your community want to offer readily accessible information to students of all ages and community members? Dashboards are online tools that do just this. They may be updated and modified at any time, can take many forms, and share a variety of indicators per the specific goals and focus of the community. A community that wants to improve educational outcomes among its African-American or Latino residents may post a variety of indicators by race or ethnic group; meanwhile, a county that is concentrating its resources on promoting retention and completion at its community colleges may provide more detailed data on these institutions. Interactive dashboards are highly customizable based on the topic of interest or the informational needs of the user. They are often excellent sources of disaggregated and longitudinal data.

This section of the guidebook features an interview with a community leader in Spokane, Wash., who explains how their initiative’s baseline report has been used to build their community partnership on a foundation of open data and shared accountability. Another interview, with staff at 55,000 Degrees in Louisville, Ky., explains how they developed their interactive educational data dashboard and includes a Tableau software handout that explains how you can create a similar dashboard. Finally, this chapter ends with a list of additional resources where you can find more information on community-level data tools.

Spokane, Wash.: How to Promote the Sustainability of Your Attainment Initiative Using Baseline and Progress Reports

  • Amy McGreevy, M.Ed., Executive Director, EXCELerate Success

IHEP spoke with Amy McGreevy from EXCELerate Success, a new collective impact initiative in Spokane, Wash., about the organization’s baseline report and upcoming progress report. McGreevy says these reports have helped strengthen their partnership as local organizations and institutions establish shared data measurements and build a culture of trust and data-informed action. Read this interview to learn how these reports have made their partnership more sustainable and for advice on creating engaging reports with personalized stories from the community.

Goals

IHEP: EXCELerate Success published a baseline report in June of last year (2014). What did you hope to accomplish by developing this report?

The report provides us with a baseline for the community data indicators that concern the entire EXCELerate Success partnership. The report gives us a window into what’s going on in Spokane County as it relates to students’ retention and completion at the postsecondary level, as well as many other indicators in the cradle-to-career continuum. We have used the report and its information as a starting point with our partners to talk about what really needs to change in order to improve outcomes for our students.

We’ll release an update on our indicators every year in our annual report; as we learn and refine our measures and indicators, the reports will look different. I’m hoping that every year the reports will give us a picture of the strategies that our network is using, and how different institutions are increasing students’ retention and completion rates throughout their college careers.

IHEP: At this stage, who is the primary audience for your data reports?

It’s for the benefit of our partners as well as the wider community. Our partnership is still pretty young and we’re still establishing and implementing different strategies. We’re starting by building the case for the cradle-to-career work with our partners. Our goal for EXCELerate Success is for the individual community members to relate to our work so that they feel connected to the larger movement, which is centered on improving the lives of all families in Spokane County.

Partnership

IHEP: Why are data reports—or data in general—important for gaining buy-in for your partnership’s work?

You can’t un-know data. You can think or feel a certain way; you have an intuitive feeling about what is probably going on with students. But with data, we can pinpoint what is actually happening and compare opportunity gaps across different groups of students. Data are not a call to action in and of themselves—that’s key. Data can tell you something, but you need to make them work for you in order for them to do something.

For example, we look at the retention rate for community college students in Spokane County. I want to know how the retention rates are different for non-traditional versus traditional students, for low-income students, for first-generation students. That will allow us to understand where to focus our work. We have five very different postsecondary institutions in the community, with different delivery methods and models. We are able to look at what each institution is doing and see if one college is moving the needle a bit faster than another.

It’s beneficial for institutions to be able to see their own data through a different lens, as if from an outside perspective. They are trying to work on these issues internally, and this is a different way to approach it.

Data are not a call to action in and of themselves—that’s key. Data can tell you something, but you need to make them work for you in order for them to do something.

IHEP: What role do the data reports play in bringing partners to the table for this cradle-to-career attainment work?

In our postsecondary attainment group, one of the struggles we had early on was trying to determine what exactly we mean by “retention.” How will we report this information, and do we have a shared definition across institutions? That has been a great opportunity to bring people together, especially across the different institutions, and to be able to bring the community colleges and four-year institutions together at the table to use the same definitions. Our data currently tell us that if a student leaves the community college to go to a four-year institution but they didn’t finish their AA, they are considered a drop-out. This is a common problem with tracking students from community colleges. We were able to bring the different institutions to the table and ask, “How are we, in Spokane County, going to track this information?” We were able to establish a definition across the institutions that says if a student is retained between institutions, and we know about it and can track it, then that student is considered successfully retained, not a drop-out.
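To make the shared definition concrete, here is a minimal sketch in Python. The function and field names are hypothetical, for illustration only; the interview does not describe EXCELerate Success’s actual tracking system.

```python
# Illustrative sketch (not EXCELerate Success's actual system) of the shared
# retention definition described above: a student who transfers to another
# local institution, and can be tracked there, counts as retained.

def classify_student(enrolled_next_term_same_school: bool,
                     transferred: bool,
                     trackable_at_new_school: bool) -> str:
    """Classify a student's year-over-year status under the shared definition."""
    if enrolled_next_term_same_school:
        return "retained"
    if transferred and trackable_at_new_school:
        # Under the old, institution-by-institution definition, this student
        # would have been counted as a drop-out by the sending college.
        return "retained"  # retained between institutions
    return "not retained"

# Example: a community college student who moved to a four-year partner
print(classify_student(False, True, True))   # -> "retained"
print(classify_student(False, True, False))  # -> "not retained" (lost to tracking)
```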

What’s fantastic is that we were able to engage all the institutions, as well as many community-based organizations, in this process. We have CBOs that are working on counseling and coaching students through postsecondary education, so they can help with data collection, student tracking, and information sharing. The reports allow us to do exactly what collective impact is set up to do: align organizations along shared measures. Essentially, what we’ve done here is built a shared expectation, shared definition, and shared measures across partners.

IHEP: Are the data reports used for finding funding for the initiative?

Right now we’re trying to identify on-the-ground strategies to focus on. As we get closer to designing our strategies, and as we try to coordinate more resources, tools, and activities that are closer to the students themselves, we’ll talk to funders or different organizations about how to align resources, and data come in handy then. We’ll say our data are telling us to increase our retention rate by this much to see significant improvements, and we have identified strategies A, B, and C to do that. So we will need that institution or external organization providing support to align resources and funding to these three strategies. But it’s not so much about finding additional funding as it is about utilizing the resources and funding we have in better ways.

The reports allow us to do exactly what collective impact is set up to do: align organizations along shared measures. Essentially, what we’ve done here is built a shared expectation, shared definition, and shared measures across partners.

Implementation

IHEP: How do you decide which indicators to include in your report? Are you sharing every indicator EXCELerate Success is tracking, or are you choosing a selection to share?

We are in the process of deciding that for our upcoming report, the second report. I anticipate we will focus on the priority areas where we’ve already established networks. Our four priority areas where we will have more in-depth data analyses are: kindergarten readiness, reading at grade level, high school completion, and postsecondary attainment. We’ll still try to pair the data with stories. And for the indicators that do not have established networks, we will probably use more story-telling to talk about what efforts are underway in the community.

IHEP: Let’s talk more about storytelling as a tool. How do you pair your data with narrative stories in your report, and what does that achieve?

It’s wonderful to look at numbers, but if we don’t pair them with actual examples and stories about the impact on students and families, the numbers alone fall flat. We care about people, so we need to make sure that the data always relate to a personal story. In our report, you’ll end up seeing the data paired with examples of on-the-ground impact and how it’s engaging the community in a significant way. When you talk about, say, what a different type of coaching model means to a non-traditional student trying to return to school, telling that story makes it more real for the partners to see their individual impact, as well as the collective impact of the group.

It’s wonderful to look at numbers, but if we don’t pair them with actual examples and stories about the impact on students and families, the numbers alone fall flat. We care about people, so we need to make sure that the data always relate to a personal story.

IHEP: Do you want different partners to have different takeaways from your reports, or do you expect all partners to come away with the same message?

It’s a little bit of both. On a higher level, I’d like everyone to walk away with the same message, about the ways in which we are trying to influence change and movement along these specific indicators or outcomes. I want all the partners to understand the broader message of EXCELerate Success and its purpose, but also to understand their individual purpose within this work. It’s going to look different for different partners, for good reason.

For example, one university has a very specific role within the postsecondary attainment work because of what it does with first-generation, low-income students; students with disabilities; and its multicultural programs. It has great advising models and instructional models aimed at impacting those students we’re trying to help retain in postsecondary work. We would want them to come away with an understanding that these are the things they directly connect to and this is how they could position themselves within the scope of EXCELerate Success.

IHEP: Communicating data effectively is key; how do you present a report that people are going to want to look at?

You need to use data in a way that strikes people, first and foremost. Make sure the data are visually clean, easy to understand, and connected to some sort of personalized narrative or story. You also have to make sure you are telling a story throughout the report that connects what’s happening in kindergarten readiness to what’s happening at the postsecondary level. The challenge in getting people to invest in early childhood efforts is that results are not going to be seen until 16 to 20 years later. You can throw graphs and information into a booklet and that’s fantastic, but to get people to read it, they need to see their kids in it, and they need to see their community and their work in it. For me the test is, if I looked at this report at the doctor’s office, would I want to pick it up?

IHEP: What about communicating data to each organization or institution? How are they prepared for what will be published?

If we’re going to include data in the report saying something about a partner, we will present that information to them early on to be respectful. At the same time, we won’t exclude information just because it makes a partner uncomfortable. We do want to make sure that the information we’re releasing is appropriate and that it actually tells the right story. That is why we want everyone to agree on a common definition or measure, so that when data come out, we can interpret them and describe them in a better, more consistent way. We don’t want to trick people with data.

It’s very important to be transparent and honest about how you’re using and interpreting data. It builds trust. When groups and people are nervous about using data, it’s often because there isn’t a trusting relationship built with those people who are reporting or working with the data.

All of our partners on whom we are reporting are at our table. They’re on the leadership team, they’re in the networks. So really, they’re the ones who are probably going to help us pick out the data to report, and they understand that this is meant to serve a larger goal rather than to just represent their institution. Most of the information we’re presenting that is specific to a particular school district or institution is pretty widely and publicly known. It’s going to get a little bit tricky as we collect data that are more at a school level or neighborhood level. We need to make sure we are sensitive to the fact that we’re talking about real people and a real community.

We are going to have a struggle with our indicator for reading at grade level. Washington State moved to the Common Core State Standards, completely switching its assessment for third-grade reading. We are anticipating very bleak reports back from those data. Everyone is aware of that. We’re going to report the numbers anyway, but we’re also going to tell the story about all the work happening to make sure that next year improves.

IHEP: Speaking of challenges, what is the biggest challenge you confront in reporting data and how do you overcome it?

It’s really easy to just talk about data; it’s hard to move groups of people to act on them. I think that is especially true in the postsecondary world, because we’re always afraid that we don’t have all the data. It’s like how we’re afraid we don’t always have all the right people at the table. No, we don’t always have all of the right people and all of the right information, but we do have lots of the right people and lots of the right information. We’re really trying to build a space in our community for people to take courageous leaps of faith, to make systemic changes that go against our comfort zones. At a certain point, we just have to act on the best knowledge and with the best people that we have at hand.

If we’re not a little bit uncomfortable, we’re not changing or growing. We should be working with what we know, but we also have to push ourselves a little. We build these invisible walls, and if we don’t break those down, we aren’t going to accomplish what we really want.

We’re really trying to build a space in our community for people to take courageous leaps of faith, to make systemic changes that go against our comfort zones. At a certain point, we just have to act on the best knowledge and with the best people that we have at hand.

Louisville, Ky.: How to Report Data to Communities through an Interactive Data Dashboard

  • Mike Kennedy, Technology and Data Manager, 55,000 Degrees
  • Lilly Massa-McKinley, Ed.D., Senior Director of Project Management, 55,000 Degrees

IHEP spoke with Mike Kennedy and Lilly Massa-McKinley from 55,000 Degrees in Louisville, Ky., about their interactive education data dashboard to learn how communities could develop similar tools. 55,000 Degrees is a public-private partnership with a mission to increase the number of postsecondary degrees in Louisville by 2020. Kennedy and Massa-McKinley provided background on their dashboard, describing where they have obtained data over time and the culture within both 55,000 Degrees and Louisville that promotes data transparency and a growing thirst for information. Read this interview to learn why they use the Tableau software platform for their dashboard and which features make it useful for their work.

Goals

IHEP: What were you trying to achieve for 55,000 Degrees by developing this interactive online data dashboard?

We had more data than we could fit in a progress report, and we wanted to make them available for the community to use in an intuitive fashion. The data are disaggregated in many ways: by demographics, by school, by year, and so on. We wanted to make sure different groups and issues could be explored, and to hold schools accountable and sometimes celebrate them. Our partners love the dashboard. This year we linked it through the progress report, so we’re trying to bring more awareness of it to other community leaders and organizations.

IHEP: Could you please describe the overall culture toward reporting data in Louisville?

Initially there was some resistance to the idea of the dashboard from different institutions, but now they’re on board with it. Our city is pretty big on open data. Our mayor is big on putting all of the metro government’s data out there. We have had different conversations about how we can get all of the community data in one place so different people can access them. Most of the data we’re publishing are public to begin with; it’s been easier for journalists around here to write stories when they don’t have to go digging quite as far. I’ve sent links a couple of times to a local newspaper here, and I’ve seen the dashboard link embedded in a post from Public Radio.

Our executive director, Mary Gwen Wheeler, has been so supportive of using data to drive action and using them to align strategies. Our organization has always been all about data— even the name of our initiative is a number. If leadership isn’t interested in data, are they really going to invest the time it takes? Are they going to be patient with the things you need? These are questions communities have to ask themselves.

Partnership

IHEP: How did your partners react to the development of this dashboard as a data tool?

At first, our board of directors and different partners had concerns about this dashboard. They thought an institution would be singled out, or that if you could see every single high school’s college- and career-readiness indicators, it would make some schools look really bad. It would raise awareness of how poorly we were doing in some schools and how well in others. There was a lot of concern initially that there would be huge community outrage toward some of our partners who were at the table, working hard. But Dr. Dan Ash, who was our director of research and data analysis at 55,000 Degrees, was committed, saying we had to get these data out there and everything had to be transparent. If we looked at the data more, we’d be able to rally more support from people who care about helping these high schools because they’d become aware of the disparities.

While there was fear from some of our education partners, our business community did not back down. They argued we had to make these data public because only then would we have accountability and know where we need to focus efforts. So there was a split among our partners and the board, but we decided to move forward and, ultimately, it has not been used as a tool to point fingers. It has been used to identify solutions and significant areas where we need to focus our efforts.

IHEP: Once you decided you wanted a dashboard, how did you start developing it?

Early on, Dr. Dan Ash and I [Kennedy] did most of the brainstorming for the dashboard. First we curated our data to see what we had, what we wanted but didn’t have, and which variables were available. We mapped everything out, drawing pictures as we went. One of the big questions we had starting out was about drilling down into the data. You could put a million different variables in multiple-choice filters on these visualizations in the dashboard, but the user could be easily confused. We wanted to limit the different ways you could drill down at certain times, so we looked at other sites that had dashboards, such as The New York Times and The Chronicle of Higher Education. We looked at how they set up their filters, and we made decisions about how many schools you could view at one time, how many races, things like that.

IHEP: Who is involved in putting together and reviewing the dashboard?

Every year, I [Kennedy] will put together a draft of the different indicators and show it to Lilly [Massa-McKinley] as it goes along; we both do analyses for the progress report. I’ll show it to the rest of the 55,000 Degrees staff periodically. Mary Gwen Wheeler has a lot of good feedback and asks questions like whether we can show the data in different ways or find more data, and double-checks if they’re right. It’s a very collaborative process. We also send the data once a year to our data committee, made up of higher education data folks and other data people from around Louisville. We ask them to check their institutions’ data and get their feedback. We also get feedback from the board of directors a couple of weeks before we release the data to the public.

At first, our board of directors and different partners had concerns about this dashboard. They thought an institution would be singled out, or...it would make some schools look really bad...We had to get these data out there, and everything had to be transparent. If we looked at data more, we’d be able to rally more support from people who care about helping these high schools because they’d become aware of the disparities.

Implementation

IHEP: Tell us about the software platform you use for your dashboard, Tableau.

The first year that Dan and I [Kennedy] were working with data for the organization, we used some SAP software, Xcelsius, to do the dashboard. It was okay, but limited. I had to get creative with different things to make it do what I wanted, and then export it into Flash, so it wasn’t mobile-friendly. We started exploring other options, but we were primarily drawn to Tableau because of costs. We use Tableau Public, completely free of charge. I started playing around with it and found that it was pretty intuitive to use. There’s a good online community as well with lots of different users, so if you have a question about something, odds are that someone else has already asked and answered it on the message board.

Tableau became a publicly traded company, and it continues to update its software very frequently. Every version is better than the last. It’s really great. We put all our data in Microsoft Excel and then use the free version of Tableau. It’s all public data. It should be noted that Tableau Public does not allow you to link to a database, and you can’t hide your data. Neither of those limitations affects us, so we use the free version without problems.

IHEP: What sort of software programs do you use along with Tableau?

Tableau has a plug-in for Excel, just called the “Tableau plug-in.” With the click of a button, it reformats the data in your spreadsheet into a Tableau-friendly format. I [Kennedy] highly recommend using it; I’ve found it makes things easier. Also, as we’re designing the dashboard, all the visualizations are embedded in a WordPress website, so we can decide how we want to display all the data. It takes a while to make sure all the fonts and colors are consistent. But once your template is set up, the process of incorporating data from new sources each year becomes quicker.

IHEP: When the first dashboard launched in 2011, how long did it take to complete? And how long does it take to complete an update now?

The first one took a really long time, because we were pulling all the data for the first time. A big part of this is setting up your spreadsheets in the right way so that the data are easily visualized. A spreadsheet that looks good to a human is probably going to look terrible to whatever software you’re using, and vice versa. So there’s a bit of a learning curve around that.
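As an illustration of that learning curve, the sketch below uses Python’s pandas library to perform the same wide-to-long reshaping that the Tableau plug-in for Excel automates. The column names and figures are hypothetical.

```python
# A minimal sketch of the "human-readable to software-friendly" reshaping
# described above, using pandas. Column names and values are invented.
import pandas as pd

# Wide layout: easy for people to read, hard for visualization software.
wide = pd.DataFrame({
    "high_school": ["Central", "Eastern"],
    "2012": [0.58, 0.64],
    "2013": [0.61, 0.66],
    "2014": [0.63, 0.69],
})

# Long ("tidy") layout: one row per school per year, which tools like
# Tableau can filter and chart directly.
long = wide.melt(id_vars="high_school", var_name="year",
                 value_name="college_going_rate")
print(long)
```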

Nowadays our main analysis work takes place August through October, ideally. Mike mocks up some work in Tableau. We do a lot of writing. We work with our graphic designers in October and November and then have a release in December. But you have to start sending your data requests far in advance. You can’t get those out too early. In June we may resubmit our data request to some state agencies and then hope to hear back from them by October. Mike makes drafts in WordPress, then spends a little over one week making the dashboard each year. Then you have to build a few days into your timeline to get feedback from everyone. We are really trying to report the freshest data possible every year, but even IPEDS doesn’t tell you when things are going to come out. For example, their net price and enrollment data came out toward the end of November, and we had a December 2nd release date, so we scrambled to get that information in the progress report.

You have to start sending your data requests far in advance. You can’t get those out too early.

IHEP: Where do you find the data that you show on the dashboard?

If you compare the very first report we released to the most recent one, it’s night and day as far as the indicators we use and our analysis of them; that happens through practice and continuous improvement. We have a couple of very important data sources, such as the Census and IPEDS. We also get so much data from the Kentucky Higher Education Assistance Authority (KHEAA). They have access to high school data and connect them to college data, as well as to FAFSA and financial aid data. They are also the keepers of the summer melt data, meaning college intenders versus attenders. KHEAA can even provide us with data from our Catholic high schools, which don’t participate in the National Student Clearinghouse and so wouldn’t otherwise know their own college-going rates.

KHEAA gives us FAFSA data by zip code, FAFSA completion by ACT score range, college-going by ACT range, college-going by zip code. We choose to publish college-going by high school and ACT by high school, which come from the Kentucky Department of Education. We don’t publish FAFSA data on the dashboard, but we use them for our action networks around high school to college transition. Some of the data we get are more for internal use; there’s a lot more to the story than a number would really tell.

The more data you have, the more questions you want to ask, which leads to requesting even more data and asking more questions. The data we have now have been built up over four years of asking questions.
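For communities starting this kind of question-asking, one of the sources named above, the Census, can be queried programmatically. The sketch below is an illustrative example, not part of the 55,000 Degrees workflow; the API endpoint, table codes, and geography codes are assumptions that should be verified against current Census Bureau documentation.

```python
# Illustrative sketch of pulling one attainment indicator from the Census
# Bureau's American Community Survey API. Endpoint, table, and county codes
# below are assumptions to verify against Census documentation.
import requests

# B15003 is the ACS educational-attainment table for the population 25+;
# B15003_001E is the total and B15003_022E is "Bachelor's degree".
url = "https://api.census.gov/data/2015/acs/acs1"
params = {
    "get": "NAME,B15003_001E,B15003_022E",
    "for": "county:111",   # Jefferson County (Louisville), assumed FIPS 111
    "in": "state:21",      # Kentucky
}

resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()
header, row = resp.json()  # API returns a header row plus one data row
record = dict(zip(header, row))
share = int(record["B15003_022E"]) / int(record["B15003_001E"])
print(f"{record['NAME']}: {share:.1%} of adults 25+ hold a bachelor's degree")
```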

IHEP: What sort of challenges have you encountered with attaining or using data for the dashboard and how have you overcome them?

We want to know if students are being retained by other institutions, not just the ones they started at, so we’re thinking of ordering our own subscription to the National Student Clearinghouse StudentTracker data so we can ask more questions about retention. We also have had issues with the college graduation indicators because IPEDS only reports on first-time, full-time students, so we are hoping a National Student Clearinghouse subscription will help with that issue as well.

Definitions from our data sources change over time. The state changed its formula from an averaged freshman graduation rate to a cohort graduation rate, so the numbers jumped about 10 percent. Another example is that Ivy Tech Community College used to report campus-specific data to IPEDS, but now it only reports system-wide data. Ivy Tech South Central Campus is a huge player in Louisville’s attainment efforts, so we have to have its data. We worked with the institutional research department at that campus to get it, but sometimes their definitions and formulas are a little different from what IPEDS is using.
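The jump makes sense once the two formulas are written out: they divide graduates by very different denominators. The sketch below uses NCES-style definitions with invented enrollment figures; it is illustrative, not the state’s actual calculation.

```python
# Why switching graduation-rate formulas moves the numbers: the two rates
# divide graduates by different denominators. All figures are invented.

def afgr(diplomas, gr8_enroll, gr9_enroll, gr10_enroll):
    """Averaged freshman graduation rate: diplomas over an *estimated*
    freshman class (the average of grade 8, 9, and 10 enrollment in the
    years that class passed through those grades)."""
    return diplomas / ((gr8_enroll + gr9_enroll + gr10_enroll) / 3)

def acgr(diplomas_in_4_years, first_time_9th, transfers_in, transfers_out):
    """Four-year adjusted cohort graduation rate: on-time graduates over
    an *actual* tracked cohort, adjusted for students who move."""
    return diplomas_in_4_years / (first_time_9th + transfers_in - transfers_out)

# The same graduating class can score several points differently under
# the two formulas, because the denominators measure different populations.
print(f"AFGR: {afgr(800, 950, 1100, 1000):.0%}")  # ~79%
print(f"ACGR: {acgr(780, 1100, 60, 40):.0%}")     # ~70%
```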

IPEDS has also used different definitions of race over time. It has used three different definitions of “African American” over the past decade. That’s where we worked with our data committee. They’re the ones reporting data to IPEDS, so they understand the nuances, and we talk about whether we should put the new data on a different chart, separate from the previous figures, or write a note in the dashboard to explain the change. Sometimes we just can’t get certain indicators anymore and have to stop including them in our report and dashboard. We take what we can get.

Impact

IHEP: How is the dashboard being used?

The dashboard is a living document. The more data you put out, the more questions you have. For example, about a year after we launched the dashboard, the African American community launched an initiative called 15K Degrees, declaring that 15,000 of the 55,000 degrees will come from the African American community. This underscores the importance of race-based data, so we added race for every available indicator. As the African American community tries to figure out how it is doing as a population, and as 55K advances and tries to figure out where to focus our efforts, it’s become useful to have the educational attainment rate available in every conceivable way. We can break it out by age group, gender, and so on, and see who is really doing well and who isn’t.
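A minimal sketch of this kind of disaggregation, using Python’s pandas library with hypothetical column names and invented figures:

```python
# Breaking one attainment indicator out by race, age group, and gender,
# so an initiative like 15K Degrees can track its own population.
# Column names and data are hypothetical.
import pandas as pd

residents = pd.DataFrame({
    "race":       ["Black", "Black", "White", "White", "Black", "White"],
    "age_group":  ["25-34", "35-44", "25-34", "35-44", "25-34", "25-34"],
    "gender":     ["F", "M", "F", "M", "M", "F"],
    "has_degree": [1, 0, 1, 1, 1, 0],
})

# Attainment rate for every race x age x gender cell; the same table
# can feed one dashboard filter per dimension.
rates = residents.groupby(["race", "age_group", "gender"])["has_degree"].mean()
print(rates)
```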

We also talk a lot about peer cities, and how Louisville is doing in comparison. That’s always been a big question, and I had some time to tackle it at one point last year. I pulled the educational attainment rates for our fifteen competitor cities and put them all into Tableau. I sent that around internally to different people, including the Mayor’s office. Enough people liked it and said we should put it on the dashboard, so we did.

Looking Forward

IHEP: How would you like to see the dashboard used in the future?

We wish that more community organizations, after-school providers, and parents would use the dashboard. The progress report is the easy-to-understand piece, and we see community organizations and others citing data from our report all the time. They need to use the dashboard to look at the specific populations and schools they serve; otherwise they are only talking about aggregate data. Maybe there is a barrier because they can’t print it out. There is a significant amount of community education work that needs to go along with creating this kind of dashboard so that it can be used in all possible ways. We have focused on making our progress report accessible to the community, and maybe next year we need to look more at getting the dashboard in front of more groups and helping them to be users and consumers of data in a more orderly, intentional way.

Creating an Educational Data Dashboard in Tableau

Additional Resources

Bring on the Data: Two New Data Tools from Strive (2012: StriveTogether)

This brief demonstrates how communities can report data online through the Community Impact Report Card and Student Success Dashboard (SSD) tools. The Community Impact Report Card presents easily understandable indicators to track population-level outcomes and progress toward community goals. The SSD integrates academic and non-academic data across multiple systems to facilitate the tracking of collaborative efforts, supporting continuous improvement, evaluation, and research.

Partnerships for College Readiness (2013: Annenberg Institute for School Reform)

Using collaborations around data in the College Readiness Indicator Systems project as a basis, this report discusses how school districts, postsecondary institutions, and community-based organizations have built partnerships to improve college readiness. It examines the emergence of community-led umbrella organizations involving CBOs, elected leaders, philanthropy, and business. It takes a closer look at such organizations in New York, Boston, and Dallas and examines common challenges and lessons learned for effective partnerships.

Postsecondary Data Resource List (2015: Institute for Higher Education Policy)

IHEP’s Postsecondary Data Collaborative, or PostsecData, has compiled an extensive list of resources that will be of use to anyone interested in accessing or better understanding postsecondary data. This resource list contains dozens of examples of dashboards and documents that communities and initiatives around the country have used to report data.

Using Data to Advance a Postsecondary Systems Change Agenda (2013: OMG Center for Collaborative Learning)

This issue brief shares lessons learned from the Bill & Melinda Gates Foundation’s Community Partnerships portfolio, whose communities developed and implemented multi-sector strategies in place-based initiatives to raise the number of low-income students with a postsecondary degree or credential. Lessons revolve around building relationships and structures to support data use and interpretation; disaggregating data; targeting data; and making use of a wide range of data skills across partners.