Understanding the Framework screen
The Framework screen is useful for exploring data that describe a specific geographic area, or community. It allows users to quickly understand how an area performs across a series of data indicators relative to a benchmark. Data are organized by frameworks, so users can see at a glance how an area performs within a specific domain or category, then use the arrow icons to drill down into the list of individual indicators that make up each domain and see how the area scores on each one. Toggle between frameworks at the top of the screen to explore data organized by different domains and get a fuller picture of conditions in a given area.
The Framework screen shows values for the selected area, benchmarks for comparison, and a z-score analysis output for each domain and indicator. The value column shows the score for the selected geographic area, and the benchmark column shows the score for the selected benchmark. The score column shows a “fuel gauge” visualization of how the selected area performs relative to the selected benchmark. IP3 | Assess uses a z-score approach to scoring individual indicators and data across domains; the “fuel gauge” visualizations used throughout the platform depict z-scores relative to the selected benchmark. This approach allows an "apples to apples" comparison of data from a variety of sources, with a variety of units and collection methods.
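For readers who want to see the idea in miniature, the following Python sketch shows a conventional z-score calculation: the number of standard deviations an area's value sits from a benchmark average. This is only an illustration of the general approach; the function name, inputs, and any weighting the platform actually applies are assumptions, not IP3 | Assess's implementation.

def z_score(area_value: float, benchmark_mean: float, benchmark_std: float) -> float:
    # How many standard deviations the area's value is from the benchmark average.
    # A conventional z-score sketch only, not the platform's actual code.
    if benchmark_std == 0:
        raise ValueError("benchmark standard deviation must be nonzero")
    return (area_value - benchmark_mean) / benchmark_std

# Example: an area value of 12.5 against a benchmark averaging 10.0 with a
# standard deviation of 2.0 gives z = 1.25, i.e., 1.25 standard deviations above.
print(z_score(12.5, 10.0, 2.0))

Because every indicator is expressed on this common standard-deviation scale, values with different units (dollars, percentages, counts) become directly comparable.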
The fuel gauges show bright red* if an indicator or domain scores significantly worse than the benchmark (i.e., the z-score is more than 1 standard deviation worse than the benchmark), light red* if the z-score is worse than the benchmark but by less than a full standard deviation, light green* if the z-score is better than the benchmark but by less than a full standard deviation, and bright green* if an indicator or domain scores significantly better than the benchmark (i.e., the z-score is more than 1 standard deviation better than the benchmark).
In this way, users quickly get a clear idea of how an area performs for a specific data indicator or across a domain compared to a benchmark, simplifying the interpretation of data across the platform. (Learn more about the scoring methodology here.)
*A note on colorblindness: We endeavored to select color shades and saturations that could be differentiated by those with colorblindness. However, we recognize that for many, red and green are difficult to discern. The boxes themselves are another way to interpret the results:
A box fully shaded to the left of the centerline represents a score that is significantly worse (i.e., by more than 1 standard deviation) than the benchmark score.
A box partially shaded to the left of the centerline means the z-score is worse than the benchmark, but by less than a full standard deviation.
A box fully shaded to the right of the centerline represents a score that is significantly better (i.e., by more than 1 standard deviation) than the benchmark score.
A box partially shaded to the right of the centerline means the z-score is better than the benchmark, but by less than a full standard deviation.
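Putting the color scheme and the box shading together, a minimal Python sketch of the mapping from a z-score to a gauge state might look like the following. It assumes the z-score has already been oriented so that positive values mean better than the benchmark; the function name, labels, and return structure are illustrative assumptions rather than the platform's actual code.

def gauge_state(z: float) -> dict:
    # Map an oriented z-score (positive = better than the benchmark) to a gauge state.
    # Illustrative sketch only; thresholds follow the 1-standard-deviation rule above.
    side = "right" if z > 0 else "left"         # better scores shade right of the centerline
    fill = "full" if abs(z) > 1 else "partial"  # more than 1 standard deviation = fully shaded
    if z > 1:
        color = "bright green"   # significantly better than the benchmark
    elif z > 0:
        color = "light green"    # better, but within 1 standard deviation
    elif z > -1:
        color = "light red"      # worse, but within 1 standard deviation
    else:
        color = "bright red"     # significantly worse than the benchmark
    return {"side": side, "fill": fill, "color": color}

# Example: z = -1.4 -> {'side': 'left', 'fill': 'full', 'color': 'bright red'}
print(gauge_state(-1.4))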