Measuring What Matters — Duncanville Arts Foundation
Amplifying What Matters

Southwest Dallas County has no local arts investment data. We are building it.

National research on arts and culture investment is extensive. Local data, specific to a municipality, grounded in behavioral measurement, and reproducible at the activation level, is nearly absent. In Southwest Dallas County, it does not exist at all.

Duncanville has approximately 40,000 residents, no established arts infrastructure, and an estimated $31 million in annual entertainment expenditure that exits the city in the absence of local programming sufficient to retain it. The Duncanville Arts Foundation was established to generate the first structured evidence of whether that condition is reversible, and under what program conditions.

At the end of 24 months, the evidence points in one of two directions.

24-Month Research Outcomes
Path A
If resident spending patterns support independent operation
Graduated programs transition to permanent commercial placement throughout Duncanville. Years three and four scale what the data confirms works.
Self-sustaining arts economy
Path B
If interest is present but buying patterns cannot sustain independent operation
The Foundation will say so plainly and recommend the gap be addressed through public funding. Demonstrated demand that cannot yet sustain itself commercially is a different policy conversation than a community that has never been measured.
Evidence-based case for public investment

Either outcome advances Duncanville.

Cultural Investment Strategy

The work of the next two years.

Most of Duncanville's cultural investment is invisible to the people who shape its future.

The $31 million figure is a conservative baseline derived from federal household expenditure data applied to Duncanville's population profile. The derivation is shown in full below. The BLS entertainment category is broader than the CIS-eligible share; live events, dining experiences, and cultural programming represent a subset of that total. The strategy rounds down from the unrounded figure of $33,127,875, applying a 6.4% conservative buffer (BLS CEX 2024; U.S. Census Bureau, ACS 2019–2023; Derived Calculation).

The Cultural Investment Strategy operates on a single governing principle: 100% pre-commitment of projected costs before any activation proceeds. Programs that achieve full validation proceed. Programs that do not achieve full validation do not proceed. Every program that enters the pipeline receives structured development support regardless of validation outcome (Cultural Investment Strategy, Version 2.0, May 2026).

CII scores are comparable across activations only if the conditions under which data is collected are held constant. The Foundation maintains a pre-qualified roster of program managers assessed against a uniform scope of work before engagement. Variation in data collection personnel between measurement occasions — known as instrumentation threat — introduces an alternative explanation for observed differences in outcomes, undermining comparability (Campbell and Stanley, 1963; Shadish, Cook, and Campbell, 2002).

$31M
Projected annual entertainment expenditure for a city of Duncanville's size. Source: BLS Consumer Expenditure Survey, applied to Duncanville population profile.
How the baseline is derived
01
Duncanville households: 13,385 (U.S. Census Bureau, ACS 2019–2023)
02
Income adjustment ratio: 0.685 ($71,381 local median income ÷ $104,207 national average, BLS CEX 2024)
03
Entertainment share of total expenditures: 4.6% (BLS Consumer Expenditure Survey, 2024)
04
Per-household annual entertainment spending: $2,475 ($78,535 national per-household total expenditures × 0.685 = $53,796 income-adjusted; $53,796 × 4.6%, BLS CEX 2024)
05
Total pool: $33,127,875 (13,385 households × $2,475). Strategy rounds down to $31M, a conservative buffer of 6.4%.
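The five steps above can be verified with a few lines of arithmetic. This sketch uses only the figures cited on this page and in the derived-calculation bibliography entry (including the $78,535 national per-household total expenditure figure); it introduces no new data:

```python
# Baseline derivation, using the figures cited in this strategy
# (ACS 2019–2023 household count; BLS CEX 2024 income and expenditure data).

HOUSEHOLDS = 13_385            # Duncanville households (ACS 2019–2023)
LOCAL_MEDIAN_INCOME = 71_381   # Duncanville median household income
NATIONAL_AVG_INCOME = 104_207  # national average income before taxes (BLS CEX 2024)
NATIONAL_EXPENDITURES = 78_535 # national per-household total expenditures (BLS CEX 2024)
ENTERTAINMENT_SHARE = 0.046    # entertainment share of total expenditures

ratio = LOCAL_MEDIAN_INCOME / NATIONAL_AVG_INCOME                   # ≈ 0.685
adjusted_expenditures = NATIONAL_EXPENDITURES * ratio               # ≈ $53,796
per_household = round(adjusted_expenditures * ENTERTAINMENT_SHARE)  # $2,475
total_pool = HOUSEHOLDS * per_household                             # $33,127,875

buffer = (total_pool - 31_000_000) / total_pool                     # ≈ 6.4%
print(f"ratio={ratio:.3f}, per-household=${per_household:,}, "
      f"pool=${total_pool:,}, buffer={buffer:.1%}")
```

Running the sketch reproduces the $2,475 per-household figure, the $33,127,875 unrounded pool, and the 6.4% conservative buffer behind the $31M headline.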
Investment Pipeline

Five stages. One validation gate.

Every program that enters the Foundation's consideration passes through a five-stage pipeline. Development support is available to every proposer from the moment of intake, regardless of whether the program ultimately achieves validation. The validation gate is the structural control: programs that achieve 100% pre-commitment of projected costs proceed to activation. Programs that do not achieve 100% pre-commitment do not proceed. There are no exceptions (CIS v2.0, Section 5).

01
Stage One
Intake
Proposer submits a concept. Intake is rolling. There are no quarterly deadlines. Foundation acknowledges receipt and assesses concept viability.
02
Stage Two
Develop
Proposer completes five workshops covering audience development, pricing strategy, pre-sales execution, production planning, and financial management. One-on-one consulting is included.
03
Stage Three
Validate
Proposer pursues 100% pre-commitment of projected costs through ticket sales, sponsor commitments, or vendor deposits. Programs that achieve full commitment proceed. Programs that do not, do not.
04
Stage Four
Activate
Validated program launches at Arts Junction. Four activation data streams are collected: ZIP code distribution, substitution survey, adjacent business impact, and repeat attendance. Together with the pre-commitment record from validation, these supply all five CII factors.
05
Stage Five
Graduate
Programs demonstrating consistent CII performance across multiple activations become candidates for permanent placement across Duncanville's commercial inventory.
Validation Gate
Programs achieving 100% pre-commitment of projected costs proceed to activation. Programs that do not achieve 100% pre-commitment do not proceed. Failed validation attempts are analyzed and findings are returned to the proposer. Development support continues regardless of outcome.
Operational Base

Arts Junction at 202 West Center Street.

Arts Junction is located within Old Rail Station, a 3.35-acre mixed-use campus at 202 W. Center Street in downtown Duncanville. The campus includes approximately 4,000 square feet of event-capable space with full-service kitchen access, 8,400 square feet of flexible retail and office space, and outdoor patios and gathering areas. Existing tenants include a coffee house, fitness studios, dining, and professional services. All Year One CIS activations occur at Arts Junction. Arts Junction provides a known venue with fixed, predictable costs, which simplifies production planning and financial modeling during the development and validation stages. Whether a program can achieve full pre-commitment against those costs is a central test of the validation gate (CIS v2.0, Section 3.2; LoopNet, 2025).

Schedule a Visit
Address
202 West Center Street, Suite 101
Duncanville, TX 75116
Campus
3.35-acre mixed-use development
Old Rail Station
Event Space
~4,000 sq ft with full-service kitchen
Flex Space
8,400 sq ft retail and office
For Proposers

What participation produces.

The Foundation is asking proposers to meet a high bar: 100% pre-commitment of projected costs before a single activation proceeds. That requirement exists because the Foundation operates without speculative risk. In exchange, proposers receive something that independent arts producers rarely have access to: structured development infrastructure, a controlled venue with no upfront lease obligation, and validated demand data from their own activation (CIS v2.0, Section 5; Appendix A).

Development Support
Five structured workshops covering audience development, pricing strategy, pre-sales execution, production planning, and financial management. Each workshop produces an operational deliverable applied directly to the proposer's concept. One-on-one consulting is included throughout. Support continues regardless of whether validation is achieved.
Venue Access
Arts Junction at Old Rail Station provides approximately 4,000 sq ft of event-capable space with full-service kitchen access, outdoor patios, and adjacency to commercial tenants. The Foundation provides venue access and operational support for every validated activation. Proposers test concepts in a controlled environment before assuming any lease obligation.
Demand Data
Every activation produces a scored CII Worksheet: attendance figures, ZIP code distribution of the audience, substitution survey results, adjacent business lift, and a repeat participation rate. This is documented behavioral evidence of demand. It is the argument a proposer needs when approaching property owners, lenders, or grant funders about permanent placement.
Validation Analysis
Programs that do not achieve 100% pre-commitment do not activate. They do receive a written analysis of why: what the campaign data shows, where commitment stalled, and what the findings indicate about format, pricing, or audience targeting. This is market research that independent producers cannot otherwise obtain.
Graduation Pathway
Programs that achieve consistent CII performance across multiple activations become candidates for permanent placement in Duncanville's commercial inventory through one of three pathways: Arts Junction Residency, Duncanville Placement, or Foundation-supported Independent Establishment. The CII data is the underwriting argument.
Certificate
Pipeline participants who complete all workshops, submit a portfolio at Proficient standard (85+), and achieve a CII score of 50 or above on a minimum of one activation earn the Cultural Activation Producer Certificate. The certificate is a transferable professional credential issued by the Foundation, specifying track, date, and CII score.
Submit a Concept
Rolling Admission
How to Enter the Pipeline
The Foundation accepts program concepts on a rolling basis. There are no quarterly deadlines. Submitting a concept does not commit the proposer or the Foundation to any financial obligation. It begins a development process.
Disciplines
Performance, visual arts, culinary, experimental, and other formats considered.
Venue
All activations take place at Arts Junction at Old Rail Station, 202 W. Center Street, Duncanville. The Foundation provides venue access and operational support.
Requirement
100% pre-commitment of projected costs before any activation proceeds. This requirement is absolute.
Submit
Contact ron@duncanvillearts.org to request the intake form. The form initiates the process. Incomplete answers are acceptable at intake.
Professional Development
Cultural Activation Producer Certificate
A professional credential issued by the Foundation verifying competency across five operational domains. Available through two tracks.
Pipeline Track
Five workshops completed, portfolio at Proficient standard (85+), and CII score of 50 or above on a minimum of one validated activation.
Open Enrollment
Same workshop and portfolio requirements. Activation performance is not required. Signals readiness to enter the pipeline or operate independently within Duncanville's cultural ecosystem.
Measurement System

Four indicators of revenue retention.

The validity of each indicator depends on consistent execution across every activation. The Foundation maintains a pre-qualified roster of program managers operating under a uniform scope of work, ensuring that data collection conditions are held constant and that results are comparable across programs and disciplines.

01: Geographic

ZIP Code Distribution

Purchaser ZIP code captured at point of sale through the ticketing platform. Geographic participation is mapped by activation to determine what share of attendees are Duncanville residents (ZIP codes 75116 and 75137) and at what rates specific neighborhoods participate.

02: Behavioral

Substitution Survey

A standardized post-event survey administered to all attendees asks: "What would you have done tonight if this event did not exist?" Response categories capture whether attendees substituted a local experience for one they would have traveled outside Duncanville to find, or whether the event created new demand entirely.
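A minimal sketch of how the substitution share could be tallied from survey responses. The two out-of-city response categories are those named in the Substitution Signal data description; the remaining categories and all counts are hypothetical:

```python
# Hypothetical tally of substitution-survey responses. The substitution
# share is the fraction of respondents who would otherwise have spent
# their entertainment dollars outside Duncanville.

responses = {
    "Attended a similar event outside Duncanville": 34,
    "Attended a different type of entertainment outside Duncanville": 21,
    "Would have stayed home": 30,   # hypothetical category
    "Other local activity": 15,     # hypothetical category
}

out_of_city = [k for k in responses if "outside Duncanville" in k]
total = sum(responses.values())                                  # 100 respondents
substitution_share = sum(responses[k] for k in out_of_city) / total
print(f"substitution share = {substitution_share:.0%}")          # 55%: "Solid" band
```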

03: Commercial

Adjacent Business Impact

Old Rail Station tenant transaction volume is measured on activation nights and compared against four baseline nights (same day of week, within the preceding 60 days, excluding holidays and private events). This controls for day-of-week variability and seasonal trends, and produces the Adjacent Business Lift figure for each activation.
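A minimal sketch of the matched-night comparison, assuming hypothetical tenant revenue figures; only the method (activation-night volume against the mean of four same-day-of-week baseline nights) comes from the text:

```python
# Illustrative Adjacent Business Lift calculation. Transaction figures are
# hypothetical; the baseline is the mean of four matched nights (same day
# of week, within the preceding 60 days, excluding holidays and private
# events), per the measurement description.

from statistics import mean

baseline_nights = [4200.0, 3950.0, 4100.0, 4350.0]  # hypothetical tenant revenue
activation_night = 5125.0                           # hypothetical activation-night revenue

baseline = mean(baseline_nights)                    # matched-night baseline: $4,150
lift = (activation_night - baseline) / baseline     # Adjacent Business Lift

print(f"baseline=${baseline:,.0f}, lift={lift:.1%}")  # 23.5% lift: "Solid" band (15-24%)
```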

04: Longitudinal

Repeat Attendance

Attendance data is cross-referenced across multiple activations of the same program. Repeat attendance distinguishes programs generating temporary interest from programs generating sustained behavioral change in the entertainment habits of Duncanville households.

Cultural Investment Index

How programs are scored.

The Cultural Investment Index produces a composite score on a 0–100 scale by weighting five performance factors. Each factor is scored independently on a 0–100 scale using standardized rubrics, then multiplied by its assigned weight. The weighted scores are summed to produce the CII. The algorithm applies identically across all discipline categories. Financial validation and geographic targeting account for 55% of the composite score, reflecting the strategy’s core purpose: demonstrating that Duncanville residents are choosing local programming over out-of-city alternatives (CIS v2.0, Section 6).

Cultural Investment Index Formula
CII  =  (Pre-Commitment × 0.30)  +  (Resident Share × 0.25)  +  (Substitution × 0.20)  +  (Repeat × 0.15)  +  (Adjacent Lift × 0.10)
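A minimal sketch of the composite computation, using the published weights; the factor scores in the example are hypothetical:

```python
# CII composite: five factor scores (each 0-100, from the standardized
# rubrics) combined with the published weights.

WEIGHTS = {
    "pre_commitment": 0.30,
    "resident_share": 0.25,
    "substitution":   0.20,
    "repeat":         0.15,
    "adjacent_lift":  0.10,
}

def cii(scores: dict) -> float:
    """Weighted sum of factor scores, producing a 0-100 composite."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(scores[f] * w for f, w in WEIGHTS.items())

example = {"pre_commitment": 85, "resident_share": 72, "substitution": 65,
           "repeat": 50, "adjacent_lift": 40}  # hypothetical activation
print(f"CII = {cii(example):.1f}")             # CII = 68.0
```

A score of 68.0 would fall in the 50–69 Targeted Development band rather than the 70+ graduation band, even though its highest-weighted factor scored 85; the weighting rewards balanced performance across all five factors.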
Pre-Commitment Achievement
Data: Ticketing platform; sponsor documentation; vendor deposits
30%
Financial validation is the gateway to activation. The highest-weighted factor reflects the Foundation’s core principle: demand must be demonstrated before resources are deployed.
80–100
Strong
100% commitment reached before the original campaign deadline. Multiple revenue sources. No deadline extensions. Commitment velocity accelerated through the campaign.
60–79
Solid
100% commitment reached within one deadline extension (up to 14 days). Revenue concentrated in two or more sources. Steady commitment velocity.
40–59
Developing
100% commitment reached after multiple extensions or significant last-stage effort. Revenue concentrated in a single source. Required substantial Foundation staff intervention.
0–39
Weak
Program did not achieve 100% pre-commitment. Scored only for programs that entered validation but did not activate. Data captured for proposer feedback.
Duncanville Resident Share
Data: ZIP code from substitution survey Q1 and ticketing platform. ZIP codes: 75116 and 75137
25%
The strategy exists to recapture resident entertainment spending. Programs drawing primarily non-resident audiences serve a tourism function, not the retention mission the CIS targets.
80–100
Strong
70% or more of attendees from Duncanville ZIP codes. Strong alignment with the retention mission.
60–79
Solid
50% to 69% from Duncanville ZIP codes. Majority local with meaningful regional attendance.
40–59
Developing
30% to 49% from Duncanville ZIP codes. More regional than local. May indicate tourism-oriented format or insufficient local outreach.
0–39
Weak
Below 30% from Duncanville ZIP codes. Program does not reach the target population at sufficient scale to generate meaningful spending recapture.
Substitution Signal
Data: Substitution survey Q2 responses: “Attended a similar event outside Duncanville” and “Attended a different type of entertainment outside Duncanville”
20%
Resident attendance is necessary. Evidence that those residents would have spent elsewhere is what converts attendance into spending recapture.
80–100
Strong
60% or more of respondents indicate they would have gone elsewhere. Program is directly displacing out-of-city entertainment spending at high rates.
60–79
Solid
45% to 59% would have gone elsewhere. Meaningful substitution with significant redirected spending.
40–59
Developing
30% to 44% would have gone elsewhere. Moderate substitution. A substantial share may represent event creation rather than redirected spending.
0–39
Weak
Below 30% would have gone elsewhere. Program generates attendance but limited evidence of spending redirection.
Repeat Participation
Data: Substitution survey Q5; ticketing platform repeat-purchase data. Scoring begins at the second activation.
15%
Habit formation is the behavioral signal that substitution is becoming a pattern. One-time attendance does not indicate a durable shift in entertainment spending behavior.
80–100
Strong
40% or more of attendees are return visitors (third activation or later). Program is building a habitual audience with strong evidence of durable demand.
60–79
Solid
25% to 39% are return visitors. Meaningful repeat attendance with an expanding base. Habit formation underway.
40–59
Developing
15% to 24% are return visitors. Early repeat signals at the second or third activation. Audience retention requires attention.
0–39
Weak
Below 15% return visitors, or first activation (scored at 50 by default to avoid penalizing new programs). Limited evidence of habit formation.
Adjacent Business Lift
Data: Point-of-sale reports or foot traffic counts from Old Rail Station tenants on activation nights vs. four baseline nights (same day of week, within preceding 60 days)
10%
Secondary economic effects validate that programming generates broader commercial activity. This factor carries the lowest weight because it is influenced by variables outside the proposer’s control.
80–100
Strong
25% or greater lift in adjacent tenant revenue or foot traffic compared to baseline. Activation generates significant secondary economic activity.
60–79
Solid
15% to 24% lift compared to baseline. Measurable commercial benefit to adjacent tenants.
40–59
Developing
5% to 14% lift compared to baseline. Modest impact. Adjacent businesses experience some benefit but activation does not meaningfully change tenant performance.
0–39
Weak
Below 5% lift, no measurable change, or decline in adjacent business activity. Activation does not generate secondary commercial benefit at the venue.
Graduation Thresholds
Score · Classification · Action
70–100
Graduation Candidate
Prioritized for permanent placement across three pathways: Arts Junction Residency, Duncanville Placement, or Independent Establishment.
50–69
Targeted Development
Program shows potential. Receives targeted support to strengthen weak factors. May reattempt activation.
Below 50
Redesign or Sunset
Fundamental concept reassessment required before pipeline re-entry.
Multi-Activation Scoring

Graduation decisions are based on performance trends, not single scores.

A single high-scoring activation does not qualify a program for graduation. The Foundation evaluates the trend across activations to assess whether performance is consistent, improving, or declining.

Pattern · CII Scores · Decision
Consistent High
70+ across three or more consecutive activations
Graduation candidate. Program demonstrates sustained demand and substitution at scale sufficient for permanent placement.
Improving Trajectory
Below 70 initially, trending to 70+ by third or fourth activation
Graduation candidate with monitoring. Performance trajectory justifies placement. First-year review required.
Plateau in Development
50–69 across three or more activations with no upward trend
Redesign recommendation. Program has reached a performance ceiling at its current format. Proposer receives detailed factor analysis and redesign consulting.
Declining
Scores decrease across consecutive activations
Sunset recommendation unless the proposer identifies and addresses the cause of decline within one additional activation cycle.
Consistent Low
Below 50 across two or more activations
Sunset. The program does not generate sufficient demand or substitution to justify continued activation at Arts Junction.
The maximum number of activations before a graduation or sunset decision is six. Programs that have not achieved a CII score of 70 or above by their sixth activation are evaluated for redesign or sunset. This threshold prevents pipeline congestion and ensures Arts Junction capacity is available for new proposers (CIS v2.0, Appendix D, Section 5).
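The decision rules in the table above can be sketched as a simple classifier. This is one literal reading of the patterns, with hypothetical score sequences; actual graduation review also weighs factor-level analysis:

```python
# Simplified sketch of the multi-activation decision rules. Pattern tests
# follow the published table; the order of checks is an interpretation.

def decision(scores: list[float]) -> str:
    """Map a sequence of CII scores (oldest first) to a pipeline decision."""
    n = len(scores)
    if n >= 3 and all(s >= 70 for s in scores[-3:]):
        return "Graduation candidate"                       # consistent high
    if n >= 3 and scores[-1] >= 70 and scores[0] < 70:
        return "Graduation candidate with monitoring"       # improving trajectory
    if n >= 2 and all(s < 50 for s in scores[-2:]):
        return "Sunset"                                     # consistent low
    if n >= 2 and all(b < a for a, b in zip(scores, scores[1:])):
        return "Sunset recommendation unless cause addressed"  # declining
    if n >= 3 and all(50 <= s < 70 for s in scores[-3:]):
        return "Redesign recommendation"                    # plateau in development
    if n >= 6 and max(scores) < 70:
        return "Evaluate for redesign or sunset"            # six-activation cap
    return "Continue activations"

print(decision([72, 75, 78]))  # Graduation candidate
print(decision([44, 41]))      # Sunset
print(decision([55, 60, 58]))  # Redesign recommendation
```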
Permanent Placement

Where successful programs land.

Graduation is not a ceremonial conclusion. It is a transition to permanent commercial placement. Programs that demonstrate consistent demand across multiple activations at Arts Junction become candidates for one of three placement pathways. The pathway is determined by program format, audience scale, and the proposer's operational capacity at the time of graduation consideration (CIS v2.0, Section 10).

Pathway 01
Arts Junction Residency
A long-term programming agreement at Arts Junction with a recurring activation schedule. The program retains its venue, its audience, and its operational infrastructure. The Foundation continues data collection and reporting.
Suited for: Programs with demonstrated local audience concentration, consistent CII scores at the Strong or Solid tier, and recurring format (monthly, seasonal, or series).
Pathway 02
Duncanville Placement
Facilitated placement in available commercial property elsewhere in Duncanville. The Foundation uses CII data, specifically audience ZIP code concentration and adjacent business lift, to identify appropriate venues and support the lease negotiation process.
Suited for: Programs that have outgrown Arts Junction capacity or require a format-specific venue. Proposers who are operationally ready to assume lease obligations.
Pathway 03
Independent Establishment
Foundation support for independent venue acquisition or lease negotiation within Duncanville. Validated demand data serves as the core underwriting argument to property owners and lending institutions.
Suited for: Proposers building toward an independent cultural business. CII data provides the demand evidence that speculative lease negotiations typically cannot produce.
CII 70+
Programs with a Cultural Investment Index score of 70 or above across multiple activations are prioritized for graduation. Graduation decisions are based on performance trends across activations, not a single score. The maximum number of activations before a graduation or sunset decision is six (CIS v2.0, Appendix D, Section 5).
The Foundation

Infrastructure precedes programming.

The Duncanville Arts Foundation is a 501(c)(3) nonprofit organization working to build the governance, measurement, and fiscal infrastructure for arts and culture in Duncanville, Texas. The Cultural Investment Strategy is the Foundation's primary work for its first two years, establishing the measurement framework that will determine what sustained cultural investment in Duncanville can realistically support.

Four Operating Functions
01
Fiscal Sponsorship. 501(c)(3) infrastructure for arts projects and emerging organizations before they have independent nonprofit status.
02
Governance Support. Board development frameworks and organizational design resources for cultural organizations at all stages.
03
Strategic Coordination. Alignment between municipal entities, regional institutions, and cultural organizations around shared investment goals.
04
Cultural Measurement. The Cultural Investment Index, which translates cultural activity into the policy language that civic decision-makers and funders use.
Leadership & Governance

Board of Directors

Sarah Macias
Chair
Rhonda Allen
Secretary
Patricia Ebert
Treasurer
Ron Thompson
Founding Executive Director
Ex Officio, Nonvoting Member

Community, business, and cultural leaders providing guidance on arts development initiatives. Advisory Members advance the Foundation's work by offering expertise, fostering partnerships, and supporting the long-term development of cultural infrastructure in Duncanville.

Denise Lee
Founder & Executive Director, Visions for Change and Denise Lee Onstage
Bria Maiden
Supafly Studio
Dr. Anne Perry, PhD
Duncanville Arts Commission (FY 2022–2025)
Tim Perry
Duncanville Arts Commission (FY 2022–2025)
Brad Pritchett
Chief Experience Officer, Dallas Museum of Art · Lifetime Advisory Board, Black Tie Dinner
Robbie Robbins
Painting Prairies: Art & Science · Artist, Texas Master Naturalist
Joanna St. Angelo
Executive Director, Sammons Center for the Arts
Dennis TenWolde
Development Manager, SMU DataArts: The National Center for Arts Research
Samuel Thomas
Co-Founder & Artistic Director, Deep in the Heart Film Festival
Laura K. Wise
Co-Chair, Dallas Museum of Art Junior Associates (FY 2025–2026)
Tiffiney Wyatt, MBA
President & CEO, Corbett Mitchell Media · Duncanville Arts Commission (FY 2022–2025)
Governance Transparency

Resolutions &
Meeting Notices

The Foundation maintains a complete public record of all board resolutions. Each resolution links directly to the governing document. This record reflects the full sequence of organizational actions taken since formation on September 13, 2025.

Request Records

Complete resolution documentation and meeting minutes are available upon request. Contact the Foundation at ron@duncanvillearts.org.

Methods Framework

Measuring What Matters

A Standardizable Behavioral Measurement Instrument for Municipal Cultural Policy Decision-Making

The Cultural Investment Index is not a proprietary scoring system. It is a formal measurement instrument with a published methods framework describing its design principles, validity boundaries, composite indicator construction, and development roadmap. The framework is available for review by municipal partners, academic researchers, and peer organizations.

21
Peer-reviewed citations
11
Sections & appendices
40
Years of critique addressed
Read the Methods Framework →
F1
Pre-Commitment Achievement
30%
F2
Duncanville Resident Share
25%
F3
Self-Reported Substitution Intent
20%
F4
Repeat Participation
15%
F5
Adjacent Business Lift
10%
Composite Formula
CII = (F1 × 0.30) + (F2 × 0.25) + (F3 × 0.20) + (F4 × 0.15) + (F5 × 0.10)
Contact

Connect with
the Foundation

We welcome inquiries from artists, arts organizations, municipal partners, funders, and community members. Use the contact details to reach us directly, or indicate the nature of your inquiry.

Address
202 West Center Street, Suite 101
Duncanville, TX 75116
Data Sources

Bibliography

All quantitative claims on this website are derived from primary federal data sources, official property records, or internal Foundation documents. Each citation below identifies the specific claim it supports, the source, and the access or publication date. Sources were verified February 2026.

  • 01
    U.S. Bureau of Labor Statistics. Consumer Expenditures—2024. News Release USDL-25-1630. Washington, DC: U.S. Department of Labor, September 2025. Table B: Percent Distribution of Total Annual Expenditures by Major Components for All Consumer Units.
    bls.gov/news.release/cesan.htm
    4.6% entertainment share of total expenditures · $104,207 national average income before taxes · 0.685 income adjustment ratio
  • 02
    U.S. Bureau of Labor Statistics. Consumer Expenditure Surveys: Entertainment Category Definitions and Subcategories. Washington, DC: U.S. Department of Labor. Entertainment subcategories include: fees and admissions; audio and visual equipment and services; pets, toys, hobbies, and playground equipment; and other entertainment supplies, equipment, and services.
    bls.gov/cex · bls.gov/cex/tables.htm
    BLS entertainment category components · Category definition as cited on website
  • 03
    U.S. Census Bureau. American Community Survey 5-Year Estimates, 2019–2023: Selected Social, Economic, and Housing Characteristics—Duncanville City, Texas. Tables DP02, DP03, DP04, DP05. Washington, DC: U.S. Department of Commerce, 2024.
    data.census.gov
    13,385 Duncanville households · $71,381 median household income · ZIP codes 75116 and 75137
  • 04
    Franchise Real Estate Group. 202 W. Center Street, Duncanville, TX 75116—Old Rail Station. Commercial Lease Listing No. 35042319. LoopNet, Inc., 2025. Property description documents 3.35-acre lot, 4,000 sq ft Building 1 with full-service kitchen, and 8,400 sq ft Building 2.
    loopnet.com/Listing/202-W-Center-St-Duncanville-TX/35042319
    3.35-acre campus · ~4,000 sq ft event space with full-service kitchen · 8,400 sq ft flexible retail and office space
  • 05
    Thompson, Ron. Cultural Investment Strategy, Version 2.0. Duncanville Arts Foundation, May 2026. Internal governance document. Establishes the five-stage pipeline, five-factor CII scoring algorithm, 100% pre-commitment threshold, graduation scoring thresholds (70+, 50–69, below 50), and Year One program count of 16 activations across 5 disciplines.
    CII five-factor scoring weights · Graduation threshold: 70+ · 100% pre-commitment requirement · 16 activations, 5 disciplines, Year One · Five-stage pipeline model · 55% weight on financial validation and geographic targeting
  • 06
    Derived Calculation: Duncanville Entertainment Spending Baseline. Applied methodology: ACS 2019–2023 household count (13,385) × income-adjusted per-household entertainment expenditure. Income adjustment ratio: $71,381 (local median income) ÷ $104,207 (national average income, BLS CEX 2024) = 0.685. Per-household entertainment spending: national per-household total expenditures $78,535 × 0.685 = $53,796; $53,796 × 4.6% = $2,475. Total pool: 13,385 × $2,475 = $33,127,875. Strategy rounds down to $31M (6.4% conservative buffer).
    $31M entertainment spending pool · $33,127,875 unrounded total · 6.4% conservative buffer · $2,475 per-household annual entertainment spending
  • 07
    Campbell, Donald T., and Julian C. Stanley. Experimental and Quasi-Experimental Designs for Research. Chicago: Rand McNally, 1963. Foundational taxonomy of threats to internal validity in field research settings. Defines instrumentation threat as changes in observers, scorers, or measurement instruments that produce changes in outcomes independent of the variable under study.
    Instrumentation threat to internal validity · Execution consistency as methodology decision
  • 08
    Shadish, William R., Thomas D. Cook, and Donald T. Campbell. Experimental and Quasi-Experimental Designs for Generalized Causal Inference. Boston: Houghton Mifflin, 2002. Authoritative update to the Campbell and Stanley framework. Expands the instrumentation threat definition and its application to quasi-experimental designs, including time-series and multi-site field research of the type the CIS employs.
    Instrumentation threat to internal validity · Execution consistency as methodology decision
Methods Framework  |  Working Paper  |  February 2026

Measuring What Matters

A Standardizable Behavioral Measurement Instrument for Municipal Cultural Policy Decision-Making


Author Ron Thompson
Role Founding Executive Director
Organization Duncanville Arts Foundation
CIS Version 2.0 — Effective May 1, 2026

This paper introduces and formalizes a municipal-scale measurement instrument designed to generate structured behavioral data for local cultural policy decision-making. The instrument standardizes observable activation-level indicators within defined validity boundaries.

The Cultural Investment Index described herein is a management instrument under active development. The validity and reliability work required to advance it toward research instrument status is described explicitly in Section VIII. Claims throughout this paper are calibrated to the instrument's current development stage.

Abstract

A sustained methodological critique spanning nearly four decades has established that the aggregate economic impact study is structurally ill-suited to answer the question municipal decision-makers most need answered: does this programming, in this community, retain spending that would otherwise leave, or does it redirect spending that would have remained locally regardless? Seaman (1987), Crompton (1995, 2006), and Sterngold (2004) identify the omission of substitution effects as the central structural error. McCarthy et al. (2004) identify the conflation of correlation with causation in claims about arts benefits. National studies have not resolved these critiques across successive iterations.

This paper introduces the Cultural Investment Strategy (CIS) and its Cultural Investment Index (CII) as a proposed institutional response to this measurement gap. The CIS is a municipal instrumentation system: a standardizable data collection framework that generates behavioral records at the activation level, within defined geographic and operational boundaries, using consistent measurement procedures across all programs.

Three scope conditions govern all claims in this paper. First, the CII is a management instrument in active development; the psychometric validation roadmap is described in Section VIII. Second, Factor 3 captures self-reported counterfactual behavioral intent collected at point of activation through survey response. Third, the adjacent business lift comparison is a matched-night analysis with a limited baseline. These scope conditions define the instrument's current claims boundary. Findings from Year One activation records will inform instrumentation refinement and calibration. Causal findings require a dataset of sufficient depth; the instrument is in its first operational year.

Keywords: municipal arts instrumentation, composite indicator design, cultural policy measurement, behavioral data architecture, demand validation, activation-level evidence, methods framework

Introduction: The Municipal Measurement Gap

Municipal governments invest in arts and cultural programming through direct appropriation, facility provision, and economic development strategy. These investments are typically justified through one of two mechanisms: aggregate economic impact studies, which estimate total economic activity attributable to the nonprofit arts sector at regional or national scale; or output reporting, which documents the number of events, attendees, or programming hours delivered.

Neither mechanism answers the question most relevant to municipal policy: does this specific programming, in this specific community, generate economic activity that would not otherwise occur? Aggregate studies operate at scales too broad to isolate municipal effects. Output reporting documents activity without measuring consequence. The methodological critique of aggregate economic impact methodology, advanced over four decades by Seaman (1987, 2011), Crompton (1995, 2006), Sterngold (2004), and others, has established that the dominant approach systematically overclaims through the omission of substitution effects, the misapplication of multiplier coefficients, and the conflation of correlation with causation.

This paper is an institutional response to that measurement gap, complementary to the national evidence base that serves important purposes at the scale for which it was designed. The missing layer is a standardizable, replicable instrumentation architecture that generates activation-level behavioral records at the municipal scale, within defined geographic and operational boundaries.

The Cultural Investment Strategy and Cultural Investment Index described here are proposed as a contribution to that lower layer. The paper describes the instrument's design principles, measurement components, validity boundaries, governance translation protocols, and development roadmap. Year One records will inform instrumentation refinement and calibration. Causal findings require a dataset of sufficient depth; the instrument is in its first operational year.

The Existing Evidence Base: National Breadth, Local Absence

National Research Frameworks

The dominant tradition of arts impact measurement in the United States is the economic impact study. Beginning with the Johns Hopkins University model developed by Cwi and Lyall (1977), arts organizations and their advocacy partners have used input-output analysis and audience expenditure surveys to estimate the broader economic contributions of cultural programming. Americans for the Arts has institutionalized this approach through its Arts and Economic Prosperity series, which in its sixth iteration surveyed 373 communities across all 50 states and Puerto Rico, documenting $151.7 billion in economic activity by nonprofit arts and culture organizations and their audiences in 2022 (Americans for the Arts, 2023). This figure measures reported economic activity associated with the sector, as documented through audience expenditure surveys.

Parallel to the economic impact tradition, RAND Corporation research has developed a framework centered on the intrinsic and instrumental benefits of arts engagement. McCarthy et al. (2004) proposed a two-axis framework distinguishing private from public benefits and intrinsic from instrumental benefits, designed to reorient arts policy advocacy toward a more comprehensive account of how the arts create value. The Urban Institute has contributed a community indicators approach incorporating arts and culture measures into broader quality-of-life frameworks (Urban Institute, 2025). The Wallace Foundation has funded audience development research on the conditions under which sustained participation develops (Wallace Foundation, 2010).

The Local Data Gap

Despite the breadth of national research, a pronounced gap exists at the municipal scale. Bloomberg Associates documented this gap in a 2021 analysis of arts data practices in fifteen cities: there is a recognized lack of standardized, locally specific data, and both the creative sector and municipal governments frequently express skepticism about whether such data can be reliably generated (Bloomberg Associates, 2021). Grantmakers in the Arts found that local arts agencies operate without a standard data collection process nationally (Grantmakers in the Arts, 2020).

Walters and Chandler (2019), in a framework paper published in Local Government Studies, identify the methodological dimension of this gap: prevailing municipal measurement frameworks take an outputs-based approach, evaluating the performance of one-off events through simple metrics, without measuring behavioral outcomes over the life of a strategy. This distinction between output measurement and outcome measurement is foundational to the instrument described in this paper.

Duncanville, Texas illustrates the local data gap. The municipality's approximately 40,000 residents generate a synthetic annual household entertainment expenditure estimate of approximately $31 million, derived from Bureau of Labor Statistics Consumer Expenditure Survey averages applied to American Community Survey household counts (BLS, 2024; U.S. Census Bureau, 2023). The figure is methodologically bounded: BLS entertainment category definitions are broader than arts and cultural spending, and the calculation applies national averages to local household counts rather than measuring observed local flows. Municipal entertainment capacity and cultural spending capacity are distinct concepts. The figure serves to contextualize the scale of the instrumentation opportunity, with those boundaries explicitly acknowledged.
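The buffer arithmetic behind the approximately $31 million figure can be sketched in a few lines. The unrounded synthetic estimate ($33,127,875) and the 6.4% conservative buffer are taken from the strategy's published derivation; the function itself is an illustrative reconstruction, not the Foundation's actual calculation.

```python
# Sketch of the conservative-baseline arithmetic behind the ~$31 million figure.
# The unrounded estimate and buffer are documented values; everything else here
# is an illustrative reconstruction.

UNROUNDED_ESTIMATE = 33_127_875   # BLS CEX averages applied to ACS household counts
CONSERVATIVE_BUFFER = 0.064       # 6.4% rounding-down buffer

def conservative_baseline(unrounded: float, buffer: float) -> float:
    """Apply the conservative buffer to the unrounded synthetic estimate."""
    return unrounded * (1.0 - buffer)

baseline = conservative_baseline(UNROUNDED_ESTIMATE, CONSERVATIVE_BUFFER)
print(f"${baseline:,.0f}")  # $31,007,691, i.e. approximately $31 million
```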

The Methodological Critique: Substitution, Correlation, and the Limits of Aggregate Analysis

The Substitution Problem

The critique of arts and events economic impact studies has a substantial lineage. Seaman (1987) characterized such studies as a "fashionable excess," identifying three fundamental demand-based analytical errors: failure to subtract local sources of spending and non-local uses of spending, erroneous attribution of all ancillary spending as causally related to the subject organization, and failure to adapt multipliers to specific regions. Seaman (2011) identifies these errors as still prevalent in contemporary economic impact methodology.

Crompton (1995) formalized eleven sources of misapplication in economic impact analyses for events and facilities. The most consequential for arts measurement are: the inclusion of local spectators whose spending represents reallocation rather than injection; the failure to exclude time-switchers and casuals whose spending would have occurred regardless; the omission of opportunity costs; and the confusion of total with marginal benefits. Crompton (2006) characterized the incentive structure producing these errors: most economic impact studies are commissioned to legitimize a predetermined position rather than to search for economic truth.

Sterngold (2004) applied this critique specifically to arts economic impact methodology in the Journal of Arts Management, Law, and Society. His central argument is that conventional analyses treat all arts-related spending as additive without accounting for substitution: spending that would have occurred regardless of the arts programming, directed toward other leisure or recreational activities.

Economic impact analyses that use only gross measures of impact fail to provide any evidence to support their claims because the studies overlook the substitution effects of spending by nonprofit arts and cultural organizations.

Sterngold, 2004, p. 169

The Americans for the Arts AEP series has not substantively modified its methodology in response to these critiques across six iterations.

The Correlation Problem

McCarthy et al. (2004) in Gifts of the Muse review the literature on instrumental benefits and find that many studies demonstrate correlation rather than causality. Observed associations between arts participation and educational, economic, or social outcomes frequently fail to rule out alternative explanations: higher household income, greater parental involvement, better-funded school districts. Relying exclusively on instrumental benefit arguments carries a further risk. If a programmatic investment is justified on the promise of improving urban economic outcomes, but non-arts interventions prove more cost-effective at achieving those same objectives, the arts face defunding pressure on the terms of their own advocacy.

Implications for Instrument Design

Together, these critiques define what a credible municipal-scale instrument must accomplish. It must collect behavioral data on what attendees would have done in the absence of programming, directly addressing the substitution question. It must document geographic origin of participants to establish whether any spending retention claim is plausible. It must generate activation-level records that accumulate into a longitudinal dataset suitable for trajectory analysis. And it must operate through a standardized protocol that makes records from different activations methodologically comparable. These four requirements are the design criteria for the CIS.

Instrument Design Criteria

1. Behavioral Origin

Evidence must be grounded in what people actually did or reported intending. The instrument collects two categories of behavioral data: pre-commitment records reflecting financial acts before activation (ticket purchases, sponsor deposits), and post-activation survey responses capturing self-reported counterfactual behavioral intent. Pre-commitment is observed behavior. Survey responses are self-report.

2. Geographic Specificity

Spending retention is a municipal concept; evidence for it requires a resident-majority audience. Geographic origin data establishes whether any retention claim is plausible. For Duncanville, the target geography is defined by ZIP codes 75116, 75137, and 75138.

3. Activation-Level Granularity

Aggregate studies operate at scales that obscure municipal-level variation across programs, geographies, and conditions. The CIS generates records at the level of individual program activations, with consistent data fields across all activations, enabling trajectory analysis and cross-program comparison. This is the outcomes-based approach Walters and Chandler (2019) identify as the necessary alternative to output measurement.

4. Methodological Standardization

For activation-level records to be comparable, data collection must be consistent. If measurement quality varies by individual data collector, or if procedures drift between activations, records cannot be placed in a shared dataset. Standardization of timing, method, and procedure across all activations is therefore a prerequisite for any subsequent comparative analysis. This is an instrumentation problem in the quasi-experimental sense used by Campbell and Stanley (1963).

The Cultural Investment Strategy as a Municipal Measurement Instrument

Instrument Identity

The Cultural Investment Strategy is a data collection architecture. Its organizing logic is demand validation: programs do not proceed to public activation without demonstrated prior commitment from the intended audience. This logic serves two functions simultaneously. As operational risk management, it prevents investment in programming without evidence of demand. As an instrumentation choice, it generates pre-activation behavioral data that few arts measurement frameworks collect.

The three-stage model — Validate, Activate, Determine — is a decision architecture. Each stage generates structured data. The Validate stage records demand characteristics before any programming expenditure is committed. The Activate stage records attendance, participation patterns, and adjacent economic activity during programming. The Determine stage produces a composite score and a governance determination based on accumulated data. These stages are data phases generating structured records.

The Pre-Commitment Threshold

No program proceeds to Activate without achieving 100% pre-commitment of projected costs through ticket sales, sponsor commitments, or vendor deposits. This threshold functions as a revealed preference gate: participants who purchase tickets before an event exists are demonstrating demand through a financially consequential act grounded in revealed preference prior to activation.

Scope Condition: Endogenous Sample

The 100% pre-commitment requirement defines a specific population of activations. Programs measured by the CIS are those that generate advance financial commitment from a sufficient audience within the timeframe of a validation campaign. This creates an endogenous sample by design.

Several implications follow. First, the CIS measures demand-validated activations specifically. This pipeline is designed for programming that generates advance financial commitment within a validation campaign window. Second, Validate failures constitute a distinct analytic category with their own tracked data fields: genre, demographic target, price point, campaign duration, pre-commitment percentage achieved, and reason for non-advancement. Third, findings from CIS activations apply to the class of demand-validated activations that passed this gate. The scope condition must be stated explicitly in any reporting that uses CIS data.

Controlled Venue Operation

Year One programming operates exclusively at Arts Junction at Old Rail Station in Duncanville. Holding the venue constant across activations reduces environmental confounds: variation in CII scores across activations is more attributable to program-specific factors than to differences in venue accessibility, parking, or neighborhood conditions. The adjacent business lift comparison, which requires a stable baseline business environment, is only feasible with consistent venue operation.

Composite Indicator Design: The Cultural Investment Index

Overview

The Cultural Investment Index is a weighted additive composite scoring instrument that produces a single numeric score for each activation on a scale of 0 to 100. It is constructed from five factors, each scored on a four-tier rubric with defined criteria. The OECD/JRC Handbook on Constructing Composite Indicators (Nardo et al., 2008) provides the methodological standard against which the CII's design choices and required development work are assessed throughout this paper.

CII = (F1 × 0.30) + (F2 × 0.25) + (F3 × 0.20) + (F4 × 0.15) + (F5 × 0.10)

F1 (weight 0.30, governance parameter): Pre-Commitment Achievement
    Sources: ticketing platform records, sponsor documentation, vendor deposits
F2 (weight 0.25, governance parameter): Duncanville Resident Share
    Sources: ZIP codes 75116, 75137, 75138 from survey Q1 and ticketing records
F3 (weight 0.20, governance parameter): Self-Reported Substitution Intent
    Sources: survey Q2 and Q3, counterfactual behavioral intent at point of activation
F4 (weight 0.15, governance parameter): Repeat Participation
    Sources: survey Q5 and ticketing platform repeat-purchase data
F5 (weight 0.10, governance parameter): Adjacent Business Lift (contextual)
    Sources: tenant point-of-sale vs. matched-night baseline
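As a minimal sketch, the aggregation can be expressed directly. The weights are the CIS v2.0 governance parameters; the example factor scores are hypothetical.

```python
# Weighted additive aggregation for the CII (CIS v2.0 weights).
# Factor scores are assumed to arrive on a 0-100 rubric scale.

CII_WEIGHTS = {"F1": 0.30, "F2": 0.25, "F3": 0.20, "F4": 0.15, "F5": 0.10}

def cii_score(factors: dict) -> float:
    """Composite CII score on 0-100 from five factor scores."""
    assert abs(sum(CII_WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 1
    return sum(CII_WEIGHTS[name] * factors[name] for name in CII_WEIGHTS)

# Hypothetical activation record:
example = {"F1": 85, "F2": 70, "F3": 55, "F4": 50, "F5": 40}
print(round(cii_score(example), 1))  # 65.5
```

Because the aggregation is compensatory, factor-level scores should always accompany the composite in any research reporting.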
Weighting Rationale: Governance Parameters

The five weights are governance parameters established through the Foundation's judgment about the relative evidential strength of each factor. Factor 1 carries the greatest weight because pre-commitment is an observed behavioral act, the strongest evidence type in this instrument. Factor 3 carries a reduced weight relative to Factor 1 because survey-reported counterfactual intent is weaker evidence than observed financial commitment. Factor 5 carries the lowest weight because the matched-night comparison is a contextual signal with restricted inferential scope.

The weights are subject to a pre-specified sensitivity protocol described in Section VIII. Pre-specification of this protocol, before Year One data is analyzed, protects against post-hoc weight adjustment in response to data. If sensitivity analysis reveals that program rank-orders are unstable under plausible alternative weight sets, determinations from that period will be reported as provisional pending weight revision.

Additive Aggregation and Compensability

Additive weighted aggregation allows strong performance on one factor to offset weak performance on another. This is a deliberate design choice for an operational management instrument: a program with unusually high pre-commitment but modest repeat attendance is different from a program with low pre-commitment and high repeat attendance, yet both may achieve the same composite score. The CII produces a single decision-relevant number for governance purposes. For research purposes, factor-level scores must be reported alongside the composite to preserve analytical granularity. Compensatory aggregation is a known limitation of additive composite indices; Nardo et al. (2008) discuss this explicitly. The choice is accepted here on operational grounds and flagged as a value judgment requiring explicit defense.

Measurement Components

F1: Pre-Commitment Achievement

Pre-commitment measures revealed advance demand within bounded venue conditions. A resident who purchases a ticket before an event exists is performing a financially consequential act grounded in revealed preference prior to activation. F1 carries the highest weight in the CII because it is the strongest evidence type available. Two scope qualifications govern its interpretation. First, it measures demand for the bounded class of demand-validated activations specifically. Second, advance purchase rates vary by genre, price point, and marketing reach; the CII tracks these factors but does not yet normalize for them across activations. Cross-program comparison of F1 scores therefore requires attention to these structural differences.

F2: Duncanville Resident Share

F2 addresses geographic specificity through ZIP code data collected via survey Q1 and ticketing records. Spending retention evidence requires a resident-majority audience; F2 identifies whether that condition is met. The target ZIP codes (75116, 75137, 75138) are an administrative proxy for municipal residency. ZIP code boundaries approximate municipal boundaries and diverge at the margins. Any research application of F2 data should document this approximation.
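The resident-share computation under this administrative proxy is straightforward. The ZIP codes are those named above; the attendee records are hypothetical.

```python
# Sketch of the F2 resident-share computation from attendee ZIP codes.
# The target ZIP list is from the paper; the attendee data is hypothetical.

TARGET_ZIPS = {"75116", "75137", "75138"}  # administrative proxy for Duncanville residency

def resident_share(attendee_zips: list) -> float:
    """Fraction of attendees whose reported ZIP falls in the target geography."""
    if not attendee_zips:
        return 0.0
    local = sum(1 for z in attendee_zips if z in TARGET_ZIPS)
    return local / len(attendee_zips)

zips = ["75116", "75137", "75052", "75116", "75211"]  # hypothetical survey Q1 responses
print(resident_share(zips))  # 0.6
```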

F3: Self-Reported Counterfactual Behavioral Intent

F3 asks attendees what they would have done with their time and money if this activation had not been available. Survey questions Q2 and Q3 present response options including staying home in Duncanville, spending locally at other businesses, traveling to a regional venue outside Duncanville, and not spending entertainment funds at all. Responses indicating out-of-city alternatives constitute self-reported substitution intent.

F3: Evidence Boundary

F3 captures self-reported counterfactual behavioral intent, collected at point of activation through survey response. It is subject to four known measurement vulnerabilities: (a) social desirability bias, where respondents over-report out-of-city alternatives to appear culturally engaged or to support the Foundation; (b) demand characteristics, where the survey context telegraphs an expected answer; (c) recall confabulation, where respondents rationalize decisions into coherent narratives after the fact; and (d) response-rate selection bias, where highly engaged supporters are more likely to respond.

Partial mitigations within the current design include neutral question wording, response options that normalize local alternatives, and confidence-level thresholds that exclude activation records with sub-20% survey response rates. These mitigations reduce the identified vulnerabilities; full elimination requires additional instrument development beyond the current design.

F3 requires corroboration from F5 before any retention claim is advanced in external communications.

F4: Repeat Participation

F4 serves as a habit formation proxy. Single-activation attendance may reflect novelty-seeking; repeat attendance indicates the formation of sustained local arts consumption patterns, which McCarthy et al. (2004) identify as a prerequisite for both instrumental and intrinsic benefit realization. A default score of 50 is applied to first activations where no prior participation data exists. This default is a governance parameter that compresses variance in early-activation comparisons. Any longitudinal analysis of CII trajectories that includes first activations must flag the default as an artifact. The six-activation maximum creates administrative censoring of program trajectories; programs that sunset before reaching the maximum generate truncated records that must be treated analytically as right-censored observations.

F5: Adjacent Business Lift (Contextual Operational Indicator)

F5 uses point-of-sale data from Arts Junction tenants compared against a matched-night baseline from non-activation nights at the same venue. Matching criteria: same day of week, same seasonality window (within four weeks), excluding holidays and documented competing events. F5 is a contextual operational indicator providing corroborating signal. Its inferential scope is corroboration only.

The matched-night comparison controls for day of week and seasonality. Concurrent events, weather extremes, promotional activity by tenants, and pay cycle effects are tracked in a confound log rather than controlled statistically. Its evidential role is corroboration: when F5 and F3 point in consistent directions, the combined signal warrants greater confidence than either alone. When they diverge, the discrepancy should be investigated before any retention claim is made. The weight of 0.10 reflects this restricted epistemic role.
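The matching rule described above (same day of week, within a four-week seasonality window, excluding holidays and documented competing events) can be sketched as a filter over candidate nights. The record fields and dates below are hypothetical.

```python
# Sketch of the F5 matched-night selection rule: same day of week, within a
# four-week window, excluding holidays and nights with competing events.
# All dates here are hypothetical illustrations.
from datetime import date, timedelta

def matched_nights(activation, candidates, excluded):
    """Return candidate non-activation nights satisfying the matching criteria."""
    window = timedelta(weeks=4)
    return [n for n in candidates
            if n.weekday() == activation.weekday()   # same day of week
            and abs(n - activation) <= window        # seasonality window
            and n not in excluded]                   # holidays / competing events

act = date(2026, 6, 5)  # a hypothetical Friday activation night
pool = [date(2026, 5, 15), date(2026, 5, 22), date(2026, 5, 25), date(2026, 6, 19)]
print(matched_nights(act, pool, excluded={date(2026, 5, 22)}))
```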

Threats to Validity, Reliability Protocol, and Required Development Work

A measurement instrument is evaluated by the threats its design creates and the mitigations it provides. This section follows the framework of Campbell and Stanley (1963) and Shadish, Cook, and Campbell (2002) in analyzing threats systematically, and references Nardo et al. (2008) on composite indicator validation standards.

Internal Validity Threats
  • Selection bias through pre-commitment gating. The 100% pre-commitment requirement systematically includes programs that generate advance demand and excludes programs that do not. CIS data describes a specific population of activations; causal inference applies only to that population.
  • Response bias in F3. Self-report survey data in a civic-nonprofit context is systematically susceptible to social desirability and demand characteristics effects. F3 scores should be treated as upper-bound estimates of substitution intent.
  • Administrative truncation. The six-activation maximum creates survivor bias in trajectory analysis. Programs that exit before the maximum generate incomplete records; analyses must account for this administratively imposed censoring.
  • Confounds in F5. The matched-night comparison controls for day of week and seasonality but tracks other confounds through a confound log rather than controlling statistically.
External Validity Limits
  • Venue dependence. All Year One activations occur at Arts Junction. Findings are specific to that venue configuration, neighborhood, and market context. Replication at other venues requires independent validation.
  • Scale constraints. CIS is designed for a municipality of approximately 40,000 residents with no existing arts infrastructure. Application in larger or more arts-saturated municipalities requires design modification.
  • Market-type dependence. The instrument measures demand for programming that can generate advance ticket sales within a validation campaign window. Programming requiring post-activation audience development is outside the measurement boundary.
Reliability Protocol: Required Development Work
Inter-Rater Reliability

The CII requires program managers to apply rubric criteria to observed data and assign factor scores. Before CII records from different activations are placed in a shared dataset for comparative analysis, a reliability study is required. Protocol: two independent scorers assess the same activation records and calculate inter-rater agreement using Cohen's kappa. Target threshold: kappa above 0.70. A kappa below 0.60 triggers rubric revision. Training materials and anchor examples will be developed from the first cohort of activations.
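The agreement statistic named in the protocol can be sketched directly. The tier labels and ratings below are hypothetical; because the rubric's tiers are ordinal, a weighted kappa may ultimately be preferable to the plain version shown here.

```python
# Minimal two-rater Cohen's kappa over rubric tier labels.
# Tier labels and ratings below are hypothetical illustrations.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / (n * n)
    return (observed - expected) / (1 - expected)

a = ["T1", "T2", "T2", "T3", "T4", "T2"]  # scorer 1 (hypothetical)
b = ["T1", "T2", "T3", "T3", "T4", "T2"]  # scorer 2 (hypothetical)
print(round(cohens_kappa(a, b), 2))  # 0.77, above the 0.70 target threshold
```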

Weight Sensitivity Analysis

The pre-specified sensitivity protocol to be executed before Year One data is published includes three alternative weight configurations representing plausible governance priorities, rank-order stability assessed using Kendall's tau-b, and a reporting rule that any activation receiving a different determination under an alternative weight set is flagged as a borderline case requiring additional evidence before the determination is applied.

Configuration        F1    F2    F3    F4    F5    Emphasis
Baseline (CIS v2.0)  0.30  0.25  0.20  0.15  0.10  Pre-commitment
Alternative A        0.25  0.20  0.30  0.15  0.10  Substitution intent
Alternative B        0.25  0.35  0.20  0.10  0.10  Geographic specificity
Alternative C        0.20  0.20  0.20  0.20  0.20  Equal weighting
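The stability check can be sketched as follows: score a set of activations under the baseline and one alternative configuration, then compare the two score vectors with Kendall's tau-b. The weight vectors are from the table above; the factor scores are hypothetical, and a real analysis would likely use a library implementation such as scipy.stats.kendalltau.

```python
# Rank-order stability sketch: CII scores under two weight configurations,
# compared with a hand-rolled Kendall's tau-b. Factor scores are hypothetical.

BASELINE = (0.30, 0.25, 0.20, 0.15, 0.10)
ALT_A    = (0.25, 0.20, 0.30, 0.15, 0.10)  # substitution-intent emphasis

def cii(factors, weights):
    return sum(f * w for f, w in zip(factors, weights))

def kendall_tau_b(x, y):
    """Kendall's tau-b with tie correction (O(n^2), fine at this scale)."""
    conc = disc = tx = ty = 0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            dx, dy = x[i] - x[j], y[i] - y[j]
            if dx == 0 and dy == 0:
                continue
            if dx == 0:
                tx += 1
            elif dy == 0:
                ty += 1
            elif dx * dy > 0:
                conc += 1
            else:
                disc += 1
    denom = ((conc + disc + tx) * (conc + disc + ty)) ** 0.5
    return (conc - disc) / denom

activations = [(85, 70, 55, 50, 40), (60, 80, 70, 50, 55), (90, 50, 40, 65, 30)]
base_scores = [cii(a, BASELINE) for a in activations]
alt_scores  = [cii(a, ALT_A) for a in activations]
print(round(kendall_tau_b(base_scores, alt_scores), 3))  # 0.333
```

In this illustration the top two activations swap rank under Alternative A, exactly the instability the reporting rule would flag as a borderline case.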
Construct Validity Assessment

As multiple activations accumulate, factor scores will be examined for convergent and discriminant validity using standard correlation analysis. Factors proposed to measure the same underlying construct should show positive correlation; factors measuring distinct constructs should show lower correlation. If factor scores correlate near perfectly, the composite adds no information over any single factor. If they are uncorrelated, the aggregation rationale requires reexamination.
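The convergent/discriminant check reduces to pairwise correlations between factor-score columns. A minimal sketch, with hypothetical data for two factors:

```python
# Pairwise Pearson correlation between factor-score columns, as a sketch of
# the convergent/discriminant validity check. All scores are hypothetical.

def pearson(x, y):
    """Pearson correlation coefficient for two equal-length score columns."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# One list per factor, one entry per activation (hypothetical):
f1 = [85, 60, 90, 72]
f3 = [55, 70, 40, 58]
print(round(pearson(f1, f3), 2))  # -0.93
```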

External Calibration

The graduation threshold of 70 will be calibrated against post-graduation commercial performance data once three or more programs have graduated and operated commercially for at least two activation cycles. The threshold is currently a governance parameter; calibration will determine whether it predicts commercial sustainability at an acceptable rate.

Governance Translation: Thresholds and Determinations

The following thresholds and determinations are governance interpretations of structured indicator data. They are governance interpretations translating composite scores into operational decisions, using boundaries set by the Foundation as an exercise of institutional judgment.

70–100: Graduation Candidate
Program is evaluated for transition to permanent commercial placement. Subject to weight sensitivity review before the determination is applied.

50–69: Development Support
Program continues with additional structured activation cycles. Factor-level analysis identifies specific areas for improvement.

Below 50: Redesign or Sunset
Program enters redesign or sunset review. Campaign findings are documented as evidence of the conditions under which this activation type reached the boundary of demand validation.
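The governance translation itself is a simple threshold mapping; the band boundaries are those above, and the separate weight-sensitivity review step is not modeled here.

```python
# Sketch of the governance translation from CII composite score to
# determination band. Thresholds are the governance parameters stated above.

def determination(cii: float) -> str:
    """Map a 0-100 CII composite score to its governance determination."""
    if cii >= 70:
        return "Graduation Candidate"
    if cii >= 50:
        return "Development Support"
    return "Redesign or Sunset"

print(determination(72.5))  # Graduation Candidate
print(determination(65.5))  # Development Support
print(determination(41.0))  # Redesign or Sunset
```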

Two Potential Research Outcomes

The CIS is designed to generate structured evidence regardless of the direction that evidence takes. This section describes the two potential research outcomes and the policy inferences each supports. Both outcomes are framed as findings the instrument can generate.

Path A
Demand Supports Commercial Sustainability

If programs consistently achieve CII scores at or above 70 across multiple disciplines and activation cycles, and if self-reported substitution data (F3) indicates a meaningful proportion of attendees who reported they would otherwise have spent outside Duncanville, with corroboration from F5, the evidence supports the proposition that structured demand-validated programming of this type can operate sustainably in Duncanville's market. Path A findings would constitute evidence about the conditions under which arts programming achieves commercial independence in a municipality with no prior arts infrastructure.

Path B
Demand is Verified, Market Does Not Clear

If programs demonstrate consistent audience interest, successful pre-commitment validation, and resident participation, but cannot generate sufficient revenue at market-clearing prices to sustain independent operation, the instrument generates a different type of finding. When sustained demand exists alongside commercial pricing shortfall, the following hypotheses should be evaluated before any policy conclusion is drawn.

Path B: Competing Explanations for Market Shortfall

(a) Positive externalities not captured in ticket prices. The social value of local arts programming may exceed the private willingness to pay, a condition associated with merit goods in welfare economics. If this is the operative condition, public subsidy bridges the gap between private and social value.

(b) Coordination failure. Individual residents may value local programming but face collective action problems in organizing demand at a sufficient scale for commercial viability. Subsidy or institutional aggregation may resolve this.

(c) Risk concentration. Arts programming may carry startup risk that private operators are unwilling to absorb without subsidy, even where long-run demand is sufficient. Public subsidy may serve a risk-absorption function rather than a permanent value gap.

(d) Structural product-market misfit. The programming may not be well-matched to the price sensitivity, genre preferences, or scheduling constraints of the local market. The appropriate response would be program redesign.

(e) Awareness gaps. Demand may exist in latent form because potential attendees have limited awareness of available programming. The appropriate response is investment in audience development infrastructure.

CIS data narrows the set of plausible explanations by comparing F1 scores (pre-commitment threshold achievement), F2 and F3 distributions (geographic and counterfactual patterns), and campaign data (pre-commitment campaign completion rates and stage of dropout). Distinguishing among explanations (a) through (e) requires supplementary data beyond the current instrument scope. Path B findings establish the conditions under which further policy inquiry is warranted.

Either outcome constitutes a research contribution. Path A generates evidence of the conditions under which arts programming becomes self-sustaining in a municipality with no prior infrastructure. Path B, disaggregated through competing explanations analysis, generates evidence of the conditions under which demonstrated community demand encounters commercial pricing barriers. Neither finding exists for a municipality of Duncanville's profile in the current literature because the activation-level behavioral data to support it has not been collected.

Conclusion

The measurement problem in municipal arts investment is a framework mismatch: existing tools were designed for national-scale advocacy, and the claims they generate outrun what their underlying data can support at the municipal level. The CIS is a complementary instrumentation layer, operating at the activation level, within a single municipality, under defined validity boundaries.

The Cultural Investment Index is a management instrument pursuing the status of a research instrument. The gap between those two descriptions is closed by the development work described in Section VIII: inter-rater reliability testing, weight sensitivity analysis under pre-specified alternative configurations, factor-level validity assessment as activation data accumulates, and external calibration of graduation thresholds against post-graduation commercial performance. None of this work requires the instrument to be redesigned. It requires the instrument to be operated with discipline and the data it produces to be reported at the level of confidence it actually supports.
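The weight sensitivity analysis named above can be sketched as re-scoring activations under alternative weight sets and comparing the resulting rankings. The factor scores and weight sets below are illustrative placeholders, not the pre-specified configurations:

```python
def cii_score(factors: dict[str, float], weights: dict[str, float]) -> float:
    """Additive weighted aggregation of normalized (0-100) factor scores."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[f] * factors[f] for f in weights)

def rank(scores: list[float]) -> list[int]:
    """1-based ranks, highest score = rank 1."""
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    ranks = [0] * len(scores)
    for r, i in enumerate(order, 1):
        ranks[i] = r
    return ranks

def spearman(r1: list[int], r2: list[int]) -> float:
    """Spearman rank correlation from rank vectors (no ties assumed)."""
    n = len(r1)
    d2 = sum((a - b) ** 2 for a, b in zip(r1, r2))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Illustrative: baseline plus one alternative; the actual protocol specifies three.
activations = [
    {"F1": 90, "F2": 70, "F3": 80, "F4": 60, "F5": 50},
    {"F1": 60, "F2": 85, "F3": 70, "F4": 75, "F5": 65},
    {"F1": 75, "F2": 60, "F3": 95, "F4": 55, "F5": 70},
]
baseline = {"F1": 0.30, "F2": 0.20, "F3": 0.25, "F4": 0.15, "F5": 0.10}
alt      = {"F1": 0.15, "F2": 0.25, "F3": 0.30, "F4": 0.15, "F5": 0.15}

r_base = rank([cii_score(a, baseline) for a in activations])
r_alt  = rank([cii_score(a, alt) for a in activations])
print(spearman(r_base, r_alt))  # -0.5 here: rankings reverse, flagging weight sensitivity
```

A correlation near 1.0 across all pre-specified weight sets would indicate that Graduate, Develop, and Sunset determinations are robust to the governance weights; a low or negative value, as in this toy data, would indicate they are not.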

This paper's contribution is the instrument design, the validity framework, and the explicit separation of governance decisions from research claims. The empirical contribution comes later, when the dataset exists.

The Foundation welcomes methodological review, inter-rater reliability partnership proposals, and positioning within national arts research infrastructure. Correspondence regarding the instrument design or the research roadmap may be directed to the Foundation at 202 W. Center Street, Suite 101, Duncanville, Texas 75116 or [email protected].

Appendices

CIS Evidence Scope and Boundaries

This appendix defines the evidence boundaries of the Cultural Investment Strategy and Cultural Investment Index for Year One data. Each boundary reflects the instrument's current design scope and is reassessed as the validation roadmap in Section VIII progresses.

Evidence Boundaries: Year One Data

1. Spending retention signal. CIS measures self-reported substitution intent at the activation level. CII scores reflect demand quality indicators within a bounded venue and geographic scope.

2. Substitution intent signal. A high F3 score indicates that a substantial proportion of survey respondents reported they would otherwise have spent outside Duncanville. F3 data is survey-reported; corroboration from F5 and repeat F2 patterns strengthens the signal.

3. Resident participation scope. CIS measures resident participation as its primary indicator. Out-of-municipality attendance is tracked as secondary data.

4. Tourism generation. CIS collects no data that would support tourism generation claims.

5. Systemwide ecosystem scope. CIS generates activation-level records for the class of demand-validated activations processed through its pipeline. Sector-wide measurement requires additional frameworks.

6. Instrumental benefit scope. CIS makes no claims about educational, health, or social outcomes beyond the spending retention question.

7. Validation failure scope. A program that does not achieve 100% pre-commitment has reached the boundary of this instrument's measurement design. Validation failure data is tracked separately and analyzed for patterns in genre, price point, and campaign duration.
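The F3 counterfactual boundary (item 2 above) can be sketched as a simple share calculation over survey responses. The response category labels below are hypothetical, not the survey's actual codes:

```python
# Hypothetical F3 counterfactual responses to "what would you have done otherwise?"
responses = [
    "outside_duncanville",  # would have spent on entertainment outside the city
    "outside_duncanville",
    "local_other",          # would have spent locally on something else (reallocation)
    "stayed_home",          # would not have spent at all
    "outside_duncanville",
]

def f3_outflow_share(responses: list[str]) -> float:
    """Share of respondents whose spending would otherwise have left the city.

    Only 'outside_duncanville' counts toward potential retention; 'local_other'
    is reallocation, not retention (Crompton's local-spectator caution).
    """
    if not responses:
        return 0.0
    return responses.count("outside_duncanville") / len(responses)

print(f3_outflow_share(responses))  # 0.6
```

The exclusion of local reallocation from the numerator is the substantive design choice: it keeps the signal aligned with retention rather than gross local activity.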

Crompton's Eleven Sources of Misapplication: CIS Response Mapping

The table below maps Crompton's (1995, 2006) eleven sources of misapplication against the CIS design. Where the CIS addresses a source, the response is described. Where it does not, this is stated.

| Source | Theoretical Description | CIS Response |
| --- | --- | --- |
| Using Sales Multipliers | Favoring gross sales over household income multipliers inflates reported figures. | CIS uses behavioral pre-commitment and substitution data as its measurement base. |
| Misrepresenting Employment | Treating temporary event jobs as full-time equivalents creates false job creation narratives. | CIS scope is limited to resident spending retention. |
| Incremental vs. Normal Coefficients | Using inappropriate marginal coefficients distorts capital flow estimates. | CIS uses activation-level behavioral records as its analytical unit. |
| Defining Impacted Area | Overly broad geographic boundaries obscure the municipal return on investment. | F2 pins geographic scope to three specific ZIP codes. F5 is limited to Arts Junction tenants. |
| Including Local Spectators | Counting resident spending as economic injection conflates reallocation with injection. | F3 asks what attendees would have done otherwise, distinguishing potential retention from local reallocation. |
| Excluding Time-Switchers | Failing to identify visitors who changed trip timing falsely attributes spending to the event. | Pre-commitment gating documents demand before events exist. Survey Q1 captures participant origin. The instrument establishes geographic capture data; causal attribution requires additional analysis. |
| Excluding Casuals | Visitors already in the area for other reasons generate spending attributed to the event. | F2 and F3 capture origin and counterfactual intent data. Causal attribution remains outside the instrument's current scope. |
| Fudged Multipliers | Arbitrarily inflating coefficients renders studies analytically invalid. | CIS uses no multipliers. All factors are derived from observed or reported data with stated limitations. |
| Total vs. Marginal Benefits | Claiming total activity rather than marginal benefit masks the true added value of the subsidy. | Each CII score is specific to one activation. No aggregate sector impact is claimed. |
| Turnover vs. Multiplier | Confusing money velocity with multiplier effects produces artificially large numbers. | CIS reports activation-specific scores without spending velocity aggregation. |
| Omitting Opportunity Costs | Measuring only benefits while ignoring what public funds could have generated elsewhere. | The Research Fork framework (Section X) explicitly frames both Path A and Path B as findings. A full opportunity cost analysis is beyond the current instrument scope. |

OECD/JRC Composite Indicator Construction: Ten-Step Compliance Assessment

The OECD/JRC Handbook (Nardo et al., 2008) prescribes a ten-step methodology for composite indicator construction. The table below maps CII development status against each step.

| # | Phase | Description | CII Status |
| --- | --- | --- | --- |
| 1 | Define Framework | Clearly define objectives and structure dimensions. | Completed. CIS defines the objective as measuring municipal demand retention at the activation level. |
| 2 | Select Indicators | Choose indicators based on relevance, availability, and credibility. | Completed. Five factors are theoretically aligned with the spending retention objective. |
| 3 | Treat Data | Identify outliers and develop missing data rules. | Partial. Confidence level thresholds address low survey response rates. Formal outlier treatment rules and missingness protocol require documentation. |
| 4 | Normalization | Bring all indicators onto a common scale. | Completed. Four-tier rubric translates raw data into a 0–100 scale consistently across all factors. |
| 5 | Weighting | Select weighting methods with justification. | Active vulnerability. Weights are governance parameters. Pre-specified rationale documented in this paper. Robustness testing pre-specified for Year One data. |
| 6 | Aggregation | Determine compensability and aggregation method. | Completed. Additive weighted arithmetic aggregation. Compensatory properties accepted and documented in Section VI. |
| 7 | Coherence | Assess correlations between indicators and dimensions. | Required action. Convergent and discriminant validity testing mandated in Section VIII once sufficient activations are scored. |
| 8 | Sensitivity | Assess impact of methodological uncertainties on rankings. | Required action. Pre-specified sensitivity protocol with three alternative weight sets documented in Section VIII. |
| 9 | Make Sense | Decompose performance to reveal narrative strengths. | Operationalized. Graduate, Develop, and Sunset determinations provide activation-level narrative output. Factor-level decomposition tracked in the CII Data System. |
| 10 | Visualization | Present data clearly without obscuring vital information. | Pending. CII Data System dashboard under development. Factor-level reporting will accompany aggregate scores in all published records. |
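Steps 4 and 6 (normalization via a four-tier rubric, then additive weighted aggregation) can be sketched as follows. The tier cut points, weights, and raw values are placeholders, not the CII's governance-approved parameters:

```python
# Four-tier rubric mapping raw observations onto a 0-100 scale (illustrative values).
TIER_SCORES = {1: 25, 2: 50, 3: 75, 4: 100}

def normalize(raw: float, cut_points: tuple[float, float, float]) -> int:
    """Assign a raw factor observation to a tier score via three ascending cut points."""
    tier = 1 + sum(raw >= c for c in cut_points)
    return TIER_SCORES[tier]

def aggregate(tier_scores: dict[str, int], weights: dict[str, float]) -> float:
    """Additive weighted arithmetic aggregation (compensatory by design)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[f] * tier_scores[f] for f in weights)

# Normalize five hypothetical raw factor observations, then aggregate.
scores = {f: normalize(raw, (0.25, 0.50, 0.75))
          for f, raw in {"F1": 0.9, "F2": 0.6, "F3": 0.4, "F4": 0.7, "F5": 0.2}.items()}
print(scores)  # {'F1': 100, 'F2': 75, 'F3': 50, 'F4': 75, 'F5': 25}
print(aggregate(scores, {"F1": 0.3, "F2": 0.2, "F3": 0.25, "F4": 0.15, "F5": 0.1}))  # 71.25
```

The compensatory property noted in step 6 is visible here: a strong F1 offsets a weak F5 in the aggregate, which is exactly the trade-off Section VI documents and accepts.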

References

  • Americans for the Arts. (2017). Arts and economic prosperity 5: The economic impact of nonprofit arts and culture organizations and their audiences. Washington, DC: Americans for the Arts.
  • Americans for the Arts. (2023). Arts and economic prosperity 6: The economic impact of nonprofit arts and culture organizations and their audiences. Washington, DC: Americans for the Arts.
  • Bloomberg Associates. (2021). Arts data in the public sector: Strategies for local arts agencies. New York: Bloomberg Associates. https://associates.bloomberg.org/arts-data-in-the-public-sector/
  • Bureau of Labor Statistics. (2024). Consumer expenditure survey: Entertainment category components. Washington, DC: U.S. Department of Labor.
  • Campbell, D. T., & Stanley, J. C. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally.
  • Crompton, J. L. (1995). Economic impact analysis of sports facilities and events: Eleven sources of misapplication. Journal of Sport Management, 9(1), 14–35.
  • Crompton, J. L. (2006). Economic impact studies: Instruments for political shenanigans? Journal of Travel Research, 45(1), 67–82.
  • Cwi, D., & Lyall, K. (1977). Economic impacts of arts and cultural institutions: A model for assessment and a case study in Baltimore. Baltimore, MD: Center for Metropolitan Planning and Research, Johns Hopkins University.
  • Duncanville Arts Foundation. (2026). Cultural investment strategy, version 2.0. Duncanville, TX: Duncanville Arts Foundation.
  • Grantmakers in the Arts. (2020). Public funding for arts and culture in 2020. https://www.giarts.org/public-funding-arts-and-culture-2020
  • McCarthy, K. F., Ondaatje, E. H., Zakaras, L., & Brooks, A. (2004). Gifts of the muse: Reframing the debate about the benefits of the arts. Santa Monica, CA: RAND Corporation.
  • Moss, I. D. (2009, September). Arts policy library: Arts and economic prosperity III. Createquity. https://createquity.com/2009/09/arts-policy-library-arts-economic-prosperity-iii/
  • Nardo, M., Saisana, M., Saltelli, A., Tarantola, S., Hoffmann, A., & Giovannini, E. (2008). Handbook on constructing composite indicators: Methodology and user guide. Paris: OECD Publishing. https://doi.org/10.1787/9789264043466-en
  • Seaman, B. A. (1987). Arts impact studies: A fashionable excess. In A. J. Radich (Ed.), Economic impact of the arts: A sourcebook (pp. 43–75). Denver, CO: National Conference of State Legislatures.
  • Seaman, B. A. (2011). Economic impact of the arts. In R. Towse (Ed.), A handbook of cultural economics (2nd ed., Chapter 28). Cheltenham: Edward Elgar Publishing.
  • Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin.
  • Sterngold, A. H. (2004). Do economic impact studies misrepresent the benefits of arts and cultural organizations? Journal of Arts Management, Law, and Society, 34(3), 166–187.
  • Urban Institute. (2025, October). Understanding the impact of arts and culture on communities through local data. Washington, DC: National Neighborhood Indicators Partnership.
  • U.S. Census Bureau. (2023). American community survey 5-year estimates, 2019–2023. Washington, DC: U.S. Department of Commerce.
  • Wallace Foundation. (2010). The road to results: Effective practices for building arts audiences. New York: Wallace Foundation.
  • Walters, T., & Chandler, L. (2019). Towards a framework for measuring local government return on investment in arts and cultural development. Local Government Studies, 45(2), 165–185.