
Adult CT and Digital Skills Transfer


Compiled by: research agent, 2026-04-29
Purpose: Map what's empirically known about transfer of CT and digital-skills training in adult/workforce populations (not K-12). Used to ground pedagogy.md and to scope the gap our project would fill.

Headline verdict

The empirical evidence base on adult computational thinking and digital-skills transfer is thin — far thinner than the K-12 literature. The asymmetry is stark and the doc should state it plainly.

What exists:

Why the gap:

  1. Research funding has concentrated on K-12 and higher education.
  2. The "digital natives" framing of the 2000s–2010s led researchers to underweight adult reskilling.
  3. Adult basic education is chronically under-funded for outcome research.

Strongest empirical findings

A. Urban Institute 2019 — smartphone-to-office transfer failure

B. RAND 2024 — Computer Foundations for low-tech adults (pilot)

C. Ye et al. 2022 — meta-analysis of 55 CT-transfer studies

D. CT-STEM 2024 meta-analysis — 37 studies, n = 7,832

E. Older-adults observational training — n = 59 (MDPI 2020)

F. Digital-divide systematic review (ERIC)

G. ProLiteracy — adult basic ed research review

H. OECD PIAAC — descriptive, not experimental

Mental-model acquisition for digital systems — what we know

Even sparser than the transfer literature.

Chi 2008 — Three Types of Conceptual Change

NN/G — cloud-storage mental models

Scaffolding framework (Belland)

Counter-evidence and complications

  1. Skills decline without practice (OECD PIAAC). Training transfer is not just a pedagogical problem — it requires sustained opportunity for practice post-training.
  2. Affect and self-efficacy may matter more for adults than for K-12 learners. Multiple adult studies show motivational/affective change as the proximal outcome. Implication: our co-pilot must build confidence, not just skill.
  3. Structural barriers dwarf pedagogical ones. Home internet access, device availability, and cost all gate participation. Pedagogy can't fix this; product design must account for it (offline modes? library distribution?).
  4. The "third-level digital divide" is real. Even with equal access, outcomes diverge based on prior knowledge, self-efficacy, and opportunity.

Strength-of-evidence by claim (this is the table to put in pedagogy.md)

| Claim | Evidence strength |
| --- | --- |
| Adults can develop digital skills through training | Strong (RAND, OECD, program evaluations) |
| Those skills transfer to novel contexts (far transfer) | Weak (no direct evidence; inferred from RAND employment outcomes; confounded) |
| Adults develop new mental models for digital systems | Extremely weak (theoretical frameworks only; no empirical studies) |
| Adult transfer is comparable to K-12 | Unknown (no direct comparisons) |
| CT training improves adult problem-solving | Not studied empirically in adults |

What this means for pedagogy.md and the pitch

This is the most important framing decision: be honest about the evidence base and turn the gap into a contribution claim. A pitch that pretends K-12 CT findings transfer to adults will get caught by any informed reviewer. A pitch that explicitly states "the adult digital-fluency transfer literature is empirically thin, and our deployment is structured to generate evidence" is rigorous and credible.

The combination of:

...is enough to write a serious pedagogy doc. The honesty about the gap is the credibility move.