
Technical fluency, not seniority or age, explains AI adoption
December 2025 | Article
By Marianne van Ooij
Our analysis shows that AI capability emerges from how people work with systems, not who they are.
Many organizations design AI enablement programs around demographic assumptions. Younger employees are expected to adopt new tools more readily. Senior leaders are expected to learn later and cascade practices downward. Long-tenured staff are sometimes seen as more resistant to change. These expectations shape training plans, workforce segmentation, and assumptions about where capability will emerge.
A structured assessment of sixty-nine employees in a flat, knowledge-focused organization provided an opportunity to test these assumptions. The analysis followed enterprise AI fundamentals training and combined qualitative interviews with a standardized maturity scale.
The intent was to understand which factors meaningfully predict AI adoption and which do not.
What does not predict AI adoption
Demographic factors show no relationship with AI maturity
Our analysis revealed no meaningful relationships between adoption and demographic factors.
Age, tenure, and organizational level showed negligible correlations with AI maturity. Younger employees did not adopt faster than older colleagues. Recent hires performed similarly to long-tenured staff. In a flat structure where titles reflect domain expertise rather than management hierarchy, senior domain experts showed no advantage over individual contributors.
Maturity varied within every demographic group, but no systematic pattern emerged. Specialized expertise in finance, research, policy, or certification did not transfer to AI fluency.
Key insight: Age, tenure, seniority, and domain expertise showed no meaningful relationship with AI adoption
What does predict AI adoption
Technical fluency as the dominant predictor
The strongest predictor of AI adoption was technical fluency. The correlation between technical fluency and AI maturity (r = 0.71) was substantially higher than for any demographic factor.
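The kind of correlation analysis described here can be sketched with a standard Pearson computation. The scores and ages below are hypothetical illustrations invented for the example, not the study's data; they are shaped only to show a strong fluency-maturity relationship alongside a negligible age-maturity one.

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    mean_x, mean_y = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical 1-5 maturity/fluency ratings for six employees (illustrative only).
technical_fluency = [2, 3, 3, 4, 4, 5]
ai_maturity       = [1, 3, 2, 4, 5, 5]
age               = [51, 34, 26, 58, 29, 45]

print(round(pearson_r(technical_fluency, ai_maturity), 2))  # strong positive
print(round(pearson_r(age, ai_maturity), 2))                # near zero
```

In practice, an analysis like this would also report sample sizes and significance, and a library routine such as `scipy.stats.pearsonr` would typically replace the hand-rolled function.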
Technical fluency is not defined by digital native status, comfort with consumer technology, or familiarity with AI tools. It is a practical capability that develops through repeated interaction with structured digital systems. It reflects four underlying components:
System literacy: The ability to understand how digital systems structure information and how tools signal errors. Technically fluent employees read these signals as diagnostic information rather than as blockers.

Tool comprehension: The ability to understand how tools process inputs. For AI, this means recognizing that models rely on statistical patterns, which shapes how users structure prompts, review outputs, and run variations.

Workflow thinking: The ability to break work into defined steps with checkpoints and clear transitions across tasks. This mirrors AI workflows and enables employees to see where a process failed or where AI can add value.

Data literacy: Comfort working with structured information in spreadsheets, databases, or content systems. This includes verifying outputs, spotting inconsistencies, and flagging incomplete source data.
These capabilities cut across age and organizational level. A fifty-five-year-old employee who is skilled at navigating systems, structuring work, and working with data will adopt AI faster and more effectively than a twenty-five-year-old whose experience is limited to consumer-based technologies.
“I can tell when AI is wrong because I know the underlying logic.”

Why technical fluency matters for AI
AI tools perform best when tasks are structured, inputs are clear, and users understand how to evaluate outputs. Employees with high technical fluency naturally create the conditions in which AI performs well. They break tasks into steps, provide contextual structure, adjust inputs when outputs drift, and recognize when an answer does not make sense. Employees with lower fluency struggle to structure tasks, interpret errors, or evaluate outputs, which limits both confidence and usage.
"If the spreadsheet is clean, AI works. If the spreadsheet is messy, nothing works.”
How to apply these findings
Recognize that team context shapes individual adoption
Technical fluency predicts individual capability, but team-level factors determine sustained adoption. Teams with clear governance and supportive leadership adopt faster even when individual technical fluency varies. Organizations should address barriers at the team level rather than focusing solely on individual skill gaps.
Address universal blockers systematically
Certain challenges limit adoption across all fluency levels: document ingestion difficulties, accuracy mistrust, and unclear guidance on appropriate use. Solving these structural barriers accelerates adoption more effectively than capability-based training tiers alone.
Avoid demographic proxies in hiring and development
For new hires, assess technical fluency through behavioral questions about how candidates structure work and troubleshoot problems rather than relying on age or resume signals. For existing employees, recognize that technical fluency can be strengthened at any career stage through practice with structured digital tools. Organizations that wait for demographic turnover will lag behind those that invest in capability building across all employees.
Conclusion
Organizations that design AI programs around assumptions about age, tenure, or seniority allocate resources inefficiently. The evidence shows that technical fluency, defined by system literacy, tool comprehension, workflow thinking, and data literacy, predicts AI adoption far more reliably than any demographic variable. These capabilities are practical, learnable, and distributed unevenly across all levels and age groups.
Organizations can accelerate AI adoption by starting with universal training that creates shared language, following up with individual conversations that reveal actual barriers, and intervening at the team level where culture and permission structures shape sustained adoption. Fixing universal blockers such as document ingestion challenges, accuracy mistrust, and governance gaps increases capability faster than segmented training programs.
In knowledge-oriented environments, deep domain expertise does not automatically translate to AI fluency. Digital comfort does.
About this research
These insights are based on a multi-month implementation study involving more than 70 interviews across ten functions that I led at a global professional organization.
Data sources included workflow walkthroughs, unstructured interviews, document analysis, and quantitative correlation of AI maturity with baseline skills. Patterns were validated across roles and departments to ensure consistency.
These findings align with patterns I've observed across transformations.
ABOUT THE AUTHOR(S)
Marianne van Ooij is the founder of AI UX Navigator.