
The Job Description Is Starting to Fray

Work is shifting at the task level, while hiring models are falling out of alignment

Michael Hlasyszyn

Director, AI Alliance Programs

Water flows through the Constitution Plaza Fountain, surrounded by office buildings at Constitution Plaza in downtown Hartford, Wednesday, July 30, 2025. (Aaron Flaum/Hartford Courant)

At a workforce readiness session hosted by the MetroHartford Alliance in downtown Hartford this week, more than 40 leaders from businesses across Connecticut gathered with a shared objective: preparing their workforce for what AI is already beginning to change.

The conversation was not centered on AI experimentation. Most in the room had already begun integrating AI into parts of their operations. The focus was more pragmatic: how to adapt roles, responsibilities, and hiring strategies as adoption accelerates.

A question surfaced that proved more difficult than the rest. Not which AI tools to adopt, or how quickly to deploy them. The more difficult question was quieter, and less resolved: What, exactly, are organizations hiring for now?

The group reflected a broad cross-section of Connecticut’s economy – insurance, healthcare, finance, manufacturing, government, education, and professional services. Across that spectrum, a consistent pattern was emerging. Work is changing in ways that do not align neatly with existing roles.

Work Is Changing, and Not in Ways That Show Up Immediately

Tasks that once defined entire positions – analysis, reporting, content creation – are increasingly being supported, and in some cases partially completed, by AI systems. The shift is incremental, uneven, and often difficult to measure directly. What is visible is how time is being reallocated. Less time is spent generating initial outputs. More time is spent reviewing them, validating accuracy, and determining whether they should be acted upon. The work itself has not disappeared. It has been restructured.

This aligns with broader research from the Massachusetts Institute of Technology, including the Iceberg framework discussed during the session, which emphasizes that AI exposure occurs at the task level rather than the job level. Roles remain intact, but their internal composition is shifting. In many cases, responsibilities that were once integrated – producing and evaluating work – are beginning to separate, introducing new ambiguity around ownership, expertise, and accountability. AI may contribute to outputs, but responsibility for validating and acting on them remains with individuals – often without clear boundaries.

Hiring Models Are Lagging Behind the Work Itself

Job descriptions continue to group together sets of responsibilities that were historically performed in tandem. They assume consistency in how work is executed and in the level of effort required across those tasks. That assumption is becoming less reliable. Organizations are beginning to place greater emphasis on judgment, interpretation, and the ability to evaluate outputs, rather than on the production of those outputs alone. In effect, the criteria for value are shifting, even if job definitions have not yet been formally updated.

This transition is also exposing internal differences. More experienced employees often rely on accumulated context to assess whether outputs are credible. Less experienced employees may be more proficient in using AI tools, but less equipped to evaluate their limitations.

Frameworks such as the EPOCH model, developed at MIT Sloan School of Management, attempt to define the capabilities that remain distinctly human: judgment, creativity, intuition, contextual reasoning. These are becoming more critical, not less.

Governance Is Advancing Alongside Adoption

At the same time, regulatory expectations are evolving, as we previously reported in How Connecticut Is Structuring the Future of Manufacturing AI. Guidance such as Connecticut Insurance Department Bulletin MC-25 establishes that AI use must remain accountable, particularly in regulated domains like insurance. Legislative activity, including Connecticut Senate Bill 5, signals that oversight will continue to expand. This reflects a broader tension discussed during the session: organizations are being asked to accelerate adoption while governance expectations are increasing at the same time.

A Structural Shift Without a Defined Model

Taken together, these changes do not point to a simple narrative of job displacement or job creation. The composition of work is changing before the systems used to define, hire for, and evaluate that work have adapted. Organizations are, for the moment, managing this transition informally, adjusting roles incrementally and redefining expectations in practice. Over time, the gap between how work is performed and how it is structured is likely to become more pronounced. The questions raised in Hartford remain unresolved.

How, ultimately, are work and the people responsible for it defined?


At F3 Technology Partners, these are the conversations we are increasingly having with organizations where understanding what AI can do and redefining how work is structured are happening at the same time.
