At 9:03 AM, a 22-year-old engineer joins the morning standup with her camera on and her brain mostly off. She went to bed at 12:40 AM, which is when her body’s dim-light melatonin onset actually began — roughly two and a half hours later than her tech lead’s. She got six hours of sleep when her circadian system was calling for eight. By the time she gives her update, she has been awake long enough that her reaction time and accuracy on attention tasks are closer to a colleague who would not legally be allowed to drive home.

That last sentence is not a metaphor. Williamson and Feyer’s study in Occupational and Environmental Medicine (“Moderate sleep deprivation produces impairments in cognitive and motor performance equivalent to legally prescribed levels of alcohol intoxication,” vol. 57, pp. 649–655, 2000) established the now-standard comparison: 17 hours awake produces cognitive impairments comparable to a blood alcohol concentration of 0.05%, and 24 hours awake produces impairments comparable to 0.10% — above the legal driving limit in all 50 US states. The CDC’s NIOSH training module for nurses uses the same equivalence. The engineer in our standup isn’t lazy. She’s chronotype-mismatched. And there’s now a frankly enormous pile of biology saying that “just go to bed earlier” is closer in spirit to “just be taller” than to “just save more of your paycheck.”
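For intuition, the two published anchor points (17 hours awake ≈ 0.05% BAC-equivalent, 24 hours ≈ 0.10%) can be stretched into a rough interpolation. This is an illustrative sketch, not a formula from the study: the function name, the linear interpolation between anchors, and the assumed impairment-free baseline at 10 hours are all assumptions for illustration.

```python
def bac_equivalent(hours_awake: float) -> float:
    """Rough BAC-equivalent impairment for time awake, linearly
    interpolated between the two published anchor points:
    17 h awake ~ 0.05% BAC and 24 h awake ~ 0.10% BAC.
    Illustrative only; the study reports group comparisons, not a formula."""
    if hours_awake <= 17:
        # Below the first anchor, scale proportionally from zero at 10 h
        # awake (an assumption; impairment onset is not sharply defined).
        return max(0.0, 0.05 * (hours_awake - 10) / 7)
    # Linear between the 17 h and 24 h anchors, extrapolated beyond.
    return 0.05 + 0.05 * (hours_awake - 17) / 7

print(round(bac_equivalent(17), 3))  # 0.05
print(round(bac_equivalent(24), 3))  # 0.1
```

The engineer in the opening scene is nowhere near 17 hours awake at 9:03 AM; the sketch matters for the late-evening push that follows a forced-early morning, when a late chronotype on a 6:40 AM alarm crosses the 0.05% line before the workday ends.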

351 Loci

The largest genome-wide association study of chronotype to date analyzed 697,828 individuals from UK Biobank and 23andMe and identified 351 genetic loci associated with morningness — up from 24 loci in earlier work (Jones et al., “Genome-wide association analyses of chronotype in 697,828 individuals provides insights into circadian rhythms,” Nature Communications 10:343, 2019). The loci are enriched in genes governing circadian regulation, cAMP signaling, glutamate signaling, and insulin signaling, with peak expression in retina, hindbrain, hypothalamus, and pituitary. Chronotype is a deep neurobiological trait, not a mood.

GWAS-based SNP heritability sits at 12–21%; twin-study heritability ranges from 21% to 52% depending on the instrument used, with the Munich ChronoType Questionnaire pulling toward the higher end (Leocadio-Miguel et al., Journal of Biological Rhythms 36(6), 2021). For comparison, that range sits well below the roughly 80% heritability of adult height but exceeds heritability estimates for a long list of traits we treat as obviously inborn.

It gets more specific. A PER3 variable-number tandem repeat polymorphism — reviewed by Kalmbach et al. in Sleep 40(2), 2017 — biases the carrier toward morningness or eveningness depending on which allele they inherit. PER2 phosphorylation-site mutations cause Familial Advanced Sleep Phase Syndrome, in which entire family lines reliably fall asleep around 7:30 PM. Casein kinase 1 mutations shorten the circadian period — the classic tau mutation in CK1ε in hamsters, and CK1δ variants in human families. The molecular clock is a transcription-translation feedback loop running on CLOCK, BMAL1, PER1/2/3, CRY1/2, and the REV-ERB and ROR nuclear receptors. None of those names appear on the standard productivity-coach reading list.

You should still read carefully what Jones and colleagues actually measured. Activity-monitor data from 85,760 participants showed that the 5% carrying the most morningness alleles slept on a schedule 25 minutes earlier than the 5% carrying the fewest. Twenty-five minutes is not three hours. The genetic component alone is modest; environmental and social factors then amplify it into the two-to-four-hour gap you actually see in the wild — partners, light exposure, work schedules, kids, caffeine, weekend recovery. So the honest version of the claim is: genes set a slope, the environment is the amplifier, and the resulting distribution of natural sleep timing is wide, stable, and largely involuntary.

The Adolescent Delay (and the Age of Junior Engineers)

Puberty drives an involuntary 1–3 hour delay in circadian phase, measured by dim-light melatonin onset (Carskadon, in Adolescent Psychopathology and the Developing Brain, Oxford University Press, 2007; Carskadon et al., PMC2820578, 2010). The delay peaks at about 19.5 years for females and 20.9 years for males, then slowly reverses with age. Cross-cultural data suggest the peak may arrive a few years earlier outside Western contexts, but the developmental arc is consistent: late teens and early twenties are biologically the latest-shifted of any life stage. It is not a phase of self-indulgence. It is the predictable output of a system that hasn’t finished maturing.

This matters for the standup story because the people most affected by a 9 AM forced-sync are the same people most likely to be junior. The peak chronotype delay age is roughly the age of a new-grad engineer. Senior engineering managers — who are older and have therefore drifted earlier — are running synchronous meetings on a schedule calibrated for someone else’s circadian system. The default isn’t neutral. It’s accommodating one biological subgroup at the expense of another, and the subgroup paying the cost is, on average, the one with less power to push back.

What Forced Misalignment Does to a Body

If chronotype mismatch were merely uncomfortable, it would still be worth taking seriously. It isn’t merely uncomfortable.

The International Agency for Research on Cancer reclassified night shift work as Group 2A — “probably carcinogenic to humans” in its 2019 evaluation, published as IARC Monograph Volume 124 (2019/2020). The classification rests on limited evidence in humans for breast, prostate, colon, and rectal cancers, sufficient evidence in experimental animals, and strong mechanistic evidence covering melatonin suppression, circadian gene disruption, immune modulation, and metabolic perturbation. Roughly one in five workers globally is in some form of regular night shift work, by IARC’s own estimate.

A 2025 dose-response meta-analysis of 23 cohort studies covering 3,340,377 participants quantifies the cardiovascular cost (Li et al., PMC12506678, 2025): a 13% increase in total cardiovascular disease incidence, a 27% increase in CVD mortality, a 22% increase in coronary heart disease, and a 49% increase in stroke mortality. The dose-response is linear — every additional five years of shift work adds another 7% to CVD incidence — with no observed safe threshold. Meta-analyses of metabolic syndrome put the excess risk in shift workers at roughly 11–30% (Wang et al., PubMed 33556868, 2021).
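Under the meta-analysis’s linear dose-response (roughly 7% additional relative CVD incidence per five years of shift work, with no observed safe threshold), the excess risk for a given career length is simple arithmetic. A sketch; the function name and any extrapolation beyond the durations the cohorts actually studied are assumptions.

```python
def cvd_excess_risk(years_shift_work: float, per_5yr: float = 0.07) -> float:
    """Excess relative CVD incidence under a linear dose-response:
    ~7% additional risk per 5 years of shift work (2025 meta-analysis
    figure). Linearity beyond the studied range is an assumption."""
    return per_5yr * (years_shift_work / 5.0)

for years in (5, 10, 20):
    print(years, f"{cvd_excess_risk(years):.0%}")  # 5 -> 7%, 10 -> 14%, 20 -> 28%
```

A twenty-year shift-work career, on this reading, carries roughly the same excess incidence as the meta-analysis’s pooled CVD-mortality figure — which is the point of a no-threshold linear model: the cost just keeps accumulating.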

The economic toll on the public side of this ledger is in the same neighborhood. The National Highway Traffic Safety Administration estimates fatigue-related crashes resulting in injury or death cost society about $109 billion annually, not counting property damage. The AAA Foundation for Traffic Safety’s analysis of crashes between 2017 and 2021 found that drowsy drivers were involved in 17.6% of fatal crashes — substantially more than police-reported figures, indicating significant underreporting. Night and rotating shift workers carry six times the drowsy-driving risk of day workers. The cost is paid by everyone on the road, not just the worker.

None of this means a single forced 9 AM standup will give anyone cancer. The IARC and CVD data come from chronic, structural circadian misalignment — years of inverted schedules — not from a Tuesday meeting. But the engineering workplace reproduces the same forced-mismatch structure at smaller scale: a fixed clock, a biological system not calibrated for it, and an accumulated cost paid in attention, working memory, decision-making, mood, and (if you stay in the industry long enough) cardiovascular risk. The mechanism is the same; the dose is smaller. The mechanism is also the part with the most evidence behind it.

The Natural Experiment That Already Ran

The good news is that we already have a published, regulated, before-and-after natural experiment on how to fix exactly this kind of forced misalignment. We just ran it on teenagers instead of engineers.

In 2014, the American Academy of Pediatrics issued a policy statement in Pediatrics (vol. 134, no. 3, pp. 642–649) recommending that middle and high schools begin no earlier than 8:30 AM. The CDC backed the recommendation but reported that as of the 2011–12 school year, only 17.7% of US middle and high schools met it. The Start School Later coalition has since assembled endorsements from more than 35 professional organizations: AAP, AMA, APA, the National Sleep Foundation, the National Education Association.

In October 2019, California Governor Gavin Newsom signed Senate Bill 328, the first US state law mandating later school start times: middle schools no earlier than 8:00 AM, high schools no earlier than 8:30 AM, with an exemption for rural districts. The law took effect July 1, 2022 (or later, depending on collective bargaining timelines).

The documented results — across multiple studies that the AAP policy statement and subsequent literature draw on — are consistent: students gain 25–50 minutes of sleep per night, daytime sleepiness drops, attendance improves, tardiness and truancy fall, grades rise, mood and motivation improve, and computerized attention-task performance increases for middle schoolers. The intervention costs nothing in instructional minutes. It accommodates biology rather than fighting it. It works.

The structural insight: the school start time policy doesn’t ask students to need less sleep. It doesn’t ask them to want to wake up earlier. It moves the institutional clock to accommodate the population’s actual biology. The engineering equivalents — asynchronous standups, results-not-hours culture, non-overlapping core hours, shifting individual schedules by two or three hours — are structurally identical interventions on a smaller institutional canvas.

Why the Defaults Got Set Where They Did

It is worth asking why 9 AM became the default in the first place, because the answer reveals what’s actually going on when someone says “we need everyone in by then.”

Office work inherited factory hours, which inherited agricultural hours, which were tuned by daylight and animal husbandry. None of those inputs apply to a remote engineering team writing code in three time zones. The Microsoft Work Trend Index 2024 reports that 78% of developers cite meeting overload as their biggest productivity problem; Linear’s 2025 engineering team study puts the time lost to poorly coordinated meetings at 4.2 hours per developer per week. GitLab’s 2025 Remote Work Report finds 63% of engineering teams spread across three or more time zones — making a synchronous standup inherently exclusionary regardless of chronotype. UC Irvine’s classic interruption-recovery research (Mark et al.) found it takes about 23 minutes to fully regain concentration after an interruption. A 15-minute standup at the wrong time, for the wrong person, costs roughly 40 minutes of productive work.
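The 40-minute figure is just the two numbers in the paragraph added together, then scaled. A back-of-envelope sketch; the team size and the five-standup week are illustrative assumptions, not data from any of the cited reports.

```python
# Focus-time cost of a daily synchronous standup, using the figures from
# the text: a 15-minute meeting plus ~23 minutes to fully regain
# concentration (the Mark et al. interruption-recovery estimate).
MEETING_MIN = 15
REFOCUS_MIN = 23
DAYS_PER_WEEK = 5   # assumption: one standup per working day

per_engineer_daily_min = MEETING_MIN + REFOCUS_MIN               # 38 min
per_engineer_weekly_hours = per_engineer_daily_min * DAYS_PER_WEEK / 60

team_size = 8  # assumption
team_weekly_hours = per_engineer_weekly_hours * team_size

print(f"{per_engineer_weekly_hours:.1f} h per engineer per week")  # 3.2
print(f"{team_weekly_hours:.1f} h per team per week")              # 25.3
```

Three-plus hours per engineer per week lands in the same neighborhood as Linear’s 4.2-hour estimate, which is at least a sanity check that the per-interruption numbers are plausible.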

The async alternatives report large gains: organizations running async-default workflows report 29% higher self-rated productivity and 53% more focus, with 61% of employees citing better work-life balance (industry surveys compiled in 2025). Those are self-reports, not RCTs, and they should be discounted accordingly. But the direction is clear, and the mechanism is intuitive: removing the synchrony requirement removes the chronotype tax that the synchrony requirement was always imposing.

Where This Argument Is Weakest

A careful reader has three good objections, and they all deserve direct answers.

Objection 1: “Just go to bed earlier” does work — sometimes. Facer-Childs et al. (Sleep Medicine 60:236–247, 2019) showed that a behavioral intervention combining bright morning light, fixed wake times, and curtailed evening caffeine could shift night owls’ sleep onset by about two hours earlier, with measurable improvements in depression, stress, and cognitive performance. The catch: the intervention worked when participants also had flexible social schedules. Under forced early schedules, the benefit washed out into social jet lag. The lesson isn’t that biology is destiny; it’s that the personal intervention only helps when the institution stops fighting it.

Objection 2: The cognitive penalty isn’t as universal as breathless framings suggest. A 2023 paper in Collabra: Psychology (vol. 9, no. 1, article 88337) reviewed the chronotype-by-time-of-day cognitive literature and concluded there is “no general and robust cognitive boost” from chronotype-time synchrony across all tasks. The honest reading: the effects cluster in sustained attention, declarative memory, and executive control. Other domains — including some kinds of creative or insight problem-solving — are paradoxically better at non-optimal times because reduced executive control allows more associative thinking. So your 9 AM standup doesn’t necessarily ruin a night owl’s whole day. It just ruins the specific cognitive faculties most relevant to engineering work, while possibly interrupting a creative window that would have been productive in a different mode.

Objection 3: Chronotype shifts with age, so accommodating night owls today means inconveniencing the same people in their fifties. True. The same delay that makes adolescents and early-career engineers late-shifted reverses with age; retirees often become extreme morning types. The implication isn’t that the workplace should pick one chronotype to favor. It’s that the workplace shouldn’t force anyone onto someone else’s schedule, because the people being inconvenienced now will be the senior managers in twenty years inconveniencing the next cohort.

The honest, narrower thesis: forced synchronous standups at fixed early hours impose a measurable, biologically grounded tax on a sizable fraction of the workforce. The tax is heaviest on junior engineers (who are at peak phase delay), in remote teams across time zones, and on the cognitive faculties most relevant to deep technical work. The accommodation costs nothing the team was actually producing.

The Practical Instrument

If you run an engineering team, the operating moves are concrete and don’t require anyone to change their biology.

  1. Make standups asynchronous by default. Each engineer posts an update during their own peak hours — text or a short recording into a shared channel. The synchronous version becomes a 30-minute weekly call with an agenda, not a daily ritual.
  2. Define a narrow core overlap, not a fixed start time. Two to four hours of overlap is enough for synchronous decisions when they’re genuinely needed. The rest of the day belongs to whoever is awake.
  3. Schedule the meetings that must be synchronous around the chronotype distribution of the actual team, not the lead’s preference. A 10:30 AM or 1:30 PM slot is a far better default than 9:00 AM.
  4. Decouple presence from output. If your team is measuring hours-at-desk rather than shipped work, the chronotype mismatch is the smallest of your problems.
  5. Watch your own language. “Why are you so tired?” and “you should sleep earlier” are value judgments about whose biology is the standard. The biology of the person you’re addressing is not in the wrong; the schedule is.
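Move #2, the narrow core overlap, can be computed mechanically once each engineer declares a preferred focus window. A minimal sketch with hypothetical names and windows; it handles same-day UTC windows only, so schedules that cross midnight would need modular arithmetic this version omits.

```python
from dataclasses import dataclass

@dataclass
class Engineer:
    name: str
    start_utc: int  # preferred day start, UTC hour
    end_utc: int    # preferred day end, UTC hour (same-day windows only)

# Hypothetical team spanning the chronotype range the article describes.
team = [
    Engineer("early-shifted lead", 7, 15),
    Engineer("mid-range dev", 9, 17),
    Engineer("late-shifted new grad", 12, 20),
]

# The core overlap is the intersection of everyone's window:
# latest start to earliest end.
core_start = max(e.start_utc for e in team)
core_end = min(e.end_utc for e in team)

if core_start < core_end:
    print(f"core overlap: {core_start:02d}:00-{core_end:02d}:00 UTC "
          f"({core_end - core_start} h)")
else:
    print("no shared window -- go fully async")
```

With these hypothetical windows the intersection comes out to 12:00–15:00 UTC, three hours — inside the two-to-four-hour band the checklist recommends, and notably not containing a 9 AM anything.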

Chronotype is one of the few large-effect, well-genotyped, heavily-replicated traits in human biology that the modern workplace still treats as a character flaw. The school start time experiment has already shown us, on a much larger scale and with much more rigorous outcome measurement, what happens when an institution accommodates the trait instead of penalizing it. Engineering teams have every reason to run the same experiment, and almost no reason not to.

The 22-year-old engineer at the 9:03 AM standup isn’t a productivity problem. She’s a measurement. The thing she’s measuring is what the team has decided to optimize for. The interesting question isn’t whether she can wake up earlier. It’s why the team decided that was the question worth asking.


Sources: Jones et al., “Genome-wide association analyses of chronotype in 697,828 individuals provides insights into circadian rhythms,” Nature Communications 10:343, 2019; Leocadio-Miguel et al., Journal of Biological Rhythms 36(6), 2021; Kalmbach et al., Sleep 40(2), 2017; Carskadon, in Adolescent Psychopathology and the Developing Brain, OUP, 2007; Williamson & Feyer, Occupational and Environmental Medicine 57:649–655, 2000; IARC Monograph Volume 124, 2019/2020; Li et al., PMC12506678, 2025; Wang et al., PubMed 33556868, 2021; AAP Policy Statement, Pediatrics 134(3):642–649, 2014; California Senate Bill 328, 2019; Facer-Childs et al., Sleep Medicine 60:236–247, 2019; Collabra: Psychology 9(1):88337, 2023; Mark et al., UC Irvine interruption-recovery research; Microsoft Work Trend Index 2024; Linear Engineering Team Study 2025; GitLab Remote Work Report 2025; NHTSA fatigue-crash cost estimate; AAA Foundation for Traffic Safety 2017–2021 analysis.
