1. Choi Eun-jeong's Dawn
November 2025, Eunpyeong-gu, Seoul. Five in the morning.
The fluorescent lights in the care facility corridor burn at half strength. Choi Eun-jeong (52) tightens her shoelaces and walks up to the third-floor hallway. Seven years running — the same hour, the same corridor. The clock on the wall reads 5:04 a.m. Through the window at the end of the hall, apartment lights from Eunpyeong-gu are visible. Most are still dark.
On the nurse's station monitor, real-time readings for 32 residents scroll past — blood pressure, heart rate, blood glucose. The AI health-monitoring system was installed last year. When it detects an overnight anomaly, it sounds an alert. No alerts sounded through the night. By the numbers, it was a quiet night. But Choi Eun-jeong knows what the numbers do not show.
She stops in front of Room 302. Before opening the door, she already knows. Today will be a hard day. When the sky is overcast, the hard days come. Eighty-seven years old, dementia grade 3. Her husband passed four years ago — but on cloudy days, she looks for him.
She opens the door. The grandmother is sitting up in bed, blanket thrown back, slippers on her feet. She had been about to go out.
"Grandma."
The grandmother looks at Choi Eun-jeong. Her eyes are wet.
"The old man hasn't come back. I have to go outside and look."
Choi Eun-jeong sits down beside her. Takes her hand. It is cold. The heat is running, but hands are always cold at dawn. The veins on the back of the hand stand out. Choi Eun-jeong knows the map of this hand. She has been holding it for seven years.
"Grandma, he just stepped out for a moment. He'll be right back."
This is a lie. Choi Eun-jeong knows it. The grandmother herself knows it two or three times a day. But not at this hour, not in this moment.
Choi Eun-jeong's lie is not a medical prescription — it is a relational judgment. Telling the truth would make the grandmother live through her husband's death again. Every time, as if for the first time. Choi Eun-jeong has been making that judgment, day after day, for seven years.
The AI monitoring system knows the grandmother's blood pressure and heart rate. Choi Eun-jeong knows the grandmother's loneliness. That difference is everything.
While she holds the grandmother's hand, the old woman's eyes close slowly. Choi Eun-jeong lays her back down, pulls the blanket up. 5:23 a.m. She moves on to the next room.
Choi Eun-jeong's hourly wage is ₩12,000. Monthly pay: ₩2.2 million. Seven years in. Divorced, she lives alone. Her son is 26 — a delivery rider working the outskirts of Seoul.
This is not function but relationship. The essence of care is "being there" — in this dawn, in this corridor, holding this hand.
The AI system can monitor the grandmother's vital signs. It can flag medication times. But when the grandmother goes looking for her dead husband, the act of sitting beside her, taking her hand, and saying "he'll be right back" — that is not something a machine does. Not because it cannot. Because society does not entrust it to one.
2. Scarcity Moves
What is scarce is not fixed. It moves with each era.
In the middle Roman Republic, land was scarce. After the Punic Wars, the great estates — latifundia — spread, absorbing the holdings of smallholders. Book 1, The Displaced and the Discerning, traced that process. Those who held land were citizens; those who lost it were classified as proletarius — people with nothing left to contribute to the state except their offspring.
When land was scarce, those who lost it lost their identity.
In the industrial revolution, skilled labor was scarce. The Lancashire handloom weaver controlled the tension of warp and weft with the sensitivity of his fingertips. Seven years to learn the craft. When the spinning jenny and the power loom replaced those seven years with mechanical repetition, the weaver's weekly wage collapsed from 25 shillings to 4.5 shillings — 82 percent gone.
The moment skilled labor was no longer scarce, the skilled worker was pushed aside.
In the age of the Medici bank, which Book 3, The Invisible Hand's Last Trade, analyzed, information was scarce. In an era when a letter from Florence to Bruges took two weeks to arrive, those who held information first controlled capital. The Medici appraiser's value came from that asymmetry.
As the printing press, then the telegraph, then the internet narrowed that asymmetry, the gatekeeper's value shifted — from knowing to guaranteeing. As Book 3 put it, the gatekeeper did not disappear. It simply moved from visible ground to invisible ground.
What becomes scarce in the AI era?
Cognitive ability is losing its scarcity. GPT-4 scored in the top 10 percent of the U.S. bar exam. AlphaFold surpassed human experts in protein structure prediction.
Eloundou et al.'s 2023 analysis found that 80 percent of U.S. workers have at least 10 percent of their tasks exposed to large language models. Exposure was highest in high-income, high-credential occupations. This is a different pattern from past automation. The industrial revolution replaced manual labor from the bottom up. AI penetrates cognitive labor from the top down.
When cognitive ability is no longer scarce, what becomes newly scarce?
Four things: judgment, trust, care, meaning. Domains where machines can perform but society will not assign. The boundary is drawn not by capability but by trust. This is the coordinate of this chapter — and of all of Part IV.
3. The Contradiction of ₩12,000 per Hour
Korea has approximately 550,000 to 600,000 active eldercare workers. Cumulative certificate holders exceed 2 million, but fewer than one in three are actively working. Hourly rate: ₩12,000. Even including night and weekend supplements, monthly take-home pay for residential care workers runs ₩1.8 million to ₩2.2 million.
Annual turnover: 30 to 40 percent. Workers with fewer than three years of tenure make up more than half the workforce.
Low wages alone do not explain why workers leave. Emotional exhaustion does. Holding someone's hand every dawn, staying beside a person losing their memory, watching death up close — when that extends across seven years, ten years, what accumulates is not a career but fatigue. The formal certification for eldercare work takes roughly six months, including all required training hours. Society asks six months of credentials for this work. It took Choi Eun-jeong seven years to read the overcast days of the grandmother in Room 302. Those seven years appear on no certificate.
The average annual salary of an NVIDIA engineer in the United States is $150,000 to $250,000 — ₩200 million to ₩330 million at current rates. Converted to an hourly wage, that is roughly 8 to 13 times Choi Eun-jeong's rate. Between the time of the person building AI and the time of the person doing work AI cannot replace, there is a gap of 8 to 13 times.
This is a contradiction. The most irreplaceable work is the most undervalued. The market prices the replaceable highly and the irreplaceable cheaply. Whether that is a contradiction or simply the nature of markets is a matter of perspective. But the outcome is clear.
Why? Nancy Folbre called this the "care paradox." Care labor cannot be priced by market logic. The more care is commodified, the more its quality deteriorates. The demand to care faster and more efficiently conflicts with the very nature of care.
If the time Choi Eun-jeong spends holding the grandmother's hand is shortened, the value of that time does not rise — the care itself disappears. Folbre's diagnosis is that care goes unpriced precisely because it is "labor of love."
Mariana Mazzucato approaches the same problem from a different angle. In her account, modern economics has lost the ability to distinguish "value creation" from "value extraction." There is a structural gap between what GDP measures and what society actually needs.
NVIDIA's market capitalization exceeded $3 trillion at its peak in 2024. Is that a result of value creation or value extraction? How is Choi Eun-jeong's act of holding the grandmother's hand at 5 a.m. recorded in GDP? As a fraction of the long-term care insurance benefit schedule — as ₩12,000 per hour of labor cost.
The distance between $3 trillion and ₩12,000 is the structural failure of the value-measurement system.
Korea entered super-aged society in 2025. The population aged 65 and over exceeded 20 percent — the fastest aging pace among OECD countries. From aged to super-aged society: seven years. France took 39 years, the United States 15, Japan 10.
Long-term care recipients are projected to grow from roughly 1.1 million in 2024 to 1.5 to 1.7 million by 2030. Demand for eldercare workers is surging while working conditions remain stagnant.
The gap between demand and compensation widens every year. The number of people who need care grows; the number willing to provide it shrinks. The idea of filling that gap with technology has emerged. Is it possible? Even if it is possible, is it desirable? Those questions return in the second half of this chapter.
The "value" of what Choi Eun-jeong does at 5 a.m. is recorded nowhere. The dawn when the grandmother does not go looking for her husband and drifts back to sleep — there is no figure to measure the value of that dawn.
4. From Capability to Trust
In 2013, Carl Benedikt Frey and Michael Osborne analyzed 702 U.S. occupations and concluded that 47 percent of U.S. employment faced high automation risk. They defined occupational security by what machines could not do. The paper became the original text of the automation debate — the high-water mark of the capability criterion.
The problem is that the capability frontier retreats every year.
Occupations that were "safe" in 2013 — translation, legal document review, medical imaging, basic coding — were already targets of automation by 2024. Even in what Frey and Osborne called the domain of "creative intelligence," Midjourney took first place at an art competition.
Define "the last profession" by the capability criterion, and that definition must be rewritten each time the next model is released. That is not a definition — it is a countdown.
The capability frame is defensive. Behind "it cannot do this yet" always comes "it will soon." A different frame is needed.
Daron Acemoglu and Pascual Restrepo's 2019 research offered one. Automation displaces existing tasks, but simultaneously creates new ones. The question is the net effect.
For AI, the speed and scope of displacement are qualitatively different from past automation. Simultaneous displacement across cognitive labor as a whole may not leave enough time for new task creation to catch up.
David Autor pushed further in a 2024 paper. AI can become a tool for expert judgment, but not its substitute. Expert judgment depends on context, rests on tacit knowledge, and carries responsibility for the consequences of decisions.
The judgment Autor points to — handling the exceptional situations where patterns break down — remains in the human domain.
Yet even Autor's argument stays within the capability frame. It is still talking about what AI "cannot" do. A more fundamental shift is needed.
Define occupations by what AI "cannot" do, and the definition collapses as the capability frontier retreats. Define occupations by what society will "not assign" to AI, and the standard belongs to society, not to the technology.
The EU AI Act came into force in August 2024. It classified AI in the domains of justice, healthcare, employment, and education as "high-risk" and imposed human-oversight obligations. Even if AI is more accurate, a human must remain in the loop — that is the design philosophy written into the law.
As of 2024, the FDA had authorized more than 950 AI-enabled medical devices — but nearly all are classified as "assistive" or "clinical decision support." With one narrow exception, no AI medical device holds independent diagnostic authority. That exception, IDx-DR, is limited to binary screening for a single condition — diabetic retinopathy — and treatment decisions remain with the ophthalmologist.
A 2023 Pew Research survey found that 60 percent of Americans said they were uncomfortable with AI use in healthcare, and 75 percent said they wanted a human physician to confirm even an AI diagnosis. Patient psychological safety — not technical accuracy — is drawing the line.
Korea's AI Basic Act (인공지능 기본법) mandated transparency obligations for high-impact AI. As of 2024, Korea's Ministry of Food and Drug Safety had approved more than 200 AI-based medical devices — yet not one holds independent diagnostic authority.
Three continents' regulators independently reached the same conclusion.
No matter how far AI's technical capability advances, human beings must remain present in high-stakes decisions. It is not capability but trust that draws the boundary. This shift — from the capability criterion to the trust criterion — is the pivot.
5. Two Signatures — The Medici Appraiser and the Consultant
Around 1470, Florence.
A Medici bank appraiser spreads the documents a merchant from Bruges has presented. A request to issue a letter of credit for a wool trade. The appraiser reads the documents — but not only the documents. He reads this merchant's reputation. He reads between the lines of the letter sent from the Bruges branch.
Not the single line "He is trustworthy," but the fact that the branch manager who wrote that line staked his own name on it.
The appraiser signs. That signature stakes the Medici bank's reputation. If the appraiser is wrong — if the merchant cannot pay for the wool — the loss belongs to the bank, and the responsibility belongs to the appraiser. The moment he signs, he binds himself to his judgment.
In Book 3, The Invisible Hand's Last Trade, the Medici bank's appraiser guaranteed international commerce with a single signature. Five hundred and fifty years later, the same structure operates under a different name.
2025, Pangyo, Seongnam.
An AI ethics consultant reviews the decision logic of an autonomous-driving algorithm. How is the algorithm designed to choose when the safety of a pedestrian and the safety of a passenger conflict at an intersection? She analyzes simulation data, examines edge cases, checks alignment with ethical guidelines.
She writes the report. Signs the last page. The moment the pen lifts, her hand pauses. If this signature fails, the name on the front page of the newspaper will not be the algorithm's — it will be hers.
"Is this algorithm safe to deploy on public roads?"
She answers that question in her own name.
Five hundred and fifty years lie between the two chairs. The technology changed from bill of exchange to algorithm. The structure is identical. The signing human stakes her reputation.
If the consultant is wrong — if the algorithm causes an accident — her name becomes the point where responsibility comes to rest. The algorithm loses no reputation. It is not fired. It does not stand in court.
The human loses. That capacity for loss is the essence of the guarantee. Only those who can lose can guarantee. This is the structure linking the Medici appraiser and the AI ethics consultant — a principle unchanged across 550 years.
At 3 p.m., Choi Eun-jeong enters Room 302 again. The grandmother is awake. This time she is not looking for her husband. She recognizes Choi Eun-jeong.
"Eun-jeong, have you eaten lunch?"
Choi Eun-jeong smiles. "I have, Grandma."
This is not a lie. A real conversation. The grandmother's days alternate between lies and truth — there is a rhythm in which the morning lie — "he'll be right back" — settles her, and the afternoon truth arrives. Choi Eun-jeong has been reading that alternating rhythm for seven years.
The problem is not the lie itself. The problem is that Choi Eun-jeong herself is not certain the lie is right. Is repeatedly telling a dementia patient of a spouse's death cruel, or is it respecting the patient's right to the truth? In the 320 hours of the eldercare certification curriculum, there is no item called "the ethics of the lie."
Every morning, Choi Eun-jeong makes that judgment alone. Just as the Medici appraiser staked his reputation on a single signature, Choi Eun-jeong stakes her conscience on a single lie. The difference is that the appraiser's signature came with an annual salary, and Choi Eun-jeong's lie comes with ₩12,000 per hour.
6. The Paradox of Care, The Inversion of Value
The paradox of care has a deeper structure.
According to Folbre's analysis, care work carries a "care penalty." Working in a care occupation imposes a 15 to 25 percent wage disadvantage relative to non-care occupations of equivalent education and experience.
This penalty originates in the structure that historically classified care as a "natural" female role rather than professional labor. The fact that more than 90 percent of Korean eldercare workers are women compresses that structure into a single statistic.
Mazzucato's value theory extends the problem to the entire economic system. GDP measures only what is transacted in markets. Unpaid care performed at home — nursing elderly parents, raising children, household labor — is not included in GDP.
OECD estimates suggest the economic value of unpaid care labor could reach 10 to 15 percent of GDP. Korea's share of single-person households stands at roughly 35 percent as of 2024, and elderly single-person households number approximately 1.9 million. In a structure where informal family-based care no longer functions, the value of paid care labor grows — but its price remains stuck at the floor.
In the AI era, this contradiction deepens. When AI drives the price of cognitive labor toward zero, what happens to the price of care labor? Baumol's cost disease operates here.
Care is a domain where technology cannot easily raise productivity. The time spent holding the grandmother's hand cannot be shortened by technology. When productivity in other sectors explodes through AI, the relative cost of care labor rises. But care workers' wages are tied to long-term care insurance benefit schedules — they do not follow market supply-and-demand logic.
On top of this sits the problem of asymmetric transition costs. Once handed off to AI, the handback is extraordinarily difficult.
Air France Flight 447 is the emblematic case. On June 1, 2009, an Airbus A330 flying from Rio de Janeiro to Paris crashed into the Atlantic. When the pitot tubes froze and airspeed data disappeared, the autopilot disengaged. Three pilots were aboard, but they were not practiced in manual flight at high altitude.
The co-pilot pulled the nose up. This was the opposite of what basic aerodynamics demands: when speed is lost, the nose must go down. The stall warning sounded 75 times. Over three minutes and thirty seconds, the aircraft fell from 38,000 feet to the surface. Two hundred twenty-eight people died.
It was the disaster of pilots accustomed to automation confronting the moment manual flight was required. Book 3 analyzed the structure of this accident — and it applies to care as well.
The same pattern is appearing in medicine. One study found that after endoscopists grew accustomed to AI-assisted colonoscopy, their adenoma detection rate when working without AI fell from 28 percent to 22 percent. Technology was not augmenting human ability — it was eroding it. A paradox.
In care, this paradox is more lethal. If the number of eldercare workers is reduced on the basis of AI-assist system deployment, the people capable of doing what Choi Eun-jeong does at dawn — reading the grandmother's loneliness, holding her hand, saying the right thing — diminish.
Once that ability has atrophied at the social level, deciding "after all, we need people" comes too late. The people are already gone.
In Book 5, The Strategy of the In-Between, we saw Japan's kodokushi 孤独死 — solitary death — and the PARO robot. The scene of a robotic seal being stroked by an elderly person's hand was the quietest question about where the boundary of care lies. Japan simultaneously holds more eldercare robots than any country in the world and more solitary deaths than any country in the world. The 2024 estimate puts solitary deaths among those aged 65 and over at approximately 68,000 per year.
These two facts are not a contradiction — they are two faces of the same structure. Technology provides function but not relationship. The difference between function and relationship is what decides the boundary of care.
7. The Four Outlines
In the space scarcity has vacated, four domains become visible: judgment, trust, care, meaning.
This is not a fixed list — it is a socially negotiated boundary. Chapters 12 and 13 will examine each in depth; here the outlines are drawn.
Judgment. Decisions in which existential stakes are present. Errors cause death, lost freedom, or lost assets.
China operates an AI judge system: in routine civil cases, the AI drafts the ruling and a human judge signs it. That structure has become standard. A 97 percent accuracy rate is claimed, but no independent verification has been performed. And accuracy is not the point.
Even if an AI judge is 97 percent accurate, there is no answer to the question "who is responsible" for the 3 percent of errors. Judges can be impeached, disciplined, and judged by history. An algorithm suffers none of these consequences.
America's COMPAS algorithm predicts recidivism risk — but ProPublica reporting revealed that its false-positive rate for Black defendants was roughly double that for white defendants. The Wisconsin Supreme Court ruled that COMPAS scores cannot serve as the "sole basis" for sentencing. The criterion was not accuracy but due process — institutional legitimacy.
The same structure repeats in autonomous driving. Waymo is expanding; Cruise was effectively dissolved following a pedestrian accident. The same technology, the same era — but the boundaries were drawn differently. It was not a technological failure. It was a failure of trust.
Trust. Relational transactions that require a human guarantor. In Chapter 6 we saw Kim Su-jin find, in a loan application the AI had rejected, a value only a human could read. This is that domain.
What the FICO credit score replaced was not trust — it replaced routine tasks. "Thin-file" applicants — young people, immigrants, the self-employed, women who stepped out of the workforce — lack the data for algorithmic assessment and are systematically excluded. In Korea they number approximately 3 million to 4 million.
The robo-advisor market has grown, but when entrusting ₩10 billion, what the client seeks is not an algorithm — it is an answer to the question "Does this person have the will to protect my money?"
As the magnitude of the stakes rises, the value of personal trust rises steeply. Small transfers go to apps; inheritance goes to a person. The price of trust is set at the boundary between routine and exception.
Care. The domain in which the relationship itself is the service. What Choi Eun-jeong does.
According to Nel Noddings's care ethics, care is complete when the one being cared for perceives "this being is caring for me." The AI therapy chatbot Woebot can deliver cognitive-behavioral therapy techniques. USC's virtual interview system Ellie drew more candid responses than human interviewers in PTSD screening.
Yet most Woebot users do not feel that Woebot "is caring for them." They feel they are using a useful tool. Tool and care are different.
Ellie's case is more subtle. That soldiers were more candid with an AI was not because AI cares better — it was because the cost of showing vulnerability to a human is too high. The solution is not AI; it is building human relationships where showing vulnerability is safe.
Meaning. The attribution of meaning arising from shared finitude.
What AI art displaces is the commercial sub-function of art — stock images, background music, advertising copy. The U.S. Copyright Office refused copyright registration for images generated purely by AI. The core rationale was not that AI quality was insufficient — it was the institutional premise that copyright presupposes "a human author."
The phenomenon of "this was made by a human" commanding an economic premium is already visible. The Etsy handmade market holds steady at roughly $13 billion annually even as AI-generated content explodes. What consumers purchase is not function but the narrative — "a finite human spent time to make this."
Viktor Frankl located the ultimate source of meaning in the attitude a finite being chooses in the face of suffering. AI does not suffer. Therefore, in Frankl's frame, AI can be an instrument of meaning but cannot be a source of it.
These four domains are not partitioned — they overlap. A physician's diagnosis is simultaneously judgment and trust; hospice nursing is simultaneously care and meaning. A teacher's role is both the transmission of knowledge and the attribution of meaning through a relationship of trust. A lawyer's argument rests simultaneously on judgment and on a trust relationship with the client.
The more the four domains overlap, the stronger the resistance to displacement. Hannah Arendt distinguished three forms of human activity: labor, work, and action. Labor is biological repetition for survival; work produces durable things; action is the capacity to begin something among human beings. What AI displaces belongs to the domains of labor and work. Action — disclosing oneself before others, speaking, making promises, forgiving — presupposes plurality. It requires different, finite beings to be present together.
This may be why religious leaders have survived every technological transition. Theirs is one of the rare occupations in which judgment, trust, care, and meaning all converge.
8. The Threshold Question
We return to Choi Eun-jeong's dawn.
7 a.m. The grandmother from Room 302 has come down to the dining hall. She is wearing indoor shoes instead of slippers. Choi Eun-jeong ladles out a bowl and brings it over. Pumpkin porridge — danhobakjuk. She knows it is what the grandmother likes. Not because of diet preferences recorded in a file. Because two autumns ago, the grandmother ate a bowl of pumpkin porridge and said, "My mother used to make this."
The grandmother pats the back of Choi Eun-jeong's hand.
"Thank you, Eun-jeong."
A day the name is remembered. There are days when it is not. But whether or not the name is there, the grandmother knows that this person is caring for her. The completion of care that Noddings described happens in this dining hall, over one bowl of porridge.
₩12,000 per hour. This number is society's answer to Choi Eun-jeong's labor. Society also knows the answer is wrong. But correcting the answer is not a matter of changing the number — it is a matter of changing the value-measurement system itself. That has not happened yet.
Choi Eun-jeong is the person most faithful to care — and she is held inside the structural undervaluation of ₩12,000 per hour. As Lee Jung-hoon's 28 years of accumulated skill were classified as "legacy data," Choi Eun-jeong's seven years of relational knowledge are reduced to a single line on the long-term care benefit schedule.
Acknowledging this contradiction directly must come before any prescription.
Before offering solutions, the scale of the problem must be faced.
While AI dismantles the scarcity of cognitive labor, the scarcity of judgment, trust, care, and meaning is being revealed. Scarcity does not disappear. It moves. Reading the direction of that movement is strategy for the individual, and coordinates for society.
The next chapter's story begins with Kim Su-jin (44). From the drawer of KB Kookmin Bank's Gangnam branch, Kim Su-jin has moved to a fintech office — and there she discovers the value of human judgment that appears on no balance sheet. Its name: invisible assets.
"What in your occupation can AI not replace? Is it a question of capability — or a question of trust?"