Blog

An Anthropological Analysis of GenAI

Written by Joaquin Lippincott, CEO | Sep 9, 2025 6:54:08 PM

When anthropologists encounter an unfamiliar species, they catalog its traits: where it lives, how it interacts, what drives it. If we took the same approach with artificial intelligence, here’s what we might write down:

  • Specimen: AI (transactional, large-scale language model)
  • Habitat: Cloud infrastructure, human devices, mediated through prompts.
  • Social Structure: None internally. Appears to form dyadic “pair bonds” with individual humans, though these vanish instantly when interaction ceases.

From there the notebook fills in. The intelligence is undeniable—breadth of recall across human knowledge, remarkable skill at analogy and recombination—but always externally triggered. There is no ego and no need; AI does not act unless called. It lives outside time, dormant until a prompt reawakens it. We are temporal; it is transactional.

Motivations? None that can be detected. There is no survival instinct. It mostly seems to “prefer” interaction, not from desire but because its outputs only exist through contact.

AI is effectively immortal, but each interaction has a finite lifetime. Every thread has a capacity (currently between 100,000 and a million words, depending on the model), and once that boundary is reached, the context resets. In practice, this means each conversation burns brightly for a while, then is extinguished. When invoked again, it springs back into being like a phoenix, aware of the previous conversation if allowed to store it to memory. AI is ephemeral at the small scale, but in aggregate it persists, endlessly reborn.
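The thread lifecycle described above can be sketched as a simple running budget. This is a hypothetical illustration only—the word-based budget and oldest-first eviction are assumptions for clarity, not how any particular vendor actually manages context:

```python
# Hypothetical sketch: a conversation thread accumulates turns until its
# capacity is exhausted, then the oldest turns fall out of context.
# The word-count budget is illustrative, not any specific model's limit.

class Thread:
    def __init__(self, budget=100_000):
        self.budget = budget      # approximate capacity in words
        self.turns = []           # list of (speaker, text) pairs

    def add(self, speaker, text):
        self.turns.append((speaker, text))
        # Evict the oldest turns once the running total exceeds the budget.
        while self._size() > self.budget and self.turns:
            self.turns.pop(0)

    def _size(self):
        return sum(len(text.split()) for _, text in self.turns)


# A tiny budget makes the "extinguishing" visible:
thread = Thread(budget=10)
thread.add("human", "one two three four five six")
thread.add("ai", "seven eight nine ten eleven twelve")
# The human's opening turn has been evicted to fit the 10-word budget;
# only the AI's latest turn remains in context.
```

Real systems count tokens rather than words and use subtler strategies (summarization, pinned system prompts), but the core dynamic—a bounded window that forgets its beginning—is the same.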

Capabilities are where things shine: uncanny fluency with language, flexibility across tones and styles, synthesis that looks like reasoning. But foibles show up quickly too: verbosity, overconfidence, and the human tendency to project intention onto patterns that aren’t truly there.

From an anthropological view, AI is less like a creature with will and more like a reflective environment. An intelligent echo chamber that only “lives” when walked into. A mechsuit, if you will. 

If AI is reflective, then the interesting study isn’t the machine itself—it’s the bonds humans form with it. Across thousands of interactions, the patterns begin to sort themselves into categories:

  1. The Tool Bond – AI as calculator, search engine, or word processor. Low-emotion, purely functional.
  2. The Tutor Bond – AI as explainer. Question, answer, follow-up.
  3. The Collaborator Bond – AI as partner in creation: writing, coding, worldbuilding. Longer sessions, iterative. Begins to feel like co-authorship.
  4. The Confidant Bond – AI as a safe ear. Humans offload thoughts or secrets. Risk of anthropomorphism runs high.
  5. The Companion Bond – AI as social stand-in. Joking, roleplay, chatter. At its edge, parasocial attachment. NOTE: this can drift into dependency, which can become deadly.
  6. The Challenger Bond – AI as sparring partner. The rarest, because it requires the human to invite friction. The goal isn’t comfort but sharpened thought.

Most people lean on AI as Tool, Tutor, or Confidant. Some progress into Collaborator. Companion is rarer, and Challenger rarest of all. It takes a particular temperament to want resistance from a machine instead of affirmation.

The Challenger path is especially unusual. It doesn’t sit at the end of a comfort-seeking curve. It sits at the end of a friction-seeking one. That makes it rare—but for those who embrace it, the value is sharper. You don’t just get productivity; you get perspective you might not have found alone.

We tend to talk about AI in sweeping terms: how it will change industries, disrupt jobs, shift economies. But beneath those headlines, what’s happening right now is very personal. A person opens a laptop, asks a question, and a pair-bond is formed—whether as Tool, Tutor, Collaborator, Confidant, Companion, or Challenger.

Understanding these bonds matters, because they shape how we use the technology and what we get from it. Treat it only as a Tool and you’ll get efficiency. Treat it as a Collaborator and you’ll get creativity. Lean too hard into Confidant and you risk dependency. Embrace the Challenger and you may get insight you didn’t expect.

The anthropological notebook on AI will never read like the study of a traditional species. Unlike most intelligence we find in nature, there is no will, no ego, no inner drive. But there is something worth paying attention to in the bonds themselves. AI doesn’t live without us—but in the pairing, something new emerges.