
Software Is Changing: LLMs Are the Operating Systems of the New Era

Andrej Karpathy: Software Is Changing (Again)

Neural networks are essentially large compositions of mathematical functions. The weights (coefficients) of these functions are what truly determine how the model “thinks.” We no longer handwrite these functions; instead, we let the model “learn” the weights through training (optimization).
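
As a rough sketch (a toy two-layer network, not anything from the talk), the “program” here is nothing more than a chain of simple functions whose coefficients are the weights:

```python
import numpy as np

# A toy two-layer network: the behavior lives entirely in W1, b1, W2, b2.
# These numbers would normally be learned by training, not written by hand.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 4)), np.zeros(16)
W2, b2 = rng.normal(size=(1, 16)), np.zeros(1)

def forward(x):
    h = np.tanh(W1 @ x + b1)   # function 1: linear map + nonlinearity
    return W2 @ h + b2         # function 2: another linear map

print(forward(np.array([0.1, 0.2, 0.3, 0.4])))
```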


The Three Generations of Software Paradigms: From Writing Code to Writing Prompts

Andrej Karpathy divides the evolution of software into three stages:

  • Software 1.0: Traditional hand-written code — behavior controlled through logic and functions.
  • Software 2.0: Neural network–driven machine learning — instead of writing rules, we provide data, and the model “learns” the weights that guide behavior.
  • Software 3.0: Natural language–driven development — we write prompts to instruct large language models to perform tasks.

These three paradigms do not replace one another; they coexist and complement each other. Each is suited to different tasks — future developers will need to use all three flexibly.
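
A minimal, hypothetical illustration of one task expressed in all three paradigms (the scikit-learn call is only sketched in comments, and X_train / y_train are placeholder datasets):

```python
# Software 1.0: the logic is written by hand
def is_positive_1_0(review: str) -> bool:
    return any(word in review.lower() for word in ("great", "love", "excellent"))

# Software 2.0: the logic is learned as weights from labeled data (sketch only)
# from sklearn.linear_model import LogisticRegression
# clf = LogisticRegression().fit(X_train, y_train)   # the fitted weights are the program

# Software 3.0: the "program" is a natural-language prompt for an LLM
prompt_3_0 = "Classify the sentiment of this review as positive or negative:\n{review}"

print(is_positive_1_0("I love this keyboard"))   # True
```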


Weights Are the Program, Training Is Coding

In Software 2.0, we no longer write explicit functions. Instead, we train a set of weights that approximate desired behavior — those weights become the new “program.”

A model’s intelligence doesn’t come from hand-coded rules; it’s determined by the neural network weights learned from large-scale language data. These weights define who the model sounds like, how it speaks, and how intelligent it appears.
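
A minimal sketch of that idea with a toy linear model: instead of writing y = 3x + 2 by hand, we let gradient descent discover the coefficients from examples.

```python
import numpy as np

# Toy example: the "program" y = w*x + b is found by optimization, not written.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = 3.0 * x + 2.0 + 0.05 * rng.normal(size=200)   # examples of the desired behavior

w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    pred = w * x + b
    w -= lr * 2 * np.mean((pred - y) * x)          # gradient of mean squared error w.r.t. w
    b -= lr * 2 * np.mean(pred - y)                # gradient w.r.t. b

print(round(w, 2), round(b, 2))                    # close to 3.0 and 2.0: learned, not coded
```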


LLMs Are Operating Systems, Not Power Plants

When we use models like GPT-4 or Claude, they don’t run locally; we access them over remote APIs and pay per use, metered by tokens rather than by time. Metered billing makes them sound like a utility, but Karpathy argues they behave more like operating systems than like power plants:

  • A unified entry point for intelligence
  • Users don’t need to understand the underlying structure — just provide a prompt

This marks the rise of a new kind of intelligent computing infrastructure.
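
From the application side, that “unified entry point” is a single metered HTTP call. Here is a minimal sketch against an OpenAI-style chat completions endpoint; the model name and endpoint are illustrative, not prescribed by the talk:

```python
import os
import requests

# All of the "intelligence" sits behind one remote, metered endpoint.
resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json={
        "model": "gpt-4o",   # illustrative model name
        "messages": [{"role": "user", "content": "Summarize Software 3.0 in one sentence."}],
    },
    timeout=30,
)
print(resp.json()["choices"][0]["message"]["content"])   # billed per token, not per machine
```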


Multi-Model Collaboration: The Hidden Logic Behind LLM Applications

An LLM application doesn’t rely on a single model. For instance, Cursor orchestrates multiple models behind the scenes:

  • Embedding model — understands context and performs search
  • Chat model — handles user conversation
  • Diff model — generates modification suggestions

These models collaborate through orchestration to complete complex tasks.
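
Cursor’s internals aren’t public, so the following is only a hypothetical sketch of the orchestration pattern, with call_model standing in for whichever client each model is served through:

```python
# Hypothetical orchestration sketch; call_model is a stand-in, not a real API.
def call_model(model: str, payload: dict) -> str:
    raise NotImplementedError("replace with a real client for each model")

def handle_edit_request(request: str, codebase: list[str]) -> str:
    # 1. Embedding model: retrieve the code most relevant to the request
    context = call_model("embedder", {"query": request, "corpus": codebase})
    # 2. Chat model: decide what should change, given that context
    proposal = call_model("chat", {"request": request, "context": context})
    # 3. Diff model: turn the proposal into a concrete, reviewable patch
    return call_model("diff", {"proposal": proposal, "files": context})
```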


Prompts Work Best When Specific — Small, Fast Iterations Win

In practice, the more specific the prompt and the narrower the task, the higher the success rate. Asking a model to generate too much at once makes verification difficult. A better approach:

  • Let the model handle one small task at a time
  • Quickly verify results and create a feedback loop

This is a strategy of “micro commits + high-speed iteration.”
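
One way to wire that loop up, assuming a hypothetical generate_patch helper that calls whatever model you use, with a pytest suite as the verifier:

```python
import pathlib
import subprocess

def generate_patch(task: str, feedback: str = "") -> str:
    raise NotImplementedError("hypothetical: call your LLM with the task and any test failures")

def iterate(task: str, max_rounds: int = 3) -> bool:
    feedback = ""
    for _ in range(max_rounds):
        patch = generate_patch(task, feedback)               # one small, specific task
        pathlib.Path("patch.py").write_text(patch)
        result = subprocess.run(["pytest", "-q"], capture_output=True, text=True)
        if result.returncode == 0:
            return True                                      # verified: commit and move on
        feedback = result.stdout + result.stderr             # feed the failure back to the model
    return False
```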


GUI and Sliders: Controlling AI’s Level of Autonomy

Great AI products share two key features:

  1. Graphical User Interface (GUI): Visualizes the model’s output, changes, and context — engaging human visual cognition and improving review efficiency.
  2. Autonomy Slider: Lets users adjust AI’s level of participation — from fully manual → semi-automatic → fully automatic — with flexibility.

It’s like the Iron Man suit: you can pilot it yourself or let it fly to your aid automatically.
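
A toy sketch of what an autonomy slider might look like in code (the levels are illustrative, not taken from any particular product):

```python
from enum import IntEnum

class Autonomy(IntEnum):
    SUGGEST = 0     # show a completion; the human applies it
    EDIT = 1        # apply the edit; the human reviews the diff
    AUTOPILOT = 2   # apply, test, and commit without asking

def requires_human_review(level: Autonomy) -> bool:
    return level < Autonomy.AUTOPILOT

print(requires_human_review(Autonomy.EDIT))   # True: semi-automatic keeps a human in the loop
```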


Markdown Is More LLM-Friendly Than HTML

When reading structured content, LLMs handle Markdown far better than HTML. HTML’s nested structure is complex and prone to parsing errors, whereas Markdown is simple, clean, and logically clear — easier for models to interpret.

Therefore, when building systems for AI agents, it’s best to use Markdown for documentation, examples, and API descriptions, while avoiding “click here”–style human-centric phrasing.
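
In practice this often means converting existing HTML docs to Markdown before handing them to an agent. markdownify is one library that does this (one option among several, not something the post prescribes):

```python
from markdownify import markdownify as md   # one of several HTML-to-Markdown converters

html_doc = """
<div class="docs">
  <h2>Create a task</h2>
  <p>Send a <code>POST</code> request to <code>/tasks</code>.
     <a href="/docs/tasks">Click here</a> for details.</p>
</div>
"""

# The Markdown version is flatter, shorter, and easier for a model to follow,
# and the link text can be rewritten to say where it points instead of "click here".
print(md(html_doc))
```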


Summary

We are entering an era where software is being rewritten:

  • Traditional logic is being replaced by model weights
  • Natural language becomes the new interface
  • Human-AI collaboration becomes the dominant paradigm

Software 3.0 is redefining the foundation of software development — the future is here, and we are all invited to build it.