An Analogy for Understanding What it Means for Generative AI to be “Open”

I’ve found this to be a useful analogy for understanding what it means for generative AI to be “open”:

  • Operating systems can be proprietary or open source.
  • Device drivers and other kernel modules that change the behavior of the operating system can be proprietary or open source.
  • Software programs that run on an operating system can be proprietary or open source.
  • Foundation models are like operating systems.
  • Fine-tuning a foundation model is like writing a device driver or other kernel module.
  • Prompts are like the software programs that run on the operating system.
  • Foundation models can be proprietary or open source.
  • The modified model weights that result from the fine-tuning process can be proprietary or open source.
  • The specific wording of prompts – their “source code” – can be proprietary or open source.

When people talk about whether generative AI should be “open,” they could be talking about whether the foundation models should be open, whether the modified model weights that result from fine-tuning should be open, and/or whether the prompts (which include templates, embeddings, etc.) should be open.
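
To make the three layers concrete, here is a minimal sketch using the Hugging Face transformers and peft libraries, which is one common way to load a base model, apply fine-tuned adapter weights, and run a prompt. The model and adapter names are hypothetical placeholders; the point is only that the three artifacts are separate, and the licensing of each one can be decided independently.

```python
# A minimal sketch of the three layers. The model and adapter IDs below are
# hypothetical placeholders; each artifact can independently be proprietary or open.

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

# Layer 1: the foundation model -- the "operating system."
# Whether these base weights are openly licensed is one question.
base_model_id = "example-org/base-7b"  # hypothetical model ID
tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForCausalLM.from_pretrained(base_model_id)

# Layer 2: the fine-tuned weights -- the "device driver or kernel module."
# A LoRA adapter modifies the base model's behavior; whether these adapter
# weights are open is a separate question from the base model's license.
adapter_id = "example-org/domain-adapter"  # hypothetical adapter ID
model = PeftModel.from_pretrained(base_model, adapter_id)

# Layer 3: the prompt -- the "software program."
# The specific wording below is the prompt's "source code," and whether it is
# shared openly is yet another independent question.
prompt = "Summarize the main arguments for and against open model weights."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```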