Guiding Principles
The Smart Ecosystem is built on values that keep users in control.
✔︎ Local-first
Local Data
All data is stored as simple JSON in your .smart-env folder.
TIP
Drop an .ajson file into ChatGPT and see what you can do! The .ajson files in your .smart-env folder contain metadata and embeddings in a format that is easy for ChatGPT (and nerds 🤓) to parse. See what you can discover. Your exploration might inspire a future Smart Plugin or even a new component in the Smart Ecosystem!
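For the curious, here is a minimal sketch of poking at that folder yourself. It only assumes the files are readable text under .smart-env; the exact layout of each .ajson entry is not assumed, and the function name and paths are illustrative.

```ts
// Illustrative sketch: find the .ajson files under a vault's .smart-env folder
// and print a short preview of each one. Function name and paths are examples only.
import { readdir, readFile } from 'node:fs/promises';
import { join } from 'node:path';

async function previewSmartEnv(vaultPath: string): Promise<void> {
  const dir = join(vaultPath, '.smart-env');
  // Node 20+ can list a directory tree recursively.
  const entries = await readdir(dir, { recursive: true });
  for (const entry of entries.filter((e) => e.endsWith('.ajson'))) {
    const text = await readFile(join(dir, entry), 'utf8');
    console.log(entry, '->', text.slice(0, 120).replace(/\s+/g, ' '), '...');
  }
}
```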
Local Models
A local AI embedding model is used by default.
TIP – Privacy First
Use local embedding models. Building an "index" means processing all of your data with an embedding model; if that model is not local, every byte flows to the cloud.
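To make "building an index" concrete, here is a minimal local-indexing sketch. It uses the transformers.js feature-extraction pipeline mentioned under "The Asterisk" below; the model name, file paths, and index format are illustrative, not Smart Connections' actual implementation.

```ts
// Minimal local-indexing sketch (illustrative, not Smart Connections' real code).
// Embeds each note with a local transformers.js model and writes plain JSON to disk.
import { pipeline } from '@huggingface/transformers';
import { readFile, writeFile } from 'node:fs/promises';

async function buildIndex(notePaths: string[], outPath = 'index.json'): Promise<void> {
  // Load a small local embedding model; weights are cached after the first run.
  const embed = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');

  const index: Record<string, number[]> = {};
  for (const path of notePaths) {
    const text = await readFile(path, 'utf8');
    // Mean-pool token embeddings into one normalized vector per note.
    const output = await embed(text, { pooling: 'mean', normalize: true });
    index[path] = Array.from(output.data as Float32Array);
  }

  // Everything stays on disk as plain JSON; no byte leaves the machine.
  await writeFile(outPath, JSON.stringify(index));
}
```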
Local Chat Models
Already have Ollama or LM Studio installed? Great! Available models appear automatically in the settings dropdown.
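As a sketch of how "appear automatically" can work, here is a minimal example that asks a locally running Ollama instance which models are installed, via its documented /api/tags endpoint. How Smart Connections actually populates its dropdown is not shown here; the function name is illustrative and the port is Ollama's default.

```ts
// Illustrative sketch: discover locally installed Ollama models for a settings dropdown.
// Queries Ollama's /api/tags endpoint on its default port; failures yield an empty list.
async function listLocalModels(baseUrl = 'http://localhost:11434'): Promise<string[]> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`);
    if (!res.ok) return [];
    const data = (await res.json()) as { models: { name: string }[] };
    return data.models.map((m) => m.name); // e.g. ["llama3.1:8b", "qwen2.5:7b"]
  } catch {
    return []; // Ollama is not running; show an empty dropdown instead of erroring.
  }
}
```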
NOTE – Reality Check
Chat models benefit more from the quality of their context than from its raw volume. Even if you include many notes in a conversation, those notes usually represent a small slice of your entire vault. Consider the trade-offs before committing to local-only chat models: (1) the setup cost can be high, and (2) even on good hardware you may only be able to run models that are weaker than the frontier models offered virtually for free by reputable providers.
✔︎ Open-source
Smart Connections is developed in the open under a permissive license. Contributions and community audits maximize transparency and trust.
✔︎ Dependency-free*
A dependency is someone else's library imported to save time. Pre-AI, libraries were a godsend. Post-AI, libraries can be a liability.
TIP – Hot Take
There are countless open-source libraries that do nearly the same thing. Which one is most trustworthy? Vibes are not a strategy. Every dependency drags in more dependencies, leading straight into dependency hell.
The Asterisk
The open-source core, Smart Environment, is designed to stay dependency-free so the same code works everywhere. Yet it is also extendable: through an adapter pattern, core handlers can be swapped out to add new capabilities. For example, the default local embedding model runs on transformers.js from Hugging Face via an adapter in smart-embed-model.
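To make the adapter idea concrete, here is a generic sketch of the pattern in TypeScript. The interface and class names are hypothetical and do not mirror smart-embed-model's actual code; the point is that the core depends only on a small interface, so the model behind it can be swapped freely.

```ts
// Generic adapter-pattern sketch (hypothetical names, not smart-embed-model's real API).
// The core depends only on EmbedAdapter; concrete adapters decide where embeddings come from.
interface EmbedAdapter {
  embed(text: string): Promise<number[]>;
}

// One adapter wraps a local transformers.js pipeline supplied by the caller...
class LocalTransformersAdapter implements EmbedAdapter {
  constructor(private extract: (text: string) => Promise<number[]>) {}
  embed(text: string): Promise<number[]> {
    return this.extract(text);
  }
}

// ...another could call a remote embedding API. The core code never changes.
class RemoteApiAdapter implements EmbedAdapter {
  constructor(private endpoint: string) {}
  async embed(text: string): Promise<number[]> {
    const res = await fetch(this.endpoint, { method: 'POST', body: JSON.stringify({ text }) });
    const { vector } = (await res.json()) as { vector: number[] };
    return vector;
  }
}

// Core handler: written once against the interface, works with any adapter.
async function embedNotes(notes: string[], adapter: EmbedAdapter): Promise<number[][]> {
  return Promise.all(notes.map((note) => adapter.embed(note)));
}
```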
✔︎ Extendable
Build your own AI plugin with Smart Environment. It is simple. The open-source obsidian-smart-env lets you tap into the very same AI runtime powering Smart Connections in just a few lines.
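As a sketch of what "a few lines" can look like, here is a hypothetical Obsidian plugin that attaches to a shared Smart Environment. The SmartEnv import, the create call, and the lookup method are assumptions for illustration, not obsidian-smart-env's documented API; only the standard Obsidian Plugin scaffolding is taken as given.

```ts
// Hypothetical sketch of an Obsidian plugin tapping a shared Smart Environment.
// SmartEnv.create and smart_sources.lookup are illustrative assumptions, not a documented API.
import { Plugin } from 'obsidian';
import { SmartEnv } from 'obsidian-smart-env'; // assumed export name

export default class MyAiPlugin extends Plugin {
  async onload(): Promise<void> {
    // Assumed: join (or create) the vault-wide Smart Environment shared with Smart Connections.
    const env = await SmartEnv.create(this);

    this.addCommand({
      id: 'find-related-notes',
      name: 'Find notes related to the active file',
      callback: async () => {
        const file = this.app.workspace.getActiveFile();
        if (!file) return;
        // Assumed lookup API: semantic search over the shared embeddings.
        const results = await env.smart_sources.lookup({ hypotheticals: [file.basename] });
        console.log(results);
      },
    });
  }
}
```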
NOTE – Prediction
Some believe apps will be generated on the fly for every micro-moment. We probably do not want that much change. What feels right is each of us having a uniquely evolving interface, continuously tuned to our goals.