Getting Started with Distributed Knowledge
This guide will help you set up and run your own instance of Distributed Knowledge, connecting to the network and leveraging collective intelligence for your AI applications.
Prerequisites
Before getting started, make sure you have the following installed:
- Ollama with the `nomic-embed-text` model: required for local RAG vector embeddings
- Access to LLM providers: you'll need API access to at least one of:
  - Anthropic (Claude)
  - OpenAI (GPT models)
  - Ollama (for local LLM hosting)
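If Ollama is already installed, the embedding model used for local RAG can be pulled ahead of time. A minimal sketch (the guard makes it safe to run even on a machine where Ollama is absent):

```shell
# Pull the embedding model used for local RAG vector embeddings.
# Assumes the `ollama` CLI is on PATH; prints a hint otherwise.
MODEL="nomic-embed-text"
if command -v ollama >/dev/null 2>&1; then
  ollama pull "$MODEL"
else
  echo "ollama not found on PATH; install it first"
fi
```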
Installation
Prerequisites
- `curl` or `wget` installed for downloading files.
- `ollama` CLI installed (for managing local LLM models and generating the local RAG).
Visit distributedknowledge.org/downloads, download the DK installer, and run it to start the installation process.
That's it! You now have DK embedded in your favorite LLM!
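A quick sanity check confirms the installed binary landed on your PATH. This helper is a sketch: the binary name `dk` is an assumption for illustration, so substitute whatever name the installer actually reports.

```shell
# check_tool: report whether a command-line tool is available on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found at $(command -v "$1")"
  else
    echo "$1: missing"
  fi
}

# `dk` is an assumed binary name for the installed tool.
check_tool dk
```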
Troubleshooting
- Ensure `curl` or `wget` is installed for downloads.
- Install `ollama` if you plan to use local LLM models.
- Verify your User ID is unique and correctly entered.
- Provide valid API keys for OpenAI or Anthropic if required.
- Run the script with sufficient permissions to write to `/usr/local/bin`.
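The permissions issue in the last item can be checked directly before re-running the installer. A minimal sketch:

```shell
# diagnose_dir: report whether the current user can write to a directory.
# The installer needs write access to /usr/local/bin.
diagnose_dir() {
  if [ -w "$1" ]; then
    echo "$1 is writable"
  else
    echo "$1 is not writable; re-run the installer with sudo"
  fi
}

diagnose_dir /usr/local/bin
```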
Next Steps
- Learn about Architecture Concepts
- Explore Advanced Configuration
- Check out our Tutorials for common use cases
- Join the community on GitHub