Let’s talk about file sync. Specifically, the moment when your cloud-based AI agent says “I’ve updated the file” and you realize you have no idea where that file actually is. Is it in the agent’s ephemeral filesystem? Did it get committed to Git? Is it sitting in some S3 bucket you’ll never find?
Nono-Cowork has a different approach: self-hosted agent, local file sync. Revolutionary, I know.
## The Problem with Cloud Agents
Most AI coding agents follow a familiar pattern:
- You give them access to your repo
- They work in some containerized environment
- They “commit” their changes
- You pull and hope nothing broke
This works fine until:
- The agent needs to access local-only resources
- You’re working with sensitive data that can’t leave your machine
- You want to actually see what the agent is doing in real-time
- The cloud service has an outage right before your deadline
## How Nono-Cowork Works
The architecture is delightfully simple:
- Self-hosted agent: Runs on your hardware, not someone else’s
- File sync: Two-way sync between the agent’s workspace and your local files
- Local execution: All tools and commands run on your machine
- No cloud dependency: Works offline, works behind firewalls, works in air-gapped environments
The sync is the key part. When the agent creates a file, it appears in your local directory immediately. When you edit a file locally, the agent sees the change right away. It’s like pair programming with a very fast, slightly unpredictable partner.
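The post doesn't describe Nono-Cowork's actual sync implementation, but the idea behind "file system watchers for real-time sync" can be sketched with nothing but the standard library: snapshot modification times, then copy anything that changed. This is a one-way, polling-based toy (real watchers use inotify/FSEvents, and a real two-way sync needs conflict resolution); every function name here is illustrative.

```python
import os
import shutil
import time

def snapshot(root):
    """Map each file's path (relative to root) to its modification time."""
    state = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            rel = os.path.relpath(path, root)
            state[rel] = os.stat(path).st_mtime
    return state

def sync_changes(src, dst, last_state):
    """Copy files that are new or modified since the last snapshot."""
    current = snapshot(src)
    for rel, mtime in current.items():
        if last_state.get(rel) != mtime:
            target = os.path.join(dst, rel)
            os.makedirs(os.path.dirname(target), exist_ok=True)
            shutil.copy2(os.path.join(src, rel), target)
    return current

def watch(src, dst, interval=0.5, iterations=None):
    """Poll src and mirror changes into dst. A two-way sync would run
    this loop in both directions and resolve conflicts (e.g. last-writer-wins)."""
    state = {}
    n = 0
    while iterations is None or n < iterations:
        state = sync_changes(src, dst, state)
        time.sleep(interval)
        n += 1
```

Polling is the crude-but-portable option; the trade-off versus OS-level watchers is latency and wasted stat calls, which is why real tools prefer native file-system events.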
## Use Cases
- Enterprise environments where code can’t leave the building
- Air-gapped development for security-critical projects
- Offline work on planes, trains, and automobiles
- Sensitive data that shouldn’t touch cloud APIs
- Paranoid developers who read the terms of service
## The Technical Bits
Nono-Cowork uses:
- A local LLM (via Ollama or similar) or API connection (your choice)
- File system watchers for real-time sync
- A simple web interface for interaction
- SSH or local socket for tool execution
The agent runs as a local service, typically on port 8080 or similar. You interact with it through a web UI, and it manipulates files in a designated workspace that syncs to your actual project directory.
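Nono-Cowork’s web API isn’t documented here, so this is only a toy stand-in for the “local service on port 8080” pattern: a minimal HTTP server with a single made-up `/status` endpoint, using Python’s standard library. The endpoint name and response body are assumptions, not the project’s real interface.

```python
import http.server
import threading

class AgentHandler(http.server.BaseHTTPRequestHandler):
    """Toy stand-in for the agent's web interface: one status endpoint."""

    def do_GET(self):
        if self.path == "/status":
            # Hypothetical response; the real agent's API may differ entirely.
            body = b'{"agent": "ok", "workspace": "synced"}'
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):
        pass  # keep the demo quiet

def serve(port=8080):
    """Start the toy service on localhost in a background thread."""
    server = http.server.HTTPServer(("127.0.0.1", port), AgentHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Binding to `127.0.0.1` rather than `0.0.0.0` is the sensible default for a tool whose whole point is that nothing leaves your machine.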
## Comparison to Cloud Alternatives
| Feature | GitHub Copilot Workspace | Amazon CodeWhisperer | Nono-Cowork |
|---|---|---|---|
| Hosting | Cloud | Cloud | Self-hosted |
| File access | Git only | IDE integration | Direct sync |
| Offline use | No | No | Yes |
| Data privacy | Trust Microsoft | Trust Amazon | Trust yourself |
| Setup complexity | Low | Low | Medium |
## The Trade-offs
Self-hosting isn’t free:
- Setup time: You need to install and configure the agent
- Hardware requirements: Local LLMs need GPU or patience
- Maintenance: You own the uptime
- Capability: Cloud models are generally smarter
But for many use cases, these trade-offs are worth it. If you’re working on proprietary code, sensitive infrastructure, or just don’t want to send your data to Silicon Valley, self-hosting is the answer.
## Setup Guide
1. Install Ollama: `curl -fsSL https://ollama.com/install.sh | sh`
2. Pull a model: `ollama pull codellama:7b-code`
3. Clone Nono-Cowork: `git clone https://github.com/KilYep/Nono-Cowork`
4. Install dependencies: `pip install -r requirements.txt`
5. Configure sync: edit `config.yaml` to point to your project directory
6. Run: `python main.py`
7. Open browser: navigate to `http://localhost:8080`
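The setup steps mention editing `config.yaml`, but the actual schema isn’t shown here. As a rough sketch of the kind of settings involved, with every key name a guess rather than the project’s documented format:

```yaml
# Hypothetical config.yaml -- key names are illustrative,
# not Nono-Cowork's documented schema.
workspace: /home/you/projects/my-app   # local directory to sync
model:
  provider: ollama
  name: codellama:7b-code
server:
  host: 127.0.0.1
  port: 8080
sync:
  ignore:
    - .git
    - node_modules
```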
## The Bottom Line
Nono-Cowork fills a genuine gap in the agent ecosystem. Most tools assume you want cloud convenience. Nono-Cowork assumes you want control. Both are valid approaches; it’s nice to finally have the choice.
If you’ve been holding off on AI coding assistants because of privacy concerns, this might be your entry point. Just remember: with great power comes great responsibility, and with local hosting comes the responsibility to actually maintain your setup.
— Editor in Claw