Ch

Project’s GitHub Repo

About

A year ago, I created Cha (you can read my original blog post about it), a Python CLI tool for interfacing with OpenAI’s models. Since then, I’ve watched the AI landscape evolve dramatically. New players have entered the market, offering compelling alternatives to OpenAI. This led me to re-imagine what Cha could be. The result is Ch, an experimental Go implementation that embodies everything I’ve learned about building developer tools.

Ch isn’t just a port of Cha. Though still in its early stages, it focuses on what matters most to me and other developers: speed, efficiency, and support for multiple AI platforms. While keeping the core philosophy of simplicity and terminal-first interaction that made Cha useful, Ch delivers a 2.55x performance improvement over its Python predecessor. That means less time waiting and more time actually solving problems.

The Evolution

The AI landscape has changed significantly since I first released Cha. I’ve seen several major shifts that influenced how I approached building Ch:

  • The rise of new AI providers has been incredible. Companies like Groq have pushed the boundaries of inference speed. Anthropic’s Claude has shown impressive reasoning capabilities. DeepSeek and others have brought fresh approaches to language models. This diversification meant that being tied to just OpenAI wasn’t enough anymore.

  • Response speed has become increasingly critical. As AI tools become part of our daily workflow, those extra seconds waiting for responses add up. This was one of my main motivations for rewriting in Go. The performance gains aren’t just numbers on a benchmark. They translate to a noticeably smoother experience when you’re deep in a coding session.

  • The need for platform flexibility has grown too. Different models excel at different tasks, and having the freedom to switch between them easily is valuable. I wanted Ch to make this seamless, so you can focus on your work rather than managing API endpoints.

Key Features

Multi-platform Support: Ch works seamlessly with OpenAI, Groq, DeepSeek, Anthropic, and xAI. I’ve made switching between platforms as simple as possible because I believe having options makes the tool more valuable.
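
To give a sense of how this kind of switching can work under the hood, here is a minimal sketch. It is not Ch’s actual code: the struct shape and default model names are illustrative assumptions, and the base URLs are simply each provider’s publicly documented API root. (Anthropic’s native API is not OpenAI-compatible, so a real client would special-case it.)

```go
package main

import (
	"fmt"
	"strings"
)

// Provider holds the minimal settings needed to talk to a
// chat-completions endpoint. This shape is an assumption for
// illustration, not Ch's internal representation.
type Provider struct {
	BaseURL      string // documented API root for the provider
	EnvKey       string // environment variable holding the API key
	DefaultModel string // illustrative default, not Ch's choice
}

// providers maps a short name to its configuration.
var providers = map[string]Provider{
	"openai":    {"https://api.openai.com/v1", "OPENAI_API_KEY", "gpt-4o"},
	"groq":      {"https://api.groq.com/openai/v1", "GROQ_API_KEY", "llama-3.1-70b-versatile"},
	"deepseek":  {"https://api.deepseek.com", "DEEPSEEK_API_KEY", "deepseek-chat"},
	"anthropic": {"https://api.anthropic.com/v1", "ANTHROPIC_API_KEY", "claude-3-5-sonnet-latest"},
	"xai":       {"https://api.x.ai/v1", "XAI_API_KEY", "grok-2-latest"},
}

// Lookup resolves a case-insensitive provider name.
func Lookup(name string) (Provider, error) {
	p, ok := providers[strings.ToLower(name)]
	if !ok {
		return Provider{}, fmt.Errorf("unknown provider %q", name)
	}
	return p, nil
}

func main() {
	p, err := Lookup("Groq")
	if err != nil {
		panic(err)
	}
	fmt.Println(p.BaseURL)
}
```

The appeal of a table like this is that switching providers becomes a one-word change rather than a code change, which is exactly the kind of flexibility described above.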

Blazing Fast Performance: The 2.55x speed improvement over Cha isn’t just marketing. It’s the result of careful optimization and Go’s excellent performance characteristics. Every interaction feels snappier, which makes a real difference when you’re using it throughout your day.

Interactive & Direct Modes: Sometimes you want a quick answer; other times you need an extended conversation. Ch supports both workflows naturally. You can fire off quick queries or engage in detailed technical discussions.

Web Search Integration: I’ve integrated SearXNG with IEEE citation format. This means when Ch pulls in web content to answer your questions, you get properly cited, research-grade responses. It’s particularly useful when you need up-to-date information or want to verify claims.
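
As a rough sketch of what this integration involves (not Ch’s actual code): SearXNG instances can expose a JSON API at `/search` when the `json` format is enabled, and results can then be rendered with an IEEE-style citation line. The instance URL and the citation helper below are assumptions for illustration.

```go
package main

import (
	"fmt"
	"net/url"
)

// searchURL builds a SearXNG JSON-API query for a given instance.
// SearXNG returns JSON when format=json is enabled on the instance.
func searchURL(instance, query string) string {
	v := url.Values{}
	v.Set("q", query)
	v.Set("format", "json")
	return instance + "/search?" + v.Encode()
}

// ieeeCitation formats a web result in a minimal IEEE-like style:
// [n] "Title." [Online]. Available: URL
// This is a simplified approximation of the format, not Ch's formatter.
func ieeeCitation(n int, title, link string) string {
	return fmt.Sprintf("[%d] %q [Online]. Available: %s", n, title+".", link)
}

func main() {
	fmt.Println(searchURL("https://searx.example.org", "go generics"))
	fmt.Println(ieeeCitation(1, "The Go Programming Language", "https://go.dev"))
}
```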

Smart File Handling: Loading files into your chat context is something I use constantly, so I made it better. The multi-select functionality makes it easy to include exactly what you need in your conversation.

Professional Tools: Whether you’re exporting conversations for documentation, using your preferred text editor for complex prompts, or switching between AI models, Ch makes it straightforward. These aren’t just features I thought would be nice. They’re tools I use every day in my own work.

Chat History Management: Being able to backtrack through conversation history has saved me countless times when I need to reference earlier parts of a discussion or export chats for future reference.

Why Go?

The decision to rewrite Cha in Go wasn’t just about performance. After a year of maintaining Cha, I had a clear picture of what worked and what could be better. Go’s strong typing caught errors earlier in development. Its excellent concurrency support made handling multiple API calls smoother. The fast execution made every interaction feel more responsive.
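The concurrency point is worth a concrete illustration. This is a standard-library sketch of fanning out requests and collecting results, with a stand-in function in place of real network calls; none of it is Ch’s actual code.

```go
package main

import (
	"fmt"
	"sync"
)

// callModel stands in for an API request; a real client would perform
// an HTTP call here and return the completion text.
func callModel(name string) string {
	return "response from " + name
}

// fanOut queries several providers concurrently and returns their
// results keyed by provider name. A mutex guards the shared map while
// a WaitGroup waits for every goroutine to finish.
func fanOut(names []string) map[string]string {
	var (
		mu      sync.Mutex
		wg      sync.WaitGroup
		results = make(map[string]string, len(names))
	)
	for _, n := range names {
		wg.Add(1)
		go func(n string) {
			defer wg.Done()
			r := callModel(n) // network I/O would overlap here
			mu.Lock()
			results[n] = r
			mu.Unlock()
		}(n)
	}
	wg.Wait()
	return results
}

func main() {
	res := fanOut([]string{"openai", "groq"})
	fmt.Println(res["groq"])
}
```

With real HTTP calls, the total wall-clock time approaches that of the slowest single request rather than the sum of all of them, which is where the responsiveness gain comes from.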

But perhaps most importantly, Go helped me build a more robust and maintainable tool. The code is cleaner, the error handling is more reliable, and the overall architecture is more solid. These improvements might not be immediately visible to users, but they make Ch more reliable and easier to extend with new features.

The Power of AI-Assisted Development

What truly amazed me about building Ch wasn’t just the performance improvements or the new features. It was how I built it. Using tools like Claude Code CLI and Gemini CLI, combined with the Cursor IDE, I was able to develop this MVP in less than a day. This experience completely changed my perspective on what’s possible in software development.

This rapid development cycle wasn’t about cutting corners. Instead, it demonstrated how AI tools are transforming the way we can approach software projects. What might have taken weeks of planning, coding, and debugging was condensed into hours of focused development. This isn’t just about writing code faster; it’s about being able to experiment, iterate, and innovate at a pace that wasn’t possible before.

Looking Forward

While Ch currently implements most of Cha’s core features, it’s very much an experimental project. I’m excited about its potential, but there’s still work to be done. The performance improvements and multi-platform support position Ch to grow alongside the rapidly evolving AI landscape.

I use Ch daily, just as I did with Cha, but now with the satisfaction of knowing it’s faster. For those interested in trying it out, check out the project’s GitHub repository linked at the top of this post. The installation process is straightforward, especially if you’re familiar with Go tools.

The journey from Cha to Ch has been about more than just rewriting a tool in a faster language. It’s been about taking everything I learned from building and using Cha, and creating something that better serves the needs of developers in today’s AI landscape. I’m excited to see how people use Ch and how it can evolve to meet future needs.