In this article, we will explore how to use Continue.dev, Ollama, Codestral, and Starcoder to create a powerful AI assistant for code writing and autocompletion, all running locally.

This setup ensures data privacy while leveraging advanced language models to optimize your development workflow.

🤖 What is a Code Agent and AI Autocompletion?

A code agent is an AI-powered assistant designed to help developers understand, write, and debug code. AI autocompletion goes beyond basic suggestions by offering intelligent, context-aware solutions.

This approach can significantly improve productivity, reduce errors, and accelerate project timelines.

🌟 Overview of Continue.dev, Ollama, Codestral, and Starcoder

  • Continue.dev: A Visual Studio Code extension that integrates interactive chat and code suggestions. It allows developers to interact with their codebase naturally and supports multiple AI models.

  • Ollama: A platform enabling large language models to run directly on your machine, ensuring better data privacy and independence. Ollama acts as the backbone for hosting models locally.

  • Codestral: A 22B-parameter model that specializes in generating precise, highly contextual code snippets. Although slightly slower due to its size, it is ideal for tasks requiring accuracy.

  • Starcoder: Known for its speed, Starcoder (3b model) is optimized for autocompletion tasks, offering lightweight and efficient performance for real-time coding.

By combining these tools, you can tailor your development workflow to balance speed and accuracy depending on the task.

🚀 Setting Up Continue.dev with Ollama

To connect Continue.dev to Ollama, we will modify its JSON configuration file. Follow these simple steps:

  1. Install Ollama:
    Download and install Ollama by following this guide.

  2. Install Models:
    Download the Ollama models using the command line:

     ollama pull codestral:22b
     ollama pull starcoder2:3b
  3. Configure Continue.dev:
    Add or update Continue's configuration file (config.json in the ~/.continue folder) to use Ollama as the default provider:

    {
        "models": [
            {
                "title": "Codestral",
                "provider": "ollama",
                "model": "codestral:22b"
            }
        ],
        "tabAutocompleteModel": {
            "title": "StarCoder",
            "model": "starcoder2:3b",
            "provider": "ollama"
        }
    }
  4. Start Ollama:
    Run the Ollama server:

     ollama serve
  5. Restart Visual Studio Code:
    Activate the Continue.dev extension and ensure both models are properly detected (a quick sanity check is shown below).
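
Before going back to the editor, it is worth confirming that the server is up and that both models are available locally. Two quick checks from the terminal (11434 is Ollama's default port; the curl call should answer that Ollama is running):

    # List the models pulled locally
    ollama list

    # The server listens on port 11434 by default
    curl http://localhost:11434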

πŸ§‘β€πŸ’» Using the Continue.dev Extension

Interacting with the model through Continue.dev is intuitive. It allows developers to ask questions, request code suggestions, or review entire files and codebases with ease. Here’s a breakdown of its key features:

Codestral: Code Generation with Chat

Codestral provides two primary modes designed to enhance your development experience:

πŸ“ Edit the Selection

Use natural language prompts to edit a specific code snippet directly with cmd+K/Ctrl+K. Whether you need to refactor, optimize, translate, or debug code, Codestral seamlessly adapts to your requirements.

Example in Action: Code refactoring driven by a natural language prompt:

Using Codestral Edit
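
As a concrete illustration, imagine highlighting the (purely hypothetical) function below, pressing cmd+K/Ctrl+K, and typing a prompt such as "rewrite this with reduce and add explicit types":

    // Hypothetical selection before the edit: a loop-based total
    function totalPrice(items: { price: number; qty: number }[]): number {
      let total = 0;
      for (const item of items) {
        total += item.price * item.qty;
      }
      return total;
    }

Codestral then proposes an inline rewrite of the selection, which you can review and accept or reject before anything is written to the file.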

💬 Chat with Your File or Codebase

Interact contextually with individual files or your entire codebase with cmd+L/Ctrl+L. This mode allows you to generate new code, optimize existing code, or debug functions with detailed, file-specific guidance. You can dive into complex functions or even build entirely new features.

Example in Action: A developer chatting with their codebase to create an optimized algorithm, with context provided by the selected file or multiple files:

Using Codestral Chat

Key Features:

  • File Context: Ask specific questions about a file, e.g., “How can I refactor the function processData in @utils.js?”
  • Codebase Navigation: Explore and query across multiple files, e.g., “Can you explain how @auth.js interacts with @db.js in the login workflow?”
  • Debugging: Highlight and resolve issues within the context of a specific file.
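
To make the file-context examples above concrete, here is a purely hypothetical module (its name mirrors the @utils.js example) and the kind of question you might ask about it with cmd+L/Ctrl+L:

    // utils.ts (hypothetical file, referenced as @utils.ts in the chat)
    export function processData(rows: string[]): number[] {
      const out: number[] = [];
      for (const row of rows) {
        const value = Number(row.trim());
        if (!Number.isNaN(value)) {
          out.push(value);
        }
      }
      return out;
    }

    // Example prompt in the chat:
    // "How can I refactor processData in @utils.ts to use filter and map,
    //  and what should it return for an empty input?"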

Starcoder: Code Autocompletion with Tab

Starcoder offers a unique mode to streamline code creation through intelligent autocompletion:

✏ Autocomplete Your Code

Interact with a file or your broader codebase to complete unfinished code, suggest improvements, or debug. Starcoder enhances productivity with intelligent suggestions tailored to the specific file or the overarching project.

Example in Action: Starcoder completing functions based on the context of the file:

Using Starcoder Autocomplete
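
For instance, in a (hypothetical) file you can type just a descriptive signature and let Starcoder propose the body as greyed-out ghost text, accepted with Tab:

    // You type the signature and the opening brace…
    export function isValidEmail(email: string): boolean {
      // …and Starcoder suggests the rest inline. A plausible, purely
      // illustrative completion (the real suggestion depends on the file's context):
      return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
    }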

Key Features:

  • Seamless Integration: Autocomplete code based on surrounding logic in the current file.
  • Smart Suggestions: Tailored to the file’s context, with an understanding of related files for cross-references.
  • Rapid Prototyping: Complete boilerplate code or suggest advanced solutions in seconds.

πŸ› οΈ Creating a Custom Command

To further personalize your environment, you can create internal commands through the extension. Here’s an example that adds a command to generate unit tests for highlighted code:

  1. Add a Command:
    Open config.json and add a new custom command:

     "customCommands": [
         {
             "name": "test",
             "prompt": "{{{ input }}}\n\nWrite a comprehensive set of unit tests for the selected code. It should set up, run tests that check for correctness including important edge cases, and tear down. Ensure that the tests are complete and sophisticated. Give the tests just as chat output, don't edit any file.",
             "description": "Write unit tests for highlighted code"
         }
     ]
  2. Test the Command (cmd+L/Ctrl+L), as sketched below:
    Test a Command
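
Once the configuration is saved, the command is available as a slash command in the chat. For example, highlight a (hypothetical) function like the one below, send it to the chat with cmd+L/Ctrl+L, then type /test; following the prompt defined above, the generated tests come back as chat output only, without editing any file:

    // Highlight this selection, press cmd+L/Ctrl+L, then type /test in the chat
    export function clamp(value: number, min: number, max: number): number {
      return Math.min(Math.max(value, min), max);
    }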

📈 Extending the Capabilities

To go further, you can:

  • Add multiple models with Ollama to handle specific tasks (see the sketch below).
  • Integrate workflows through other extensions like GitLens or Docker.
  • Share your custom actions with your team.
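
For example, assuming you have also pulled a general-purpose chat model (llama3:8b is used here purely as an illustration), the "models" array in config.json accepts several entries, and you can switch between them from the model selector in the chat panel:

    "models": [
        {
            "title": "Codestral",
            "provider": "ollama",
            "model": "codestral:22b"
        },
        {
            "title": "Llama 3",
            "provider": "ollama",
            "model": "llama3:8b"
        }
    ]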

🎉 Conclusion

By combining Continue.dev, Ollama, Codestral, and Starcoder, you can build a powerful and fully local AI assistant for coding. This setup ensures optimal privacy and incredible flexibility.

Feel free to test and experiment with these tools to revolutionize your coding experience!