
LLM_Support


Large Language Models

Large Language Models (LLMs) have the potential to transform the coding experience and raise programmer productivity to new levels. PyScripter has built-in support for LLM-assisted coding, which comes in two forms:

  1. A coding assistant available in the editor
  2. An integrated Chat for interacting with LLMs

Both cloud-based and local LLMs are supported:

  • Cloud-based LLM
    • OpenAI models, such as GPT-3.5 Turbo and GPT-4o
    • Gemini models by Google, such as 1.5 Flash and 1.5 Pro
    • DeepSeek models, such as deepseek-chat and deepseek-reasoner
    • Grok models, such as grok-3 and grok-3-mini
  • Local LLMs, supported through Ollama (see the sketch after this list). The choice of models includes:
    • llama3 and codellama by Meta
    • gemma by Google
    • starcoder2 by Nvidia
    • phi and wizardlm by Microsoft
    • and many others
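To illustrate what the Ollama-based local setup involves, here is a minimal Python sketch (not part of PyScripter itself) that sends a prompt to a locally running Ollama server over its REST API. It assumes Ollama is installed, listening on its default port 11434, and that the codellama model has already been pulled (e.g. with `ollama pull codellama`).

```python
# Minimal sketch: query a local Ollama model over its REST API.
# Assumes Ollama is running on localhost:11434 and "codellama" is pulled.
import json
import urllib.request


def ask_local_llm(prompt: str, model: str = "codellama") -> str:
    """Send a single non-streaming prompt to the local Ollama server."""
    request = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(
            {"model": model, "prompt": prompt, "stream": False}
        ).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        # The non-streaming reply is a single JSON object with a "response" field.
        return json.loads(response.read())["response"]


if __name__ == "__main__":
    print(ask_local_llm("Write a Python function that reverses a string."))
```

Cloud-based models work in much the same way, except that requests go to the provider's hosted endpoint and require an API key, which you supply in PyScripter's LLM settings.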

Support for other cloud-based LLM services, such as Claude by Anthropic, is planned for future versions of PyScripter.

Related Topics
