Open Source on Hugging Face

IQuest Coder

State-of-the-art open-source code LLM for autonomous software engineering. Built with the innovative Code-Flow training paradigm to understand real-world code evolution.

76.2% SWE-Bench Verified
81.1% LiveCodeBench v6
49.9% BigCodeBench
128K Context Length

Built for Code Intelligence

Advancing autonomous software engineering with innovative training paradigms and efficient architectures.

Code-Flow Training

Moving beyond static code representations, our models learn from repository evolution patterns, commit transitions, and dynamic code transformations to understand real-world software development processes.
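
The exact data pipeline is described in the technical report; as a rough sketch of what commit-transition data can look like, before/after pairs might be mined from git history as follows (the pair format and git plumbing here are our illustrative assumptions, not the team's actual pipeline):

import subprocess

def git(repo, *args, must_succeed=True):
    """Run a git command in `repo` and return its stdout."""
    result = subprocess.run(
        ["git", "-C", repo, *args],
        capture_output=True, text=True, check=must_succeed,
    )
    return result.stdout

def commit_transitions(repo, max_commits=100):
    """Yield (before, after, commit_message) triples, one per file edit.
    Each triple is one 'code flow' step: a file's state before and after
    a real commit, paired with the developer's stated intent."""
    for line in git(repo, "log", f"-{max_commits}", "--pretty=%H %P").splitlines():
        parts = line.split()
        if len(parts) != 2:  # skip root commits and merges
            continue
        child, parent = parts
        message = git(repo, "show", "-s", "--pretty=%s", child).strip()
        for path in git(repo, "diff", "--name-only", parent, child).splitlines():
            # Empty string if the file was added or deleted in this commit.
            before = git(repo, "show", f"{parent}:{path}", must_succeed=False)
            after = git(repo, "show", f"{child}:{path}", must_succeed=False)
            yield before, after, message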

Dual Specialization

Bifurcated post-training delivers two specialized variants: Thinking models with reasoning-driven RL for complex problem-solving, and Instruct models optimized for general coding assistance.

Loop Architecture

The Loop variant introduces a recurrent mechanism with shared parameters across iterations, optimizing the trade-off between model capacity and deployment footprint.
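
The full design is described in the LoopCoder paper listed below; as a minimal sketch of the idea (layer and loop counts here are placeholders, not the released configuration), parameter sharing across iterations looks roughly like this in PyTorch:

import copy
import torch
import torch.nn as nn

class LoopedDecoder(nn.Module):
    """Minimal sketch of a looped decoder: one stack of layers is applied
    n_loops times with shared weights, so effective depth is
    n_layers * n_loops while the parameter count stays at n_layers."""
    def __init__(self, layer: nn.Module, n_layers: int = 8, n_loops: int = 4):
        super().__init__()
        # Distinct parameters within one pass through the stack...
        self.stack = nn.ModuleList(copy.deepcopy(layer) for _ in range(n_layers))
        # ...but the same stack is reused on every loop iteration.
        self.n_loops = n_loops

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        for _ in range(self.n_loops):  # shared-parameter recurrence
            for block in self.stack:
                hidden = block(hidden)
        return hidden

# A 4-loop, 8-layer model behaves like a 32-layer stack at a quarter of the weights.
# nn.TransformerEncoderLayer is only a stand-in for a decoder block here.
decoder = LoopedDecoder(nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True))
out = decoder(torch.randn(1, 16, 512))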

Native Long Context

All models natively support up to 128K tokens without requiring additional scaling techniques, enabling processing of entire codebases and multi-file contexts.
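
As a concrete illustration, an entire small codebase can be packed into one prompt and checked against the window before generation. A minimal sketch, assuming the released tokenizer and reading the 128K window as 131,072 tokens:

from pathlib import Path
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("IQuestLab/IQuest-Coder-V1-40B-Instruct")

# Concatenate every Python file in a repository into one multi-file prompt.
repo = Path("path/to/repo")
sections = [
    f"### File: {p.relative_to(repo)}\n{p.read_text(errors='ignore')}"
    for p in sorted(repo.rglob("*.py"))
]
prompt = "\n\n".join(sections) + "\n\nSummarize the architecture of this codebase."

# Verify the packed context fits inside the native window.
MAX_CONTEXT = 131_072  # 128K tokens, assuming K = 1024
n_tokens = len(tokenizer(prompt).input_ids)
assert n_tokens <= MAX_CONTEXT, f"{n_tokens} tokens exceeds the 128K context"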

Model Family

Choose from multiple model sizes with both Instruct and Thinking variants.

Model | Parameters | Context | Variant | Link
IQuest-Coder-V1-7B-Instruct | 7B | 128K | Instruct | Hugging Face
IQuest-Coder-V1-14B-Instruct | 14B | 128K | Instruct | Hugging Face
IQuest-Coder-V1-40B-Instruct | 40B | 128K | Instruct | Hugging Face
IQuest-Coder-V1-40B-Thinking | 40B | 128K | Thinking | Hugging Face
IQuest-Coder-V1-40B-Loop-Instruct | 40B | 128K | Loop | Hugging Face
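
Each checkpoint is hosted under the IQuestLab organization on Hugging Face; for offline use, a full snapshot can be fetched with the standard huggingface_hub client:

from huggingface_hub import snapshot_download

# Download a full checkpoint (weights, tokenizer, config) to the local cache.
local_dir = snapshot_download("IQuestLab/IQuest-Coder-V1-7B-Instruct")
print(local_dir)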

Quick Start

Get started with IQuest Coder using Hugging Face Transformers.

Requires transformers >= 4.52.4.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "IQuestLab/IQuest-Coder-V1-40B-Instruct"

# Load the tokenizer and model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto"
)

# Prepare the input; instruct checkpoints are queried through their
# chat template rather than with a raw prompt
prompt = "Write a Python function to calculate the Fibonacci sequence."
messages = [{"role": "user", "content": prompt}]
text = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer(text, return_tensors="pt").to(model.device)

# Generate and decode only the newly generated tokens
output = model.generate(**inputs, max_new_tokens=512)
response = tokenizer.decode(
    output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)
print(response)
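
For interactive sessions, the same setup can stream tokens as they are generated rather than waiting for the full completion, using the standard Transformers streamer (this reuses model, tokenizer, and inputs from the snippet above):

from transformers import TextStreamer

# Print tokens to stdout as they are generated, skipping the echoed prompt.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
model.generate(**inputs, max_new_tokens=512, streamer=streamer)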

Research & Publications

Research and technical reports underpinning IQuest Coder's innovations.

IQuest-Coder-V1 Technical Report
IQuest Coder Team, 2025. arXiv:2512.13472

Scaling Laws for Code: Every Programming Language Matters
Yang et al., 2025. arXiv:2512.23611

Close the Loop: Synthesizing Infinite Tool-Use Data
Li et al., 2025. arXiv:2512.22087

LoopCoder: Scaling Code Intelligence via Looped Language Models
Yang et al., 2025. arXiv:2512.22087

Context as a Tool: Context Management for Long-Horizon SWE-Agents
Liu et al., 2025