QuadtrixLabs

Open Source

Open-source projects focused on local language model inference, model architecture research, and systems-level tooling. All projects are written in C++ and Python and are designed to run entirely on local hardware.

Repositories

| Repository   | Description                                                    | Language |
|--------------|----------------------------------------------------------------|----------|
| Quadtrix.cpp | Inference engine for running language models on local hardware | C++      |

Areas of Work

  • Local inference — running language models without external APIs or cloud services
  • Model implementation — transformer architectures, tokenizers, and training pipelines implemented from source
  • Hardware tooling — CPU, RAM, and GPU monitoring at the OS level
  • Agent systems — task automation combining language model output with system interaction
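As a concrete illustration of the "model implementation" area, here is a minimal sketch of scaled dot-product attention, the core operation of a transformer layer, written from source in pure Python. This is an illustrative example only; the function and variable names are not taken from Quadtrix.cpp.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention.

    Q, K, V are lists of vectors (lists of floats); K and V have the
    same number of rows, and all vectors in Q and K share one dimension.
    Returns one output vector per query, a weighted mix of the V rows.
    """
    d = len(K[0])  # key dimension, used for the 1/sqrt(d) scaling
    out = []
    for q in Q:
        # Dot each query against every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in K]
        weights = softmax(scores)
        # Weighted sum of value rows.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Since the attention weights for each query sum to 1, every output row is a convex combination of the value rows, which is easy to check on a small input.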

Contributing

See CONTRIBUTING.md for contribution guidelines, code style requirements, and the pull request process.

To report a security issue, refer to SECURITY.md instead of opening a public issue.

License

Projects in this organization are released under the MIT License unless otherwise noted in the repository.

Pinned

  1. Quadtrix.cpp (Public)

     AI engine in C++/Python to run language models directly on your own computer.
