How Difficult Is It to Prove GPT-2? Lagrange Equipped DeepProve with the Full Skill Set
Proving GPT-2 with DeepProve-1 is no easy task: it demands cryptography, engineering, and AI expertise all at once.
First, there is the non-linear model structure to deal with. LLMs like GPT-2 are complex computation graphs, full of residual connections and parallel branches. DeepProve added support for arbitrary graph structures, handling everything from multiple input layers to branching outputs, and can also untangle the quirks of ONNX and GGUF files.
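DeepProve's internal graph representation isn't shown here, but a minimal sketch makes the structural point concrete: a GPT-2-style residual block forks the graph into two branches (the untouched input and the sub-layer output) that merge again at an addition, so the circuit can never be flattened into a straight chain of layers. The `residual_block` and `mlp` names below are illustrative, not DeepProve APIs.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize over the hidden dimension, as GPT-2 does before each sub-layer.
    mu = x.mean(-1, keepdims=True)
    var = x.var(-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def residual_block(x, sublayer):
    # The skip connection is what creates a parallel branch in the graph:
    # one path carries x through unchanged, the other runs the sub-layer,
    # and the two paths merge at the addition node.
    return x + sublayer(layer_norm(x))

x = np.random.randn(4, 8)           # (tokens, hidden) toy activations
mlp = lambda h: np.maximum(h, 0.0)  # stand-in for the feed-forward sub-layer
y = residual_block(x, mlp)
print(y.shape)                      # same shape as the input, as required for stacking
```

A prover that only supports linear layer-by-layer pipelines cannot express the merge node above, which is why arbitrary-graph support matters for transformer models.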
The most practical addition is support for the GGUF format, the LLM format developers commonly use on Hugging Face. That removes the need for custom exports: community models can be fed straight into DeepProve for verification.
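How DeepProve ingests GGUF isn't detailed in the post, but the format itself is simple to recognize: a GGUF file opens with a 4-byte `GGUF` magic followed by three little-endian integers (format version, tensor count, metadata key/value count). The sketch below parses such a header from a synthetic byte string standing in for a real community model file; the counts used are made up for illustration.

```python
import struct

def read_gguf_header(buf):
    # GGUF layout: 4-byte magic "GGUF", then version (u32),
    # tensor count (u64), and metadata key/value count (u64),
    # all little-endian.
    magic, version, n_tensors, n_kv = struct.unpack_from("<4sIQQ", buf, 0)
    if magic != b"GGUF":
        raise ValueError("not a GGUF file")
    return {"version": version, "tensors": n_tensors, "metadata_kv": n_kv}

# Synthetic 24-byte header; a real file would continue with the
# metadata key/value pairs and tensor descriptors.
header = struct.pack("<4sIQQ", b"GGUF", 3, 291, 24)
print(read_gguf_header(header))
```

Being able to read this container directly is what lets a tool accept quantized community checkpoints as-is instead of requiring a bespoke export step.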