    AI News Today
    AI Tutorials

    Liberate your OpenClaw



    Anthropic is limiting access to Claude models in open agent platforms for Pro/Max subscribers. Don’t worry, though: there are great open models on Hugging Face to keep your agents running, often at a fraction of the cost.

    If you’ve been cut off and your OpenClaw, Pi, or Open Code agents need resuscitation, you can move them to open models in two ways:

    1. Use an open model served through Hugging Face Inference Providers.
    2. Run a fully local open model on your own hardware.

    The hosted route is the fastest way back to a capable agent. The local route is the right fit if you want privacy, zero API costs, and full control.

    To do so, just tell Claude Code, Cursor, or your favorite agent: “help me move my OpenClaw agents to Hugging Face models,” and link this page.



    Hugging Face Inference Providers

    Hugging Face Inference Providers is an open platform that routes requests to providers serving open-source models. It’s the right choice if you want the best models or you don’t have the necessary hardware.

    First, you’ll need to create a token here. Then you can add that token to openclaw like so:

    openclaw onboard --auth-choice huggingface-api-key
    

    Paste your Hugging Face token when prompted, and you’ll be asked to select a model.

    We’d recommend GLM-5 because of its excellent Terminal Bench scores, but there are thousands to choose from here.
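Before wiring a model into OpenClaw, you can sanity-check your token and model choice directly, since Inference Providers exposes an OpenAI-compatible endpoint at `https://router.huggingface.co/v1`. A minimal sketch, assuming your token is in the `HF_TOKEN` environment variable (the `build_chat_request` helper name is ours, not part of any API):

```python
import json
import os
import urllib.request

ROUTER_URL = "https://router.huggingface.co/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> tuple[dict, dict]:
    """Build headers and an OpenAI-compatible payload for the HF router."""
    headers = {
        "Authorization": f"Bearer {os.environ.get('HF_TOKEN', '')}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,  # a repo_id, optionally with a :provider suffix
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 64,
    }
    return headers, payload

# Only hits the network when a token is actually set.
if os.environ.get("HF_TOKEN"):
    headers, payload = build_chat_request(
        "zai-org/GLM-5:fastest", "Say hello in one word."
    )
    req = urllib.request.Request(
        ROUTER_URL, data=json.dumps(payload).encode(), headers=headers
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

If this returns a completion, the same `repo_id` string will work in the OpenClaw config below.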

    You can update your Hugging Face model at any time by entering its repo_id in the OpenClaw config:

    {
      agents: {
        defaults: {
          model: {
            primary: "huggingface/zai-org/GLM-5:fastest"
          }
        }
      }
    }
    

    Note: HF PRO subscribers get $2 of free credits each month, which apply to Inference Providers usage; learn more here.



    Local Setup

    Running models locally gives you full privacy, zero API costs, and the ability to experiment without rate limits.

    Install llama.cpp, a fully open-source library for low-resource inference.

    # on mac or linux
    brew install llama.cpp
    
    # on windows
    winget install llama.cpp
    

    Start a local server with a built-in web UI:

    llama-server -hf unsloth/Qwen3.5-35B-A3B-GGUF:UD-Q4_K_XL
    

    Here, we’re using Qwen3.5-35B-A3B, which works great with 32GB of RAM. If you have different requirements, please check out the hardware compatibility for the model you’re interested in. There are thousands to choose from.
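As a rough way to judge whether a quantized GGUF fits your machine, estimate the weight footprint as parameters × bits-per-weight ÷ 8, then leave headroom for the KV cache and runtime overhead. A small sketch; the ~4.5 bits/weight figure for a Q4_K-class quant is an assumption, not an exact value:

```python
def gguf_weight_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate in-memory size of quantized weights, in GB."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 35B-parameter model at ~4.5 bits/weight (roughly Q4_K-class):
weights = gguf_weight_gb(35, 4.5)
print(f"{weights:.1f} GB of weights")  # ~19.7 GB, so 32 GB of RAM leaves headroom
```

Note that even for a mixture-of-experts model like Qwen3.5-35B-A3B (3B active parameters), all weights still need to be resident, so the total parameter count is what matters for RAM.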

    With the GGUF loaded in llama.cpp, point OpenClaw at the local server like this:

    openclaw onboard --non-interactive \
       --auth-choice custom-api-key \
       --custom-base-url "http://127.0.0.1:8080/v1" \
       --custom-model-id "unsloth-qwen3.5-35b-a3b-gguf" \
       --custom-api-key "llama.cpp" \
       --secret-input-mode plaintext \
       --custom-compatibility openai
    

    Verify the server is running and the model is loaded:

    curl http://127.0.0.1:8080/v1/models
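The response follows the OpenAI-style `/v1/models` shape (`{"object": "list", "data": [{"id": ...}]}`), so you can also script the check. A sketch, assuming that response shape; the helper names are ours:

```python
import json
import urllib.request

def extract_model_ids(models_response: dict) -> list[str]:
    """Pull model ids out of an OpenAI-style /v1/models response."""
    return [entry["id"] for entry in models_response.get("data", [])]

def loaded_models(base_url: str = "http://127.0.0.1:8080/v1") -> list[str]:
    """Query a running llama-server and list the models it has loaded."""
    with urllib.request.urlopen(f"{base_url}/models") as resp:
        return extract_model_ids(json.load(resp))

# Example response shape (no server needed to see how parsing works):
sample = {
    "object": "list",
    "data": [{"id": "unsloth-qwen3.5-35b-a3b-gguf", "object": "model"}],
}
print(extract_model_ids(sample))  # ['unsloth-qwen3.5-35b-a3b-gguf']
```

If `loaded_models()` returns the model id you passed to `--custom-model-id`, OpenClaw is talking to the right server.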
    



    Which path should you choose?

    Use Hugging Face Inference Providers if you want the quickest path back to a capable OpenClaw agent. Use llama.cpp if you want privacy, full local control, and no API bill.

    Either way, you do not need a closed hosted model to get OpenClaw back on its feet!

    © 2026 ainewstoday.co. All rights reserved.