“Hey folks, it’s gonna be more difficult going forward to verify OpenClaw still works with Anthropic models,” OpenClaw creator Peter Steinberger posted on X early Friday morning, alongside a screenshot of a message from Anthropic saying his account had been suspended over “suspicious” activity.
The ban didn’t last long. A few hours later, after the post went viral, Steinberger said his account had been reinstated. Among hundreds of comments — many of them in conspiracy theory land, given that Steinberger is now employed by Anthropic rival OpenAI — was one by an Anthropic engineer. The engineer told the famed developer that Anthropic has never banned anyone for using OpenClaw and offered to help.
It’s not clear if that was what got the account restored. (We’ve asked Anthropic about it.) But the whole message thread was enlightening on many levels.
To recap the recent history: This ban followed news last week that subscriptions to Anthropic’s Claude would no longer cover “third-party harnesses including OpenClaw,” the AI model company said.
OpenClaw users now have to pay for that usage separately, based on consumption, through Claude’s API. In essence, Anthropic, which offers its own agent, Cowork, is now charging a “claw tax.” Steinberger said he was following this new rule and using the API but was banned anyway.
Anthropic said it instituted the pricing change because subscriptions weren’t built to handle the “usage patterns” of claws. Claws can be more compute-intensive than prompts or simple scripts because they may run continuous reasoning loops, automatically repeat or retry tasks, and tie into numerous other third-party tools.
Steinberger, however, wasn’t buying that explanation. After Anthropic changed the pricing, he posted, “Funny how the timings match up, first they copy some popular features into their closed harness, then they lock out open source.” Though he didn’t specify, he may have been referring to features added to Claude’s Cowork agent, such as Claude Dispatch, which lets users remotely control agents and assign tasks. Dispatch rolled out a few weeks before Anthropic changed its OpenClaw pricing policy.
Steinberger’s frustration with Anthropic was again on display Friday.
One person implied that some of this is on him for taking a job at OpenAI instead of Anthropic, posting, “You had the choice, but you went to the wrong one.” To which Steinberger replied: “One welcomed me, one sent legal threats.”
Ouch.
When multiple people asked him why he’s using Claude instead of his employer’s models at all, he explained that he only uses it for testing, to make sure updates to OpenClaw won’t break things for Claude users.
He explained: “You need to separate two things. My work at the OpenClaw Foundation where we wanna make OpenClaw work great for *any* model provider, and my job at OpenAI to help them with future product strategy.”
A few people also pointed out that the need to test Claude is because that model remains a popular choice for OpenClaw users over ChatGPT. He heard that even after Anthropic changed its pricing, to which he replied: “Working on that.” (So, that’s a clue about what his job at OpenAI involves.)
Steinberger didn’t respond to a request for comment.
