How do you get LangServe to correctly infer the input schema for FastAPI's automatically generated API docs? It seems to always document the input schema as a plain string. FastAPI on its own infers the correct input schema from pydantic types, but once LangServe is introduced (which should be able to extract the input schema from the runnable), that inference stops working.
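For contrast, a plain FastAPI endpoint with a pydantic model (no LangServe involved) documents the request body exactly as expected in /docs. A minimal sketch of what I mean:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="LLM API")

class TestInputs(BaseModel):
    foo: str
    bar: str

@app.post("/test-plain")
def test_plain(inputs: TestInputs) -> str:
    # FastAPI picks up TestInputs and renders the correct request body schema in /docs
    return f"{inputs.foo} {inputs.bar}"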
Here is a basic setup to demonstrate the problem:
from fastapi import FastAPI
from langserve import add_routes
from pydantic import BaseModel
from langchain_core.runnables import chain

app = FastAPI(
    title="LLM API"
)

class TestInputs(BaseModel):
    foo: str
    bar: str

    class Config:
        arbitrary_types_allowed = True

@chain
def test_chain(inputs: TestInputs):
    return f"{inputs.foo} {inputs.bar}"

add_routes(app, test_chain, path="/test-chain")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
With this setup, the app won't even boot; it raises a runtime error:
raise RuntimeError(f'no validator found for {type_}, see `arbitrary_types_allowed` in Config')
RuntimeError: no validator found for <class '__main__.TestInputs'>, see `arbitrary_types_allowed` in Config
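For reference, the only direction I can think of is declaring the schema on the runnable explicitly instead of relying on inference, e.g. via langchain_core's with_types. A rough sketch of that idea follows (my assumption of how it would be wired up, not something I've verified fixes the docs, and it still sidesteps the inference I'm actually asking about):

from fastapi import FastAPI
from langserve import add_routes
from pydantic import BaseModel
from langchain_core.runnables import chain

app = FastAPI(title="LLM API")

class TestInputs(BaseModel):
    foo: str
    bar: str

@chain
def test_chain(inputs: TestInputs):
    return f"{inputs.foo} {inputs.bar}"

# Explicitly tell LangServe what the input schema is instead of letting it infer one.
add_routes(
    app,
    test_chain.with_types(input_type=TestInputs),
    path="/test-chain",
)

Is there a way to get the schema inferred automatically, or is explicitly declaring the input type the expected approach here?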