Use an ML model loaded in the lifespan function in the main file, in another file (i.e. an API router) #9234

Answered by jonra1993
pelaezluis asked this question in Questions
Hello @pelaezluis, you can use a global context like this to share your model with other routes.

This sample code should give you an idea.

main.py (incomplete)

from contextlib import asynccontextmanager

from fastapi import FastAPI
from transformers import pipeline

from app.utils.fastapi_globals import g, GlobalsMiddleware

@asynccontextmanager
async def lifespan(app: FastAPI):
    # Startup: load a pre-trained sentiment analysis model once
    # and publish it through the global context.
    sentiment_model = pipeline("sentiment-analysis")
    g.set_default("sentiment_model", sentiment_model)
    print("startup fastapi")
    yield
    # Shutdown: release the model and clear the global context.
    del sentiment_model
    g.cleanup()
    
app = FastAPI(
    title="Fastapi",
    lifespan=lifespan,
)

app.add_middleware(GlobalsMiddleware)
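`app.utils.fastapi_globals` is a project-specific module, not part of FastAPI itself. A minimal self-contained sketch of what such a `g` object could look like, with its interface (attribute access, `set_default`, `cleanup`) inferred from the calls in the snippet above, so it is an assumption rather than the real module:

```python
class Globals:
    """A minimal shared-state object, approximating the `g` used above."""

    def __init__(self):
        self._storage = {}

    def set_default(self, name, value):
        # Register the value only if the name is not already set.
        self._storage.setdefault(name, value)

    def cleanup(self):
        # Drop all shared objects on application shutdown.
        self._storage.clear()

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails,
        # so `_storage` itself is never routed through here.
        try:
            return self._storage[name]
        except KeyError:
            raise AttributeError(name)


g = Globals()
# A stand-in model; the real app stores the transformers pipeline here.
g.set_default("sentiment_model", lambda text: [{"label": "POSITIVE", "score": 0.99}])
print(g.sentiment_model("great library")[0]["label"])  # prints POSITIVE
```

Because every module imports the same `g` instance, anything stored at startup is reachable from any router without passing it through function arguments.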

natural_language.py

from fastapi
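The natural_language.py snippet above breaks off after its first import. Below is a hedged, self-contained sketch of the whole flow it implies: the lifespan loads the model, and a handler defined in another file reads it back from the shared context. FastAPI and transformers are stubbed out so the sketch runs on its own; a real router file would instead use `from fastapi import APIRouter` and `from app.utils.fastapi_globals import g`, and the handler names here are illustrative, not from the original answer.

```python
import asyncio
from contextlib import asynccontextmanager

g = {}  # stand-in for the shared globals object


def fake_pipeline(task):
    # Stand-in for transformers.pipeline("sentiment-analysis").
    return lambda text: [{"label": "POSITIVE", "score": 0.99}]


@asynccontextmanager
async def lifespan(app):
    # Startup: load once, publish through the shared context.
    g["sentiment_model"] = fake_pipeline("sentiment-analysis")
    yield
    # Shutdown: clear the shared context.
    g.clear()


# In natural_language.py this would be decorated with
# @router.post("/sentiment") on an APIRouter instance.
async def analyze_sentiment(text: str):
    model = g["sentiment_model"]  # loaded once at startup, reused here
    return model(text)[0]


async def main():
    async with lifespan(app=None):
        result = await analyze_sentiment("FastAPI lifespans are handy")
        print(result["label"])  # prints POSITIVE


asyncio.run(main())
```

The key point is that the route handler never loads the model itself; it only looks it up, so the expensive `pipeline(...)` call happens exactly once per process.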

Answer selected by pelaezluis