My application has multiple scripts using SQLAlchemy. The load varies a lot, and these scripts may read, write, update and delete at the same time.
To share the Session I have one file to configure the database:
    # db.py
    engine = sqlalchemy.create_engine(...)
    Session = sessionmaker(bind=engine)
Every script that needs to work with the DB imports Session and creates a new one via session = Session():
    # script_a.py
    from db import Session

    session = Session()
    try:
        session.query(...).delete()
        session.commit()
    except Exception as e:
        session.rollback()
    finally:
        session.close()
    # script_b.py
    from db import Session

    session = Session()
    try:
        session.add(...)
        session.commit()
    except Exception as e:
        session.rollback()
    finally:
        session.close()
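For reference, here is a minimal runnable version of the setup above; the in-memory SQLite URL and the trivial Item model are assumptions added for illustration, standing in for the elided `...` parts:

    # Minimal runnable sketch of the setup above (assumptions: in-memory
    # SQLite database, hypothetical Item model).
    import sqlalchemy
    from sqlalchemy.orm import sessionmaker, declarative_base

    engine = sqlalchemy.create_engine("sqlite://")
    Session = sessionmaker(bind=engine)

    Base = declarative_base()

    class Item(Base):
        __tablename__ = "items"
        id = sqlalchemy.Column(sqlalchemy.Integer, primary_key=True)

    Base.metadata.create_all(engine)

    # script_b-style write: each script opens its own session
    session = Session()
    try:
        session.add(Item())
        session.commit()
    except Exception:
        session.rollback()
    finally:
        session.close()

    # script_a-style delete
    session = Session()
    try:
        session.query(Item).delete()
        session.commit()
    except Exception:
        session.rollback()
    finally:
        session.close()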
It works so far, but here comes the problem: I just read in the docs that my approach is bad practice, so I'm looking for the best practice.
My current idea is to add session = Session() to db.py and import session instead of Session. I don't know about the impact, though, especially in case of failure (rollback), and from my understanding .close() would close the global session, which is used by a few scripts in parallel.
What's the proper way to use the session in my case? Any ideas?
Thanks in advance!