I'm developing a web application where, at one point, I need to select 1,000,000 rows from my database.
My script looks like this:
from sqlalchemy import create_engine, MetaData
from sqlalchemy.orm import scoped_session, sessionmaker

engine = create_engine(
    "mysql://:@localhost/test",
    isolation_level="READ UNCOMMITTED", echo=False
)

# reflect the existing tables so they can be queried
meta = MetaData(bind=engine)
meta.reflect(bind=engine)
cr = meta.tables['cr']
bl = meta.tables['bl']

DBSession = scoped_session(
    sessionmaker(
        autoflush=True,
        autocommit=False,
        bind=engine
    )
)
test_query = DBSession.query(bl, cr).filter(bl.c.severity_logged == '4_minor')
print test_query.all()
The script keeps reading from disk and its memory usage keeps growing, but it never prints anything.
Running the same query in the MySQL command-line client returns the result in 4 seconds. How can I use SQLAlchemy to retrieve this much data efficiently?
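From the documentation I get the impression that Query.yield_per() is meant for cases like this, but I'm not sure it is the right approach. Here is a rough sketch of what I'm considering; the batch size of 1000 and the process() handler are just placeholders I made up, not part of my real script:

test_query = (
    DBSession.query(bl, cr)
    .filter(bl.c.severity_logged == '4_minor')
    .yield_per(1000)  # fetch rows in batches of 1000 instead of loading everything with .all()
)
for row in test_query:
    process(row)  # hypothetical per-row handler

Would something like this avoid building the whole result set in memory, or is there a better way to stream large results with SQLAlchemy?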