What are the Slack Archives?

It’s a history of our time together in the Slack Community! There’s a ton of knowledge in here, so feel free to search through the archives for a possible answer to your question.

Because this space is not active, you won’t be able to create a new post or comment here. If you have a question or want to start a discussion about something, head over to our categories and pick one to post in! You can always refer back to a post from the Slack Archives if needed; just copy the link to use it as a reference.


U040LMJF6TY Posts: 30 🧑🏻‍🚀 - Cadet

Hi Everyone,

I have a large amount of data in the database, so I am fetching it in chunks using limit and offset in a do-while loop.
But the strange thing is that each loop iteration increases my memory usage, even though I set all my defined variables to null and trigger garbage collection.

What else can I do to bring the memory usage back down to its initial level?
@UL6DGRULR
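
For reference, below is a minimal sketch of the kind of chunked fetch described above, assuming Propel query classes as generated in a Spryker project. SpyFooBarQuery and the limit value are placeholders, not names taken from the original post.

    <?php

    // Placeholder query class; substitute the real Propel query class in use.
    $limit = 500;
    $offset = 0;

    do {
        // Fetch one chunk of rows using limit/offset.
        $entities = SpyFooBarQuery::create()
            ->limit($limit)
            ->offset($offset)
            ->find();

        foreach ($entities as $entity) {
            // ... process the entity ...
        }

        $fetched = count($entities);
        $offset += $limit;

        // Release references and trigger a garbage-collection run,
        // as described in the post.
        unset($entities);
        gc_collect_cycles();
    } while ($fetched === $limit);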

Comments

  • Alberto Reyer
    Alberto Reyer Lead Spryker Solution Architect / Technical Director Posts: 690 🪐 - Explorer

    Have a look at \Propel\Runtime\Propel::disableInstancePooling.

    You are probably using the Propel ORM to fetch your entries, and by default every entity that is fetched is cached for the rest of the request so it only needs to be fetched once.
    But when you fetch a lot of entities during the same request (e.g. during imports/publish/sync), this hurts more than it helps.
    Just call the method above at the start of your batch process and you should be fine. (A sketch of where to place the call is at the end of this thread.)

  • U040LMJF6TY
    U040LMJF6TY Posts: 30 🧑🏻‍🚀 - Cadet

    Thanks, that worked for me.

  • U040LMJF6TY
    U040LMJF6TY Posts: 30 🧑🏻‍🚀 - Cadet

    @UL6DGRULR is there a similar method for the Redis storage?
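
Referring back to the instance-pooling suggestion above, here is a minimal sketch of where the call could go around a batch process. It assumes Propel 2 as bundled with Spryker; re-enabling the pool at the end is optional.

    <?php

    use Propel\Runtime\Propel;

    // Disable the instance pool before the batch run so that fetched
    // entities are not kept in memory for the rest of the request.
    Propel::disableInstancePooling();

    // ... run the chunked limit/offset loop here ...

    // Optionally restore the default behaviour once the batch work is done.
    Propel::enableInstancePooling();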