What are the Slack Archives?

It’s a history of our time together in the Slack Community! There’s a ton of knowledge in here, so feel free to search through the archives for a possible answer to your question.

Because this space is not active, you won’t be able to create a new post or comment here. If you have a question or want to start a discussion about something, head over to our categories and pick one to post in! You can always refer back to a post from the Slack Archives if needed; just copy the link to use it as a reference.

Hello everyone, we are planning to implement product feeds in our project. It's about Google Shopping

USJCJP3FC
USJCJP3FC Posts: 3 🧑🏻‍🚀 - Cadet

Hello everyone,

We are planning to implement product feeds in our project. It's about Google Shopping, AdRoll, Criteo, AWIN, and Exponea.

Do you have a recommended way of doing this in general?
We are weighing three options: serving the feeds via the Glue API, pre-generating the feeds for every service, or creating one “main” feed and using external tools to generate the per-service feeds from it.

Any experience here? Thank you 🙂

Comments

  • Alberto Reyer
    Alberto Reyer Lead Spryker Solution Architect / Technical Director Posts: 690 🪐 - Explorer

    We generate them via a cron job and upload them to Azure storage with a public interface provided by Azure (secured by a token, of course).

    Why? Depending on the size of your feed, it will take some time for a bot (e.g. the Google bot) to download the file, so you might run into time limits or tie up one connection for a long time, which reduces your overall connection pool.

    Another thing we did is generate a database table for each feed and fill it with one big SQL query (a lot more performant for us than doing it in PHP), using PHP only to write out the feed file. We use common table expressions in Postgres for this task.
    We use one writer per format (e.g. Google, Criteo, etc.), but only because we didn't want to manage another service to map a general feed into the specific feeds (cost and personnel capacity).
    In general, if you have a marketing department with deep know-how on product feeds, I would suggest creating a general feed and mapping it into the specific feeds; that way, the decision about which parts of the general structure map into each service-specific structure moves to your marketing experts.

    Does this help you? Or may I ask you to be more specific, to give me the chance to share more of the in-depth knowledge we've gathered?
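
The table-per-feed approach described above can be sketched as follows. This is a minimal illustration, not the poster's actual implementation: it uses SQLite in place of Postgres so it is self-contained, and all table and column names (`product`, `google_feed`, `sku`, `price_cents`, etc.) are hypothetical. The idea is the same: one SQL statement with a common table expression fills the per-feed table, and application code only streams the result out as a file.

```python
import sqlite3

# In-memory stand-in for the shop database; schema and data are made up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE product (sku TEXT, name TEXT, price_cents INTEGER, is_active INTEGER);
    CREATE TABLE google_feed (sku TEXT, title TEXT, price TEXT);
    INSERT INTO product VALUES
        ('SKU-1', 'Red Shirt', 1999, 1),
        ('SKU-2', 'Blue Shirt', 2499, 1),
        ('SKU-3', 'Old Shirt', 999, 0);
""")

# One big SQL query with a common table expression fills the per-feed table;
# the application layer never touches the individual rows.
conn.execute("""
    WITH sellable AS (
        SELECT sku, name, price_cents FROM product WHERE is_active = 1
    )
    INSERT INTO google_feed (sku, title, price)
    SELECT sku, name, printf('%.2f EUR', price_cents / 100.0) FROM sellable
""")

# The "writer" step: dump the filled feed table into a flat feed file,
# one tab-separated row per line.
rows = conn.execute("SELECT sku, title, price FROM google_feed ORDER BY sku").fetchall()
feed_lines = ["\t".join(row) for row in rows]
print("\n".join(feed_lines))
```

In a real setup the cron job would run one such query per feed table and upload the resulting file to the token-secured storage mentioned above.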

  • USJCJP3FC
    USJCJP3FC Posts: 3 🧑🏻‍🚀 - Cadet

    I really appreciate your knowledge share. Very interesting approach with the dedicated database table.

    I also like the general feed approach; it makes it easier to add further services in the future. But of course I need to discuss this approach with the customer because, as you said: capacity and costs…

    Thank you!

  • UM53ZHLTF
    UM53ZHLTF Posts: 43 🧑🏻‍🚀 - Cadet

    Hello, we also generate them via a cron job and upload them to permanent storage. For generation we use the storage clients, as the data is already in Redis and can be reused for most fields, except shipment cost. Then we map it to different templates in Zed.
    The next step is to make it configurable in the backend.
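
The template-per-service mapping described above could look roughly like this. All field names and the `TEMPLATES` structure are assumptions for illustration; the poster's actual Zed templates are not shown in the thread. The one detail taken from the post is that shipment cost cannot be reused from the cache and has to be supplied separately.

```python
# Hypothetical product data as it might come back from a key-value storage
# client (the data is already in Redis in the setup described above).
cached_product = {
    "sku": "SKU-1",
    "name": "Red Shirt",
    "price": "19.99",
    "currency": "EUR",
}

# One template (field mapping) per feed service; making this configurable
# in the backend is the "next step" mentioned in the comment.
TEMPLATES = {
    "google": {"id": "sku", "title": "name", "price": "price"},
    "criteo": {"productId": "sku", "productName": "name", "salePrice": "price"},
}

def render_feed_row(service: str, product: dict, shipment_cost: str) -> dict:
    """Map cached product fields into a service-specific feed row."""
    row = {feed_field: product[source_field]
           for feed_field, source_field in TEMPLATES[service].items()}
    row["shipping"] = shipment_cost  # not reusable from the cache, per the post
    return row

print(render_feed_row("google", cached_product, "4.90"))
```

Adding a new feed service then only means adding another entry to the template table rather than writing new mapping code.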

  • USJCJP3FC
    USJCJP3FC Posts: 3 🧑🏻‍🚀 - Cadet

    Thank you John!

    Also a nice approach! Configurable via the backend -> very user-friendly.