Hello everyone,
We are planning to implement product feeds in our project. It's about Google Shopping, AdRoll, Criteo, AWIN, and Exponea.
Do you have a recommended way of doing this in general?
We are weighing three options: using the Glue API, pre-generating the feeds for every service, or creating one "main" feed and using external tools to generate the per-service feeds from it.
Any experience here? Thank you 🙂
Comments
We generate them via a cronjob and upload them to Azure Storage, using the public interface provided by Azure (secured by a token, of course).
Why? Depending on the size of your feed, it will take some time for a bot (e.g. the Google Bot) to download the file, and you might run into time limits or block one connection for a long time, which reduces your overall connection pool.
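As a rough sketch of what that could look like (the cron entry, container name, paths, and connection string are placeholders; the microsoft/azure-storage-blob SDK is assumed):

```php
<?php
// Hypothetical cron entry for nightly feed generation, e.g.:
//   0 3 * * * php vendor/bin/console feed:generate-and-upload

use MicrosoftAzure\Storage\Blob\BlobRestProxy;

// Assumption: the feed file has already been rendered to local disk by the
// generation step (see the SQL sketch further down).
$localFeedPath    = '/data/feeds/google-shopping.xml';          // placeholder path
$connectionString = getenv('AZURE_STORAGE_CONNECTION_STRING');  // placeholder config

// Upload the finished file to a container that Azure exposes publicly;
// access to the URL itself would be restricted with a token (e.g. SAS).
$blobClient = BlobRestProxy::createBlobService($connectionString);
$blobClient->createBlockBlob(
    'product-feeds',            // container name (placeholder)
    'google-shopping.xml',      // blob name the bots will fetch
    fopen($localFeedPath, 'rb') // stream so the file is not loaded into memory
);
```

Serving the file from blob storage keeps the slow bot downloads away from the shop's PHP workers, which is exactly the connection-pool concern described above.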
Another thing we did is generate a database table for each feed and use one big SQL query to fill in the data (a lot more performant for us than doing it in PHP), using PHP only to write out the feed file. We use common table expressions in Postgres for this task.
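A minimal sketch of that pattern, assuming PDO against Postgres; table and column names are invented for illustration:

```php
<?php
// Sketch: a dedicated table per feed (here: feed_google) is rebuilt on every
// run, and the heavy lifting happens inside Postgres via a CTE.
$pdo = new PDO('pgsql:host=localhost;dbname=shop', 'user', 'secret');

$sql = <<<'SQL'
WITH prices AS (          -- common table expression: cheapest gross price per SKU
    SELECT sku, MIN(gross_price) AS price
    FROM product_price
    GROUP BY sku
)
INSERT INTO feed_google (sku, title, price, image_url)
SELECT p.sku, p.name, pr.price, p.image_url
FROM product p
JOIN prices pr ON pr.sku = p.sku
WHERE p.is_active = TRUE
SQL;

$pdo->exec('TRUNCATE feed_google');
$pdo->exec($sql);

// PHP only streams the prepared rows out into the feed file.
$out = fopen('/data/feeds/google-shopping.csv', 'wb');
fputcsv($out, ['id', 'title', 'price', 'image_link']);
foreach ($pdo->query('SELECT sku, title, price, image_url FROM feed_google', PDO::FETCH_NUM) as $row) {
    fputcsv($out, $row);
}
fclose($out);
```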
We use one writer for each format (e.g. Google, Criteo, etc.), but only because we didn't want to manage another service that maps a general feed to the specific feeds (cost and personnel capacity).
In general, if you have a marketing department with deep know-how on product feeds, I would suggest creating a general feed and mapping it into the specific feeds; that way the decision about which parts of your general structure get mapped into the channel-specific structures moves to your marketing experts. Does this help you? Or may I ask you to be more specific, to give me the chance to share the more in-depth knowledge we've gathered?
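To make the writer-per-format and general-feed-mapping ideas concrete, here is a minimal sketch with invented field names (the real Google Shopping and Criteo specs require many more attributes than shown here):

```php
<?php
// Assumption: the "general" feed row carries every attribute any channel might
// need; each channel mapper only renames and filters what it wants.
interface FeedMapperInterface
{
    /** @param array<string,mixed> $generalRow @return array<string,mixed> */
    public function map(array $generalRow): array;
}

final class GoogleShoppingMapper implements FeedMapperInterface
{
    public function map(array $generalRow): array
    {
        return [
            'id'         => $generalRow['sku'],
            'title'      => $generalRow['name'],
            'price'      => sprintf('%.2f %s', $generalRow['gross_price'], $generalRow['currency']),
            'image_link' => $generalRow['image_url'],
        ];
    }
}

final class CriteoMapper implements FeedMapperInterface
{
    public function map(array $generalRow): array
    {
        // Field names here are illustrative, not the exact Criteo spec.
        return [
            'id'          => $generalRow['sku'],
            'productname' => $generalRow['name'],
            'price'       => $generalRow['gross_price'],
            'bigimage'    => $generalRow['image_url'],
        ];
    }
}
```

Whether these mappings live in code like this or in an external tool the marketing team controls is exactly the cost/capacity trade-off described above.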
I really appreciate you sharing your knowledge. Very interesting approach with the dedicated database table.
I also like the general feed approach; it makes adding further services easier in the future. But of course I need to discuss that approach with the customer because, as you said: capacity and costs…
Thank you!
Hello, we also generate them via a cronjob and upload them to permanent storage. For generation we use the storage clients, as the data is already in Redis and can be reused for most fields, except the shipment cost. Then we map it to different templates in Zed.
The next step is to make it configurable in the backend.
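A very rough sketch of that flow, assuming the product data is reachable via phpredis; the key layout, field names, and the shipment-cost helper are invented for illustration (a real Spryker module would go through its storage client rather than talking to Redis directly):

```php
<?php
// Placeholder stub for the one field that is not available in Redis here.
function lookupShipmentCost(string $sku): string
{
    return '4.90 EUR'; // stub value for illustration
}

$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$feedRows = [];
// KEYS is acceptable for a nightly cronjob; avoid it in the request path.
foreach ($redis->keys('product:de_de:*') as $key) {   // invented key pattern
    $product = json_decode($redis->get($key), true);

    $feedRows[] = [
        'id'       => $product['sku'],
        'title'    => $product['name'],
        'shipping' => lookupShipmentCost($product['sku']),
    ];
}

// $feedRows would then be rendered through a per-channel template (e.g. a
// Twig template in Zed) and the result uploaded to permanent storage.
```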
Thank you John!
Also a nice approach! Configurable via the backend -> very user-friendly.