Feeds code samples

  • Handle streaming:

    The Shopping API platform exposes feeds over HTTPS as a streaming output. Find out more about how to handle it in the streaming guide.

  • Parallel requests:

    Please note that you should not request too many feeds concurrently: the number of parallel downloads for the same user is limited.

Below are some code samples written in Bash to help you understand how to call the Shopping API in order to retrieve offers.

We don't provide code samples in other languages such as Java, because they could be written in many different ways with many different libraries.

The Bash code samples should be expressive enough for you to grasp the algorithm and implement it in your language of choice. If you do need help, please contact us.

In these samples we use the JSON format and the jq tool to handle it from a Bash script.

Prerequisite: jq must be installed on your system.
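As a quick sanity check that jq works, and to illustrate the pattern used throughout these samples, the snippet below extracts the id field from every element of a JSON array. The JSON content here is made up for illustration; it only mimics the shape of the API responses.

```shell
#!/usr/bin/env bash

# A made-up category list, shaped like the API responses used below
CATEGORIES='[{"id":100,"name":"Phones"},{"id":200,"name":"TVs"}]'

# Extract one id per line, as the category and merchant samples do
IDS=$(echo "${CATEGORIES}" | jq -r '.[].id')

echo "${IDS}"
```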

Get all offers

Download offers feeds in one file

If you want to store all the data in one file:

Bash
#!/usr/bin/env bash

# Variables to update
COUNTRY="fr"
TOKEN="your token generated from the Publisher Center"
SHOPPING_API_URL="https://api.kelkoogroup.net/publisher/shopping/v2"

function main() {

  echo "Downloading all offers with one request"
  local FILENAME="offersFeed_${COUNTRY}.json.gz"
  curl -sS -o "${FILENAME}" -H "Accept-Encoding: gzip" -H "Authorization: Bearer ${TOKEN}" "${SHOPPING_API_URL}/feeds/offers?country=${COUNTRY}"

  local RES=$?
  if [ "${RES}" = "0" ]; then
    echo "Feed downloaded successfully in ${FILENAME}"
  else
    echo "Error when downloading feed: ${RES}"
  fi
}

main

Download offers feeds in parts

If you want to store all the data in multiple files:

Bash
#!/usr/bin/env bash

# Variables to update
COUNTRY="fr"
TOKEN="your token generated from the Publisher Center"
SHOPPING_API_URL="https://api.kelkoogroup.net/publisher/shopping/v2"
NUMBER_OF_PARTS="8" # 2, 4, or 8

function download_and_store_offer_feed() {
  local PART="$1"
  echo "Downloading offers feed part ${PART}..."
  local FILENAME="offersFeed_${PART}.json.gz"
  curl -sS -o "${FILENAME}" -H "Accept-Encoding: gzip" -H "Authorization: Bearer ${TOKEN}" "${SHOPPING_API_URL}/feeds/offers?country=${COUNTRY}&part=${PART}&numberOfParts=${NUMBER_OF_PARTS}"
  local RES=$?
  if [ "${RES}" = "0" ]; then
    echo "Feed downloaded successfully in ${FILENAME}"
  else
    echo "Error when downloading feed: ${RES}"
  fi
}

function main() {
  # Bash does not expand variables inside {1..N}, so use seq instead
  for PART in $(seq 1 "${NUMBER_OF_PARTS}")
  do
    download_and_store_offer_feed "${PART}"
  done
}

main

Download offers feeds by category

Please read the offers by category guide first.

If you just want to download the feeds as regular files by category, you can apply the following logic:

Bash
#!/usr/bin/env bash

# Variables to update
COUNTRY="fr"
TOKEN="your token generated from the Publisher Center"
SHOPPING_API_URL="https://api.kelkoogroup.net/publisher/shopping/v2"

# Global variables
CATEGORY_IDS=""

function get_list_of_categories() {
  echo "Retrieving the list of categories..."
  # Use level 2 or 3, but no deeper, otherwise you won't get all offers
  local CATEGORIES_RESPONSE=$(curl --compressed -H "Authorization: Bearer ${TOKEN}"  -s "${SHOPPING_API_URL}/feeds/category-list?country=${COUNTRY}&format=json&level=3")
  CATEGORY_IDS=$(echo "${CATEGORIES_RESPONSE}" | jq -r '.[].id')
  local NB_CATEGORIES=$(echo "${CATEGORY_IDS}" | wc -l)
  echo "Category list retrieved with ${NB_CATEGORIES} categories"
}

function download_and_store_offer_feed() {
  local CATEGORY_ID="$1"
  echo "Downloading offers feed for categoryID=${CATEGORY_ID}..."
  local FILENAME="offersFeed_${CATEGORY_ID}.json.gz"
  curl -sS -o "${FILENAME}" -H "Accept-Encoding: gzip" -H "Authorization: Bearer ${TOKEN}" "${SHOPPING_API_URL}/feeds/offers?country=${COUNTRY}&categoryId=${CATEGORY_ID}"
  local RES=$?
  if [ "${RES}" = "0" ]; then
    echo "Feed downloaded successfully in ${FILENAME}"
  else
    echo "Error when downloading feed: ${RES}"
  fi
}

function main() {
  get_list_of_categories

  # For each categoryId, call Shopping Feeds and store the response as a file
  echo "${CATEGORY_IDS}" | while read -r CATEGORY_ID; do
    download_and_store_offer_feed "${CATEGORY_ID}"
  done
}

main

Download offers feeds by merchant

If you just want to download the feeds as regular files by merchant, you can apply the following logic:

Bash
#!/usr/bin/env bash

# Variables to update
COUNTRY="fr"
TOKEN="your token generated from the Publisher Center"
SHOPPING_API_URL="https://api.kelkoogroup.net/publisher/shopping/v2"

# Global variables
MERCHANT_IDS=""

function get_list_of_merchants() {
  echo "Retrieving the list of merchants..."
  local MERCHANTS_RESPONSE=$(curl --compressed -H "Authorization: Bearer ${TOKEN}"  -s "${SHOPPING_API_URL}/feeds/merchants?country=${COUNTRY}&format=json")
  MERCHANT_IDS=$(echo "${MERCHANTS_RESPONSE}" | jq -r '.[].id')
  local NB_MERCHANTS=$(echo "${MERCHANT_IDS}" | wc -l)
  echo "Merchants list retrieved with ${NB_MERCHANTS} merchants"
}

function download_and_store_offer_feed() {
  local MERCHANT_ID="$1"
  echo "Downloading offers feed for merchantID=${MERCHANT_ID}..."
  local FILENAME="offersFeed_${MERCHANT_ID}.json.gz"
  curl -sS -o "${FILENAME}" -H "Accept-Encoding: gzip" -H "Authorization: Bearer ${TOKEN}" "${SHOPPING_API_URL}/feeds/offers?country=${COUNTRY}&merchantId=${MERCHANT_ID}"
  local RES=$?
  if [ "${RES}" = "0" ]; then
    echo "Feed downloaded successfully in ${FILENAME}"
  else
    echo "Error when downloading feed: ${RES}"
  fi
}

function main() {
  get_list_of_merchants

  # For each merchantId, call Shopping Feeds and store the response as a file
  echo "${MERCHANT_IDS}" | while read -r MERCHANT_ID; do
    download_and_store_offer_feed "${MERCHANT_ID}"
  done
}

main

Get some offers

If you need to get only a subset of the offers, you might be interested only in a restricted list of merchants or categories. In this case, adapt the function get_list_of_categories() or get_list_of_merchants() to retrieve only the offers you want.
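One way to adapt get_list_of_merchants() is to filter the API response against a whitelist using jq's select(). The sketch below is illustrative only: the merchant ids are made up, and the JSON response is hard-coded in place of the real /feeds/merchants call so the filtering logic stands on its own.

```shell
#!/usr/bin/env bash

# Made-up whitelist of the merchant ids you actually care about
WANTED_IDS='[123, 789]'

# In the real script this JSON would come from the /feeds/merchants call
MERCHANTS_RESPONSE='[{"id":123},{"id":456},{"id":789}]'

# Keep only the ids present in the whitelist
MERCHANT_IDS=$(echo "${MERCHANTS_RESPONSE}" | \
  jq -r --argjson wanted "${WANTED_IDS}" \
    '.[] | select(.id as $i | $wanted | index($i)) | .id')

echo "${MERCHANT_IDS}"
```

The rest of the script (the download loop in main) is unchanged; it simply iterates over a shorter MERCHANT_IDS list.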

Process offers on the fly

If you want to start processing offers as soon as you get them, you can apply the following logic:

Bash
#!/usr/bin/env bash

# Variables to update
COUNTRY="fr"
TOKEN="your token generated from the Publisher Center"
SHOPPING_API_URL="https://api.kelkoogroup.net/publisher/shopping/v2"


# Global variables
MERCHANT_IDS=""

function get_list_of_merchants() {
  echo "Retrieving the list of merchants..."
  local MERCHANTS_RESPONSE=$(curl --compressed -H "Authorization: Bearer ${TOKEN}"  -s "${SHOPPING_API_URL}/feeds/merchants?country=${COUNTRY}&format=json")
  MERCHANT_IDS=$(echo "${MERCHANTS_RESPONSE}" | jq -r '.[].id')
  local NB_MERCHANTS=$(echo "${MERCHANT_IDS}" | wc -l)
  echo "Merchants list retrieved with ${NB_MERCHANTS} merchants"
}

function download_and_process_feed() {
  local MERCHANT_ID="$1"
  echo "Downloading offers feed for merchantID=${MERCHANT_ID}..."
  curl -sS --compressed -H "Accept-Encoding: gzip" -H "Authorization: Bearer ${TOKEN}"  "${SHOPPING_API_URL}/feeds/offers?country=${COUNTRY}&merchantId=${MERCHANT_ID}" | \
    jq -r -c '.[]' | \
    while read -r OFFER; do
      process_offer "${OFFER}"
    done
}

function process_offer() {
  local OFFER="$1"
  # Do whatever you want, OFFER contains an offer as a JSON string
  local OFFER_ID=$( echo "$OFFER" | jq -r .offerId )
  local TITLE=$( echo "$OFFER" | jq -r .title )
  echo "Processing offer ${OFFER_ID} with title ${TITLE}"
}

function main() {
  get_list_of_merchants
    
  # For each merchantId, call Shopping Feeds and process the offers
  echo "${MERCHANT_IDS}" | while read -r MERCHANT_ID; do
    download_and_process_feed "${MERCHANT_ID}"
  done
}

main

The same principle applies to a breakdown by category or by parts.

Multiple downloads in parallel

Although you are limited in terms of parallel downloads, it is possible to perform more than one call at a time if necessary.

The recommended way is to parallelize using the "download offers feeds in parts" method: choose the number of parts you want and download each part in parallel.

To implement this in Bash, you can take advantage of tools such as GNU parallel, or the shell's & and wait operators.
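A minimal sketch of the &/wait approach, assuming the NUMBER_OF_PARTS variable and a download function like the one in the "Download offers feeds in parts" sample. Here download_part is a placeholder so the parallelisation logic stands on its own; in practice it would wrap the curl call from that sample.

```shell
#!/usr/bin/env bash

NUMBER_OF_PARTS="4" # 2, 4, or 8

# Placeholder for the real download_and_store_offer_feed function
function download_part() {
  local PART="$1"
  echo "part ${PART} downloaded"
}

for PART in $(seq 1 "${NUMBER_OF_PARTS}"); do
  download_part "${PART}" & # '&' starts each download as a background job
done

wait # block until every background job has finished
echo "all ${NUMBER_OF_PARTS} parts downloaded"
```

Keep the number of simultaneous jobs within the parallel-download limit mentioned at the top of this page.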