Updated: Oct 23
Scenario: Every bank creates offers for its customers and publishes them on its website and apps. Customers find it challenging to discover these offers and redeem them before they expire, especially when they hold multiple credit cards and must figure out which card to use at retail stores and restaurants.
The Ask: Consolidate offers from all banks into a single place; classify them by bank, card type, and business domain (travel, dining, etc.); and keep them in sync with each bank's website. Banks do not publish these offers via APIs or feeds, so the offers must be extracted from the websites, interpreted, and standardized so they can be presented to the customer in the same way irrespective of the bank.
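Standardizing offers across banks implies a common schema that every extracted offer is mapped into. A minimal sketch of such a schema, assuming illustrative field names (the actual model is not specified in this brief):

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical normalized offer schema; field names are illustrative
# assumptions, not a published standard.
@dataclass
class Offer:
    bank: str          # e.g. "ExampleBank"
    card_type: str     # e.g. "credit", "debit"
    domain: str        # business domain: "travel", "dining", ...
    merchant: str
    description: str
    expires_on: date   # lets the sync job drop stale offers

    def is_active(self, today: date) -> bool:
        return today <= self.expires_on

offer = Offer("ExampleBank", "credit", "dining", "Sample Diner",
              "10% off weekday lunch", date(2025, 12, 31))
print(offer.is_active(date(2025, 10, 23)))  # True while before expiry
```

Keeping the schema bank-agnostic is what lets offers from different banks be presented identically to the customer.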
Solution: A low-code crawler configures websites through a simple ruleset and is extensible with code at runtime, so new websites can be onboarded quickly without extensive code changes. The crawler mimics real user behavior so it can handle both static and dynamic websites, including those where offers become visible only after a few user interactions. It runs as a cluster of virtual browsers handling multiple pages in parallel, feeding a pipeline in which offers are identified, their data extracted, and the result normalized to the standard format.
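The "simple ruleset plus code hooks" idea can be sketched as a declarative rule table per site, with an optional callable for site-specific quirks. Everything here is an illustrative assumption: a real crawler would use a headless browser and CSS/XPath selectors rather than regexes over raw HTML, and the site name, rules, and field names are invented for the example:

```python
import re

# Per-site ruleset: a block pattern to isolate each offer, field patterns to
# extract data, and a "post_process" code hook for runtime extensibility.
RULESETS = {
    "examplebank.com": {
        "offer_block": re.compile(r'<div class="offer">(.*?)</div>', re.S),
        "fields": {
            "merchant": re.compile(r"<h3>(.*?)</h3>"),
            "description": re.compile(r"<p>(.*?)</p>"),
        },
        # Runtime hook: classify the business domain from the description.
        "post_process": lambda o: {
            **o,
            "domain": "dining" if "restaurant" in o["description"].lower() else "other",
        },
    },
}

def extract_offers(site: str, html: str) -> list[dict]:
    """Apply a site's ruleset to raw HTML and emit normalized offer dicts."""
    rules = RULESETS[site]
    offers = []
    for block in rules["offer_block"].findall(html):
        offer = {name: (m.group(1).strip() if (m := rx.search(block)) else "")
                 for name, rx in rules["fields"].items()}
        offers.append(rules["post_process"](offer))
    return offers

html = ('<div class="offer"><h3>Sample Diner</h3>'
        "<p>10% off at this restaurant</p></div>")
print(extract_offers("examplebank.com", html))
```

Because onboarding a new site means adding a ruleset entry (and at most a small hook) rather than writing a new crawler, the per-site effort stays low while the shared browser cluster and normalization pipeline remain unchanged.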