With the proliferation of the internet, PCs, and smartphones, the world has become flooded with data. This creates opportunities for organizations that are creative at finding the right data and manipulating it into something useful. Technology provides many powerful, low-cost tools to help, and web-scraping is one of them. Here are some interesting use cases for the FE&S industry.
KaTom Pricing & Lead-Times
Every FE&S dealer must develop a high-quality, omni-channel experience for their customers. This requires a real-time understanding of how your online offerings compare to those of your biggest competitors. Prices and lead-times are two of the most relevant factors.
As you can see in this video (here), a basic software program can scrape KaTom's website to identify its pricing and lead-times for products within a specific category (heated shelves). This program is only 53 lines of code, took 15-20 minutes to develop, and takes ~10 minutes to run. In addition, reconfiguring it for a different product category takes only a few seconds (changing the target URL).
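The core of a scraper like this is just parsing each product card for a name, a price, and a lead-time. The sketch below shows the idea against a hypothetical HTML snippet; the class names and structure are assumptions, not KaTom's actual markup, and a real program would fetch each category page over HTTP (e.g. with requests) and parse it with a proper HTML parser such as BeautifulSoup rather than a regex.

```python
import re

# Hypothetical product-card markup; KaTom's real HTML will differ.
SAMPLE_HTML = """
<div class="product">
  <span class="name">Heated Shelf 24in</span>
  <span class="price">$1,299.00</span>
  <span class="lead-time">Ships in 5-7 days</span>
</div>
"""

# Regex over the assumed markup; a production scraper would use
# BeautifulSoup selectors instead of a regular expression.
PRODUCT_RE = re.compile(
    r'<span class="name">(.*?)</span>\s*'
    r'<span class="price">\$([\d,.]+)</span>\s*'
    r'<span class="lead-time">(.*?)</span>',
    re.S,
)

def parse_products(html):
    """Extract (name, price, lead_time) tuples from product-card markup."""
    return [
        (name, float(price.replace(",", "")), lead)
        for name, price, lead in PRODUCT_RE.findall(html)
    ]
```

Swapping the category is then just a matter of pointing the fetch step at a different category URL; the parsing logic stays the same.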
OpenTable Restaurant Information
Wouldn't it be great if you had a spreadsheet that listed every restaurant in a city along with its relevant details (food type, quality, price point, street address, contact email, etc.)? This information can be used to identify more capex and resupply business opportunities AND to increase your brand awareness with restaurant owners ahead of future new stores.
This can be achieved by web-scraping OpenTable. OpenTable is the #1 restaurant reservation site in the world, with over 60,000 restaurants on its platform. It is a treasure trove of information for foodservice distributors.
Building this dataset with web-scraping software is a three-step process:
- Select a city/region and web-scrape every restaurant link in that geographic area
- Loop through all of those restaurant links and web-scrape their detailed information
- Loop through the restaurants' own websites (obtained in Step 2) and web-scrape any email addresses listed there
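The three steps above can be sketched as a small pipeline. The function names are illustrative, and the network fetching in Steps 1 and 2 is omitted here; only the Step 3 email extraction is shown in full, since it reduces to a regular expression applied to a page's HTML.

```python
import re

# Loose email pattern; good enough for harvesting contact addresses
# from page text, not a full RFC 5322 validator.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scrape_restaurant_links(city_url):
    """Step 1: fetch the city's listing pages and collect every
    restaurant detail-page URL (HTTP fetching omitted in this sketch)."""
    ...

def scrape_restaurant_details(restaurant_url):
    """Step 2: pull food type, price point, address, and the
    restaurant's own website URL from its detail page (omitted)."""
    ...

def extract_emails(page_html):
    """Step 3: find any email addresses listed in a page's HTML."""
    return sorted(set(EMAIL_RE.findall(page_html)))
```

Each step feeds the next: the links from Step 1 drive Step 2, and the website URLs from Step 2 drive Step 3.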
This software program is only marginally more complex than the KaTom web-scraper. It is approximately 300 lines of code, took ~3 hours to develop, and takes a few hours to run for a large city. The output is a well-formatted Excel spreadsheet that is easy to read and analyze.
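For illustration, here is a minimal sketch of writing the scraped rows out. The column names are assumptions, since the article does not specify the spreadsheet's layout, and while the original program produces an Excel file (e.g. via pandas' to_excel), CSV is used below to keep the sketch dependency-free; Excel can open it directly.

```python
import csv
import io

# Hypothetical columns; the real spreadsheet's layout is not specified.
COLUMNS = ["Name", "Food Type", "Price Point", "Address", "Email"]

def write_rows(rows, fileobj):
    """Write the scraped restaurant rows with a header line."""
    writer = csv.writer(fileobj)
    writer.writerow(COLUMNS)
    writer.writerows(rows)

# Example: one scraped row written to an in-memory buffer.
buf = io.StringIO()
write_rows([("Bistro", "French", "$$", "123 Main St", "info@bistro.com")], buf)
```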
As with the KaTom scraper, once it has been built for one city, reconfiguring it for another takes a few seconds (changing the target URL).
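In code, that reconfiguration amounts to swapping one constant. The URL paths below are made up for illustration and are not OpenTable's actual routing scheme:

```python
# Map of city -> listing URL; switching cities means changing one entry.
# These URLs are placeholders, not OpenTable's real paths.
CITY_URLS = {
    "charlotte": "https://example.com/charlotte-restaurants",
    "nashville": "https://example.com/nashville-restaurants",
}

def target_url(city):
    """Return the listing URL the scraper should start from."""
    return CITY_URLS[city.lower()]
```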
Here is a link to every restaurant (905) in Charlotte, North Carolina: Data Here