In my other articles on web scraping with Puppeteer, the code works nicely as long as it never encounters any errors. But once it hits an error, most likely a timeout while waiting for page content to load, like the example below at line 13, the whole scraper hangs and causes a memory leak.
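In broad strokes, the unguarded pattern looks something like the sketch below (the URL, selector, and timeout are placeholders rather than the values from the original example). If waitForSelector exceeds its timeout, the promise rejects, nothing catches the rejection, and browser.close() never runs, so the headless browser stays alive in the background.

```js
const puppeteer = require('puppeteer');

// Minimal sketch of a scraper with no error handling.
// The URL and selector are placeholders for illustration only.
async function scrape() {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto('https://example.com');
  // If this selector never shows up, waitForSelector rejects with a TimeoutError
  await page.waitForSelector('#slow-content', { timeout: 5000 });
  const text = await page.$eval('#slow-content', el => el.textContent);
  console.log(text);
  await browser.close(); // never reached if any await above rejects
}

// Called without .catch(), so a timeout becomes an unhandled promise rejection
scrape();
```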
These are the error messages you will most likely encounter:
UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch().
And the accompanying deprecation warning:
DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code
The solution is to move the scraping code into an async function, assign that function to a variable, and then call it, handling the returned promise with .then() and .catch() as shown below.
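Here is a minimal sketch of that pattern, reusing the placeholder URL and selector from above. The try/finally block that closes the browser is an extra safeguard against the leak described earlier, not something .catch() provides on its own.

```js
const puppeteer = require('puppeteer');

// The scraping logic lives in an async function assigned to a variable.
const scrape = async () => {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    await page.goto('https://example.com');                         // placeholder URL
    await page.waitForSelector('#slow-content', { timeout: 5000 }); // may still time out
    return await page.$eval('#slow-content', el => el.textContent);
  } finally {
    await browser.close(); // always close the browser, even when the wait rejects
  }
};

// The caller handles the returned promise, so a timeout is caught
// instead of surfacing as an UnhandledPromiseRejectionWarning.
scrape()
  .then(text => console.log('Scraped:', text))
  .catch(err => console.error('Scraping failed:', err.message));
```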