I’ll introduce HTTrack, a tool for downloading entire websites for offline browsing, which I installed and tried on macOS.
 
I’ll use HTTrack version 3.49-2 (05/20/2017), which can be installed with Homebrew:

$ brew install httrack
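To confirm the installation and see the full list of options (the same list the interactive session below points to), run:

$ httrack --help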
Here’s a practical example of using httrack interactively to mirror all links within a website.
$ httrack
Welcome to HTTrack Website Copier (Offline Browser) 3.49-2
Copyright (C) 1998-2017 Xavier Roche and other contributors
To see the option list, enter a blank line or try httrack --help
Enter project name :your_project_name
Base path (return=/Users/yourname/websites/) :
Enter URLs (separated by commas or blank spaces) :https://example.com
Action:
(enter)  1  Mirror Web Site(s)
  2  Mirror Web Site(s) with Wizard
  3  Just Get Files Indicated
  4  Mirror ALL links in URLs (Multiple Mirror)
  5  Test Links In URLs (Bookmark Test)
  0  Quit
: 4
Proxy (return=none) :
You can define wildcards, like: -*.gif +www.*.com/*.zip -*img_*.zip
Wildcards (return=none) :
You can define additional options, such as recurse level (-r), separated by blank spaces
To see the option list, type help
Additional options (return=none) :
---> Wizard command line: httrack https://example.com  -O "/Users/yourname/websites/your_project_name" --mirrorlinks  -%v  
Ready to launch the mirror? (Y/n) :Y
Mirror launched on Fri, 18 Oct 2019 13:32:47 by HTTrack Website Copier/3.49-2 [XR&CO'2014]
mirroring https://example.com with the wizard help..
Done.
Thanks for using HTTrack!
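Incidentally, the wizard just assembles a command line (shown above after "Wizard command line:"), so on later runs you can skip the prompts and pass the same arguments directly. Here, -O sets the output directory, --mirrorlinks corresponds to action 4 (mirror all links in the URLs), and -%v displays downloaded filenames in real time:

$ httrack https://example.com -O "/Users/yourname/websites/your_project_name" --mirrorlinks -%v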
With the above settings, the mirror was saved to /Users/yourname/websites/your_project_name. Open the index.html file in that directory with a browser to display the top page of the downloaded website:
$ cd /Users/yourname/websites/your_project_name
$ open index.html
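If some pages in the mirror use absolute paths and don’t render correctly when opened straight from disk, serving the directory over a local HTTP server can help. A minimal sketch assuming python3 is available (the port number 8000 is arbitrary):

$ cd /Users/yourname/websites/your_project_name
$ python3 -m http.server 8000

Then open http://localhost:8000/ in a browser.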
That’s all from the Gemba.