How to download a website and convert to PDF?

I want to convert all of the pages on this site to PDF:

… but doing this manually will take a long time.

Is there any way to convert all of the pages on that site into a single document or PDF?

Haven’t tested this but my approach would be something like:

1.) Download the entire website:

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent https://developer.bricsys.com/bricscad/help/en_US/V25/DevRef/
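wget saves the mirror under a directory named after the host (here, developer.bricsys.com). A quick sanity check that pages were actually fetched (prints 0 if the download has not run yet):

```shell
# Count the HTML pages wget saved; the directory name comes from
# the site's hostname. stderr is silenced so a missing directory
# just yields 0 instead of an error.
find developer.bricsys.com -type f -name '*.html' 2>/dev/null | wc -l
```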

2.) Batch-convert the .html pages to PDF:

find . -name "*.html" -print0 | while IFS= read -r -d '' html_file; do
    # Derive the output filename by swapping the extension
    output_pdf="${html_file%.html}.pdf"

    echo "Converting: $html_file -> $output_pdf"

    # Convert using wkhtmltopdf (newer versions block local file
    # access by default, so local CSS/images need this flag)
    wkhtmltopdf --enable-local-file-access "$html_file" "$output_pdf"
done

You'll need wkhtmltopdf installed first, e.g. sudo dnf install wkhtmltopdf on Fedora.
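If the goal is one single PDF rather than one per page, the per-page PDFs can be merged afterwards. A sketch using Ghostscript (my assumption; pdfunite from poppler-utils would work too), with combined.pdf as a hypothetical output name:

```shell
# Merge every generated PDF into a single document with Ghostscript.
# Files are concatenated in alphabetical path order; adjust the sort
# if the pages should appear in a different sequence.
# xargs -r skips the merge entirely when no PDFs were found.
find . -name '*.pdf' -print0 | sort -z | xargs -0 -r \
    gs -dBATCH -dNOPAUSE -q -sDEVICE=pdfwrite -sOutputFile=combined.pdf
```

Note that on a rerun, combined.pdf itself matches the find pattern, so either delete it first or exclude it with -not -name 'combined.pdf'.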
