Many infosec professionals traveled to Las Vegas to attend the Black Hat security conference. As I’m not one of those lucky people, I’m waiting for the presentations (they are published once the talk is completed). But I don’t have time to lose sitting in front of my computer pressing F5… So let’s crawl them automatically…
#!/bin/bash
curl -s https://www.blackhat.com/us-17/briefings.html \
| egrep 'https://www.blackhat.com/docs/us-17/.*\.pdf' \
| awk -F '"' '{ print $4 }' \
| while read URL
do
  F=$(basename $URL)
  if [ ! -r $F ]; then
    curl -s -o $F $URL
    echo "Scrapped $F"
  fi
done
Quick and dirty but it works…
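The awk step relies on the PDF link being the fourth double-quoted field of the matching line. A minimal sketch of that extraction on a canned line (the HTML below is a made-up example mimicking the briefings page markup, not copied from the real page):

```shell
#!/bin/bash
# Hypothetical sample line (assumption, not the real Black Hat HTML):
# the PDF URL sits inside the href attribute.
LINE='<a class="pdf" href="https://www.blackhat.com/docs/us-17/thursday/us-17-demo.pdf">Slides</a>'

# Splitting on double quotes with -F '"' gives:
#   $1 = <a class=   $2 = pdf   $3 =  href=   $4 = the URL   $5 = >Slides</a>
echo "$LINE" | awk -F '"' '{ print $4 }'
```

If the site ever changes its markup (extra attributes before href, single quotes), the field number shifts and the extraction silently breaks, so it’s worth re-checking the page source when the script stops finding URLs.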
Tx! No idea why the [] were lost in the middle of the copy/paste 🙂
Thanks! Worked with my bash after adjusting it to:
#!/bin/bash
curl -s https://www.blackhat.com/us-17/briefings.html\
| egrep 'https://www.blackhat.com/docs/us-17/.*\.pdf' \
| awk -F '"' '{ print $4 }' \
| while read URL; do
  F=$(basename $URL)
  if [ ! -r $F ]; then
    curl -s -o $F $URL
    echo "Scrapped $F"
  fi
done