Bash script that gets the title of onion sites

I'm writing a bash script that takes a list of ".onion" sites, cURLs each one to get its page title, and then writes the results to a new text file in the following format:

"Page Title" - "xxxxxx.onion"
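
For example, given a list file slist.txt with one onion address per line, the new text file should end up looking something like this (the addresses and titles below are made up purely for illustration):

"Example Hidden Service" - "abcdef1234567890.onion"
"Another Hidden Service" - "qrstuv0987654321.onion"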

Here is my code so far, but I'm stuck on how to get from here to the output I'm after.

#
#
echo 'Must have PCRE installed for grep'
echo ''
#
#
# Check for Onion Router connection
#
RESP1="$(curl --socks5-hostname localhost:9150 -s 'https://check.torproject.org')"
#
# echo $RESP1 # DEBUG
RESP2=$(echo "$RESP1" | grep -m 1 "Congratulations" | xargs)
# echo $RESP2 # DEBUG
if [ "$RESP2"="Congratulations. This browser is configured to use Tor." ]
  then
        echo "Connected to the Onion Router"
    else
        echo "Failed to connect to the Onion Router" 
      exit 1
fi
# Grab raw html of site
RESP3="$(xargs -n 1 curl --socks5-hostname localhost:9150 -so - < slist.txt)"
# RESP3="$(curl --socks5-hostname localhost:9150 "$site" -so - )" # OLD
# Grep for title
RESP4=$(echo "$RESP3" | grep -iPo '(?<=<title>)(.*)(?=</title>)')
#
echo "$RESP4"

It looks like you're almost there. If you replace the last part of your script with:

cat slist.txt \
| while read -r url; do
    # Grab raw html of site
    RESP3="$(curl --socks5-hostname localhost:9150 -so - $url)"
    # Grep for title
    RESP4=$(echo "$RESP3" | grep -iPo '(?<=<title>)(.*)(?=</title>)')
    #
    echo "$RESP4 - $url"
done

then you only need to redirect the script's output to a text file. Is that what you're after, or have I missed something?
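
For completeness, here is one way the whole loop might be put together with the output redirected to a file. The output filename titles.txt, the 30-second timeout, and the "No title found" fallback are assumptions added for illustration rather than part of the original scripts:

#!/bin/bash
# Read each onion address from slist.txt, fetch it through the local Tor
# SOCKS proxy, extract the <title> text, and append one line per site to
# titles.txt in the format: "Page Title" - "xxxxxx.onion"
while read -r url; do
    # --max-time keeps a dead hidden service from hanging the loop (assumed value)
    html="$(curl --socks5-hostname localhost:9150 -s --max-time 30 -o - "$url")"
    # Non-greedy match plus head -n 1 so only the first <title> element is taken
    title="$(printf '%s' "$html" | grep -iPo '(?<=<title>)(.*?)(?=</title>)' | head -n 1)"
    # Fall back to a placeholder when the page has no title
    echo "\"${title:-No title found}\" - \"$url\""
done < slist.txt > titles.txt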