Why is this simple web crawl failing?
I'm testing HtmlAgilityPack with a very simple snippet, and it still fails and I can't understand why:
var html2 = @"http://www.monki.com/en_sek/newin/view-all-new.html";
HtmlWeb web2 = new HtmlWeb();
var htmldoc2 = web2.Load(html2);
var node2 = htmldoc2.DocumentNode.SelectSingleNode("//head/title");
Console.WriteLine("\n\n\n\n");
Console.WriteLine("Node Name2: " + node2.Name + "\n" + node2.OuterHtml + "\n" + node2.InnerText);
I have of course checked that the page has a head and a title, but node2 is still null and I can't figure out why.
The web page appears to be trying to set a cookie. See also this answer with the same problem:
var loader = new HtmlWeb{ UseCookies = true };
var doc = loader.Load(@"http://www.monki.com/en_sek/newin/view-all-new.html");
var node2 = doc.DocumentNode.SelectSingleNode("//head/title");
Console.WriteLine("\n\n\n\n");
Console.WriteLine("Node Name2: " + node2.Name + "\n" + node2.OuterHtml + "\n" + node2.InnerText);