I'm trying to use Node-RED to scrape the ingredient lists of various products from a supermarket's website. When I send an HTTP request to the supermarket's website, I receive the response "Access Denied" followed by a link to the edgesuite website.
People who have reported similar issues usually found that a missing User-Agent header was the root cause. Indeed, when I explicitly set the User-Agent in a curl request to the supermarket's website:
curl -v "https://www.ah.nl" -H "User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:132.0) Gecko/20100101 Firefox/132.0"
then the error message disappears. However, adding the same User-Agent to the HTTP request node in Node-RED still returns the same error. I don't have any issues requesting other websites with Node-RED. Any suggestions? Cheers
[
    {
        "id": "8ef4745c1aaeba27",
        "type": "http request",
        "z": "57a70889b0fabe0a",
        "name": "",
        "method": "GET",
        "ret": "txt",
        "paytoqs": "ignore",
        "url": "https://www.ah.nl",
        "tls": "",
        "persist": false,
        "proxy": "",
        "insecureHTTPParser": false,
        "authType": "",
        "senderr": false,
        "headers": [
            {
                "keyType": "User-Agent",
                "keyValue": "",
                "valueType": "other",
                "valueValue": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:132.0) Gecko/20100101 Firefox/132.0"
            }
        ],
        "x": 650,
        "y": 120,
        "wires": [
            [
                "4a864bfb9a60c86e"
            ]
        ]
    }
]
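For completeness, my understanding is that the http request node also picks up headers from `msg.headers`, so a Function node wired in front of it should be able to set the User-Agent as well. A minimal sketch, reusing the same URL and User-Agent string from my flow (in a real Function node the incoming `msg` is provided and the node ends with `return msg;`):

```javascript
// Sketch of a Function node body placed before the http request node.
// The http request node applies msg.headers to the outgoing request,
// so this sets the User-Agent without touching the node's config dialog.
const msg = { url: "https://www.ah.nl" }; // stand-in for the incoming msg
msg.headers = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:132.0) Gecko/20100101 Firefox/132.0"
};
// return msg;  // a real Function node would end with this
```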