What's the best way to capture website login details/parameters for use in the HTTP request node?

Guys,

Want to automate some downloading and saving of website data. The site presents a basic login form over HTTPS where I input a username/password combo, and there is currently no API provided. How can I capture what is being input so I can use the HTTP request node to automate the same functions?

Once I have logged in, there is an additional form for selecting the parameters to be used in the downloaded report, which I will then need to tackle.

[EDIT] Having asked my friend ChatGPT, it suggests using the puppeteer node for NR and handing off to it for the heavy lifting. Would others agree with this approach?

Craig

Craig

It may depend on how the website behaves. If it is a 'simple' login form that sets a cookie, it can be handled quite easily with http request nodes. Puppeteer is another option, but it is slower because it drives a real browser to emulate a user.
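To make that concrete, here is a minimal sketch of a function node you could put in front of an http request node to submit the login form. The endpoint, field names, and credentials are all assumptions — you would replace them with whatever the web inspector shows the real form posting:

```javascript
// Sketch of a Node-RED function node that prepares a login POST for a
// downstream http request node. The URL and field names are hypothetical;
// check the actual form submission in the web inspector.
function prepareLogin(msg) {
    msg.url = "https://example.com/login";   // hypothetical login endpoint
    msg.method = "POST";
    msg.headers = { "Content-Type": "application/x-www-form-urlencoded" };
    msg.payload = "username=" + encodeURIComponent("myuser") +
                  "&password=" + encodeURIComponent("mypass");
    return msg;
}
```

In the http request node itself, leave the method set to "set by msg.method" and the URL blank so the values above are used. Ticking "Enable connection keep-alive" (or handling the cookie yourself, as below) lets the session carry over to the next request.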

It will be a matter of using the web inspector to determine what request is fired off when you make your parameter selections for the report, and to see where the page goes. From the web inspector you can export a HAR file, open it in a text editor of your choice, and reverse engineer the request.
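Once the HAR tells you what the report request looks like, the usual pattern is: grab the session cookie from the login response, stash it in flow context, and attach it to the follow-up request. A rough sketch, with all endpoint names and report parameters as placeholders for whatever the HAR actually shows:

```javascript
// Function node 1: after the login http request node, capture the
// Set-Cookie header(s) and store the session cookie in flow context.
function storeSessionCookie(msg, flow) {
    const setCookie = (msg.headers && msg.headers["set-cookie"]) || [];
    // Keep only the name=value part of each cookie, drop attributes like Path
    const cookie = setCookie.map(c => c.split(";")[0]).join("; ");
    flow.set("sessionCookie", cookie);
    return msg;
}

// Function node 2: build the report request, replaying the stored cookie.
// URL and payload fields are hypothetical; copy the real ones from the HAR.
function prepareReportRequest(msg, flow) {
    msg.url = "https://example.com/report";   // hypothetical report endpoint
    msg.method = "POST";
    msg.headers = { "Cookie": flow.get("sessionCookie") };
    msg.payload = { from: "2024-01-01", to: "2024-01-31" };
    return msg;
}
```

In a real flow you would use the built-in `flow` context object rather than passing it in; it is a parameter here only so the sketch is self-contained.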