So I have around 12 ESP32s with relays and temperature sensors for my smart home. Each of them publishes its temperature to Node-RED via MQTT every few seconds.
Could these be slowing down my network? Over the past months (as I've been amassing the ESPs) my TP-Link router's connection has been degrading. Is MQTT in theory going to significantly slow a network, or is it likely just how I'm implementing it?
No, not at that rate. If you were publishing thousands of messages a second, then possibly.
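To put a rough number on "not at that rate": here is a back-of-envelope sketch. The message size and publish interval below are assumptions for illustration, not measurements from the thread.

```cpp
// Rough traffic estimate for periodic MQTT publishes.
// All inputs are assumed example values, not measured ones.
double mqttBytesPerSecond(double devices, double bytes_per_msg, double interval_s) {
    return devices * bytes_per_msg / interval_s;
}

// 12 devices, ~100 bytes per publish (topic + payload + MQTT/TCP/IP
// overhead), one publish every 3 seconds:
//   mqttBytesPerSecond(12, 100, 3) -> 400 B/s, i.e. about 3.2 kbit/s,
// which is negligible next to even a congested 2.4 GHz Wi-Fi link.
```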
No, that’s why MQTT is called “lightweight”.
But the frequency is only a small factor. Even though I don't think it's the problem here, keep in mind that every device holds one permanently open connection.
Or perhaps you have an infinite loop somewhere? But even then, that would slow down the client, not the network.
Use MQTT Explorer and see how many messages are actually transferred over your broker.
Updates every few seconds only show you the noise of the sensor, with no other benefit. I oversample the readings over one minute on the ESP and send only that averaged value. That gives a stable, noise-free data stream.
It's good for detecting temperature changes at a finer level without being misled by sensor noise.
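The oversampling approach described above can be sketched roughly like this. The class name, the window size, and the API are illustrative assumptions, not from the post:

```cpp
#include <cstddef>

// Accumulate raw readings on the ESP and emit only the windowed
// average, e.g. 60 one-second samples -> one value per minute.
class Oversampler {
    double sum = 0;
    std::size_t count = 0;
    const std::size_t window;
public:
    explicit Oversampler(std::size_t window_size) : window(window_size) {}

    // Feed one raw reading; returns true when an averaged value is
    // ready in average_out, then resets for the next window.
    bool add(double reading, double &average_out) {
        sum += reading;
        if (++count < window) return false;
        average_out = sum / count;
        sum = 0;
        count = 0;
        return true;
    }
};
```

On the device you would call `add()` once per raw reading and, whenever it returns true, publish the averaged value through your MQTT client.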
That depends what you are measuring. For room temperature you are correct, but if you try to control, for example, radiator temperature (heated by flowing hot water) then the temperature can change dramatically over just a few seconds.
You are right. That raises the question of whether interval updates at some fixed frequency make sense at all.
An example: my garden irrigation system has a pump, two pressure sensors (before and after a filter), and a flow sensor. Most of the time there is nothing to chat about (only state messages saying everything is fine).
But when it springs into action, frequent measurements become interesting.
Or my thermostats: they only send an update on relevant changes (0.2 °C) to stay within the sending limit (duty cycle) on the 868 MHz band.
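The change-based reporting described above amounts to a deadband filter: only transmit when the value has moved by at least a threshold since the last transmitted value. A minimal sketch, with an illustrative class name not taken from the post:

```cpp
#include <cmath>

// Report a value only when it differs from the last *sent* value by
// at least `threshold` (e.g. 0.2 for 0.2 °C, as in the post).
class DeadbandReporter {
    double last_sent = 0;
    const double threshold;
    bool sent_once = false;
public:
    explicit DeadbandReporter(double thresh) : threshold(thresh) {}

    // Returns true if this reading should be transmitted; the first
    // reading is always sent so receivers get an initial value.
    bool shouldSend(double value) {
        if (!sent_once || std::fabs(value - last_sent) >= threshold) {
            last_sent = value;
            sent_once = true;
            return true;
        }
        return false;
    }
};
```

Comparing against the last value actually sent (rather than the previous sample) matters: a slow drift still triggers an update once it accumulates past the threshold, instead of being suppressed forever.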
So we have beefy dual-core ESPs (though I still think an ESP8266 is sufficient for most home-automation tasks), so let them do more than dumb reporting on a fixed interval, filling up databases or travelling through the whole network and all the nodes only to be trashed at the end (or sitting absolutely invisible on a dashboard node, slowing everything down in some way).
Data (acquisition) is cheap; processing and intelligent (?) decision making are not.
No, definitely not fixed the same for the whole system. Update rates should be set per parameter based on the requirements for that measurement. Sometimes that does mean fetching some parameters more frequently than necessary, because it may not be practical to fetch one parameter from a device without fetching others at the same time. In that case the unneeded samples should be dropped before being, for example, saved in a database.
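Dropping the unneeded samples before storage can be as simple as a per-parameter decimator. The 1-in-N scheme and the names below are illustrative assumptions, not something specified in the reply:

```cpp
// Keep one sample out of every `keep_every` for a parameter that is
// fetched more often than its storage rate requires.
class Decimator {
    unsigned counter = 0;
    const unsigned keep_every;
public:
    explicit Decimator(unsigned n) : keep_every(n) {}

    // Returns true for the samples that should reach the database,
    // starting with the very first one.
    bool keep() { return counter++ % keep_every == 0; }
};
```

You would hold one `Decimator` per over-fetched parameter and only write to the database when `keep()` returns true, leaving the other parameters from the same device read at their full rate.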
The wired LAN should not be impacted by a few packets from 12 ESPs. Try Wireshark and be amazed at how much data is continuously transferred over a typical network.
But perhaps you mean the 2.4 GHz band, as that could be saturated.
How many clients are there in total?
Most standard SoHo routers are limited in the number of wireless clients they support.
This is not a matter of bandwidth but simply of the number of concurrent connections... you should examine the clients and check how often they actually need to re-establish the link.
Edit: see here: https://groups.google.com/d/msg/sonoffusers/9PJnwo0SGD0/XLsIW1pFAQAJ
This topic has been part of a series of recent forum spams, so closing this one.