Point/Counterpoint on these forums has become so tiresome.
So many of these threads turn into pissing matches when two sides each take absolute but opposite positions. Many times, when I look into the stuff myself, the reality seems to fall somewhere in between: the issue being argued may well be "scientifically possible" while still not being "likely to be audible in a hi-fi system."
A by-product of these arguments is that manufacturers prey on these debates and develop (expensive) "solutions" marketed as miracle cures for whatever alleged issue is somehow limiting a listener's satisfaction with their hi-fi system. How many "audiophiles" have a box of no-longer-used "miracle cures" on their shelf? Just go back and read posts from 5, 10, or 15 years ago about audiophile doo-dads that were considered "almost mandatory" yet are either no longer being used or have been replaced by different (and usually much more expensive) doo-dads. Examples might include certain fuses, DBS wire biasing, Shakti this or that, crystals, certain USB and Ethernet filters, etc., etc.
This Ethernet cable issue seems to have some basis in reality in that yes, running an Ethernet cable next to certain types of power cables/lines can cause EMI and, yes, in certain cases this can cause minor to major degradation of a digital signal. However, that specific situation is unlikely in practice and can be wholly avoided by separating the Ethernet cable from the power line by even a small distance. In addition, any corrupted frames are caught by Ethernet's built-in error detection (each frame carries a CRC-based frame check sequence); bad frames are simply dropped, and higher-level protocols like TCP retransmit the missing data, so the bits that reach the streamer are intact. As well, the Ethernet spec (IEEE 802.3) has specific requirements for isolation and resistance to current and voltage spikes to maintain performance in variable EMI and RFI scenarios as well as for electrical safety. As a result, the risk of typical Ethernet cables properly run in a residential environment affecting an audio signal (while not impossible) is probably somewhere between minimal and non-existent. Here is an interesting discussion of the issue, and here is some related stuff on the Roon Labs Community forum.
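For anyone curious, the detect-and-retransmit mechanism is easy to sketch. This is just an illustration, not real networking code: it uses Python's `zlib.crc32` as a stand-in for the Ethernet frame check sequence, and the "EMI hit" is a hand-flipped byte.

```python
import zlib

def frame_ok(payload: bytes, fcs: int) -> bool:
    """Ethernet-style integrity check: recompute the CRC over the payload
    and compare it to the frame check sequence that was sent with it.
    A mismatch means the receiver silently drops the frame (Ethernet
    never 'repairs' bits); TCP then notices the gap and retransmits."""
    return zlib.crc32(payload) == fcs

data = b"audio stream packet"
fcs = zlib.crc32(data)  # sender appends this to the frame

print(frame_ok(data, fcs))           # True  -> frame delivered as-is
corrupted = b"audio streaM packet"   # one corrupted byte (the "EMI hit")
print(frame_ok(corrupted, fcs))      # False -> frame dropped, data resent
```

The practical upshot: interference either has no effect at all, or it costs a retransmission; it does not subtly "degrade" the bits that ultimately arrive.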
"But I hear it in my system!" This starts round two of these debates pitting auditory abilities and system capabilities on one side, against possible psychological factors affecting what people believe they hear on the other. There is no winning these arguments since the two sides cannot agree on suitable measurable metrics, or even agreeable protocols for conducting listening tests. On one side, "If you cannot accurately select the cable/tweak/etc. statistically better than 7 out of 10 times, then how can it be making any difference?" The other side says, "DBT doesn't work, and only through long-term listening in your own system can you accurately hear what something is doing." The real problem is not about measurements and protocols but rather that everybody wants to be "right."
The only solution seems to be agreeing on no solution. IOW, share observations and experiences, and let go of the need to be right. If somebody wants to spend an exorbitant amount of money on a rock that is sold to improve SQ when placed in the same room as an audio system, then simply say, "cool, enjoy your rock."