I have a temp sensor, insulated from underhood air/windflow, on the alternator casing/stator of my externally regulated alternator, and I spin a potentiometer on the dashboard to control that regulator's maximum allowed voltage, which of course controls the field current and ultimately the alternator's amperage output.
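If you want to picture that chain (dial position -> voltage ceiling -> field current -> output amps), here's a rough toy model in Python. The linear mappings, gains, and rpm scaling are placeholder assumptions for illustration only, not how my regulator or alternator actually behaves.

```python
# Toy sketch of the control chain: dashboard pot -> regulator voltage ceiling
# -> field current -> alternator output. All gains/scalings are assumed
# illustrative values, not measurements from my setup.

def voltage_ceiling(dial_position, v_low=13.0, v_high=14.7):
    """Map dial position (0.0 to 1.0) to the regulator's maximum allowed voltage."""
    return v_low + dial_position * (v_high - v_low)

def field_current(ceiling_v, system_v, max_field_a=4.0):
    """Regulator raises field current when system voltage sags below the ceiling.
    Simple proportional stand-in for the real regulator's behavior."""
    error = max(0.0, ceiling_v - system_v)
    return min(max_field_a, error * 2.5)      # 2.5 A of field per volt of error: assumed gain

def alternator_output(field_a, alternator_rpm, max_output_a=120.0):
    """Output amps rise with field current and shaft speed, up to the alternator's rating."""
    return min(max_output_a, field_a * (alternator_rpm / 2000.0) * 25.0)   # assumed scaling

print(alternator_output(field_current(14.7, 13.9), alternator_rpm=6000))   # highway-ish speed
print(alternator_output(field_current(14.7, 13.9), alternator_rpm=2000))   # idle-ish speed
```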
At 65 mph, with the alternator maxed out in the 120 amp range, the stator will not exceed 140°F. Suspecting that reading was artificially low, I added more insulation to protect the thermocouple from airflow, yet the number stayed nearly exactly the same.
At hot idle, with ~50 amps maximum output, stator temperature will climb to over 200°F in about 4 minutes and would likely keep rising, but at that point I lower the voltage (and thus the amperage into the well-depleted battery) or shut off the engine, as I do not like idling excessively for any reason, not even testing.
Obviously rectifier temperature will be different, as will every vehicle at different speeds and alternator outputs. The point is that heat does kill, and idling at max output makes much more heat than highway speeds do. On my vehicle, speeds under 25 mph yield little temperature difference compared to idling parked.
A voltage regulator seeking to bring system voltage to 14.7v is asking an alternator that is charging depleted batteries to work much harder, for much longer, than simply maintaining 13.6v would. I spin a dial on my dash and can watch voltage and amperage change in accordance with my wishes.
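To make that voltage/amperage point concrete, here's a crude back-of-the-envelope comparison: treat the battery as a resting voltage behind a lumped wiring/internal resistance, so charge current is roughly (setpoint minus battery voltage) divided by that resistance, capped at the alternator's rating. The resistance and battery voltages below are assumed round numbers, not measurements.

```python
# Crude comparison of charge current at two regulator setpoints.
# Battery modeled as a resting voltage behind a lumped resistance; both values
# are assumed round numbers for illustration, not measurements.

ALTERNATOR_LIMIT_A = 120.0

def charge_current(setpoint_v, battery_rest_v, lumped_resistance_ohm=0.02):
    """Very rough: amps ~ voltage difference / resistance, capped at the alternator."""
    amps = max(0.0, (setpoint_v - battery_rest_v) / lumped_resistance_ohm)
    return min(amps, ALTERNATOR_LIMIT_A)

for setpoint in (13.6, 14.7):
    for rest_v in (12.1, 12.6, 13.2):   # depleted -> mostly charged (assumed values)
        print(f"setpoint {setpoint} V, battery at {rest_v} V: "
              f"~{charge_current(setpoint, rest_v):.0f} A")
```

Even in a model this crude, the 13.6v setpoint tapers off quickly as the battery comes up, while 14.7v keeps the alternator working hard well into the charge.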
My observations usually irritate people who have an incomplete understanding of the relationship between voltage and amperage when charging batteries. It's sad, yet amusing, that such people give advice and present themselves as authorities. I'm referring to people on automotive-based forums, not necessarily this one, so nobody needs to get their panties all bunched up someplace unpleasant.