Just because it has been debated extensively doesn't mean it's not a wart.
Having run into the "reverse range does not contain what you obviously expect it to contain" problem before and wasted a few hours on it, like many other people have and will continue to do, I definitely want to call it a wart.
IMO, the current behavior is correct: it would be absolutely horrible if a lower bound greater than the upper bound suddenly reversed the iteration order instead of producing an empty range (think dynamic ranges, not hardcoded ones). Perhaps the operator should be .>>., not .., as an improvement.
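To illustrate the point about dynamic ranges: a sketch (hypothetical values, not from the comment) of why the empty-range behavior is the safe default, with explicit `.rev()` when reversed iteration is actually wanted:

```rust
fn main() {
    // Imagine these bounds came from user input or a computation,
    // so we can't know statically which one is larger.
    let lo = 5;
    let hi = 3;

    // Today: an "inverted" range is simply empty, never silently reversed.
    assert_eq!((lo..hi).count(), 0);

    // Reversed iteration must be requested explicitly.
    let backwards: Vec<i32> = (hi..lo).rev().collect();
    assert_eq!(backwards, vec![4, 3]);
}
```

If `lo..hi` silently iterated backwards when `lo > hi`, a loop over a dynamically computed range could execute in the wrong direction instead of zero times, which is usually much harder to debug.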
Rust tries to be conservative with its semantics. You can always create your own range type and implement Deref/From for it to convert it into a Rust Range.
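A minimal sketch of that suggestion, assuming you want a range that normalizes its bounds (the `OrderedRange` type and its behavior are hypothetical, not part of std):

```rust
use std::ops::Range;

/// Hypothetical "ordered" range that normalizes its bounds,
/// so OrderedRange::new(5, 1) covers the same values as 1..5.
struct OrderedRange {
    lo: i32,
    hi: i32,
}

impl OrderedRange {
    fn new(a: i32, b: i32) -> Self {
        if a <= b { Self { lo: a, hi: b } } else { Self { lo: b, hi: a } }
    }
}

// Conversion into the std Range, as the comment suggests.
impl From<OrderedRange> for Range<i32> {
    fn from(r: OrderedRange) -> Range<i32> {
        r.lo..r.hi
    }
}

fn main() {
    let r: Range<i32> = OrderedRange::new(5, 1).into();
    assert_eq!(r, 1..5);
    assert_eq!(r.sum::<i32>(), 1 + 2 + 3 + 4);
}
```

The point is that the swap-on-construction policy lives in your own type, while the std `Range` you hand to other APIs keeps its documented semantics.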
It seems like in this case Rust actually failed to act conservatively in its semantics, by allowing ranges over overly generic types in a way that doesn't make sense.
I think the problem here is more that the semantics are documented in the "documentation", but run contrary to the intuition the type system suggests.
`.contains()` is only implemented for `Range<Idx: PartialOrd>`, which to me implies that when checking whether a value is contained in the Range, it has enough knowledge about the ordering that it should still be able to do a bounds check on a reversed range.
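For reference, what actually happens: `contains` checks `start <= item && item < end`, so a reversed range contains nothing, consistent with it iterating as empty.

```rust
fn main() {
    // A normally ordered range behaves as expected...
    assert!((1..5).contains(&3));

    // ...but a reversed range contains nothing, because
    // start <= item can never hold together with item < end.
    assert!(!(5..1).contains(&3));

    // This at least stays consistent with iteration:
    assert_eq!((5..1).count(), 0);
}
```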