Long before any use of LLMs, OsmAnd would direct you, if you were driving past Palo Alto, to take a congested offramp to the onramp that faced it across the street. There is no earthly reason to do that; just staying on the freeway is faster and safer.
So it's not obvious to me that patently crazy directions must come from watching people's behavior. Something else is going on.
In Australia the routes seem to be overly influenced by truck drivers, at least out of the cities. Maps will recommend you take some odd town bypass when just going down Main Street is easier.
I imagine what you saw is the result of other frequent road users making choices that get ranked higher.
> if you were driving past Palo Alto, to take a congested offramp to the onramp that faced it across the street
If you're talking about that left turn into Alma with the long wait instead of going into the Stanford roundabout and then the overpass, it still does that.
I've seen this type of thing with OsmAnd too. My hypothesis is that someone messed up when drawing the map, and made the offramp an extension of the highway. But I haven't actually verified this.
I'm not talking about use of traffic data. In the abstract, assuming you are the only person in the world who owns a car, that route would be a very bad recommendation. Safety concerns would be lower, but still, there's no reason you'd ever do that.
Safety concerns would probably actually be higher, since the most dangerous places on roads are areas where traffic crosses and conflicts (in this case, the road you cross to get from the offramp to the onramp).