I would have said the XM-L efficiency starts to drop off around 1 to 1.4 amps. I expect no part of the plot is linear, so the best efficiency is at a low current.
You are correct: no part of the curve is linear, so purely in terms of efficiency (lumens/watt), a lower current is "better."
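To put rough numbers on that, here's a minimal Python sketch. The (current, lumens, forward voltage) triples are made-up round numbers shaped like an XM-L datasheet curve, not actual datasheet values - check Cree's datasheet before relying on them:

```python
# Rough illustration: lumens/watt falls as drive current rises.
# The (amps, lumens, volts) points below are assumptions loosely
# shaped like an XM-L curve, not real datasheet values.
points = [
    (0.35, 150, 2.80),
    (0.70, 280, 2.90),
    (1.50, 540, 3.10),
    (3.00, 950, 3.35),
]

for amps, lumens, volts in points:
    watts = amps * volts
    print(f"{amps:4.2f} A: {lumens:4d} lm / {watts:5.2f} W = {lumens / watts:5.1f} lm/W")
```

Running it shows efficiency falling from roughly 150 lm/W at 350mA to under 100 lm/W at 3A - the exact figures don't matter, the monotonic decline does.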
Perhaps the best operating point might be where the XM-L's lumens/watt drops to meet the next lower grade LED (XP-G) - but then at what current? Or does it ever get there?
Now you bring up several very real questions, which are more complicated than a simple lumens/watt comparison:
1) Is there overlap between XP-G and XM-L in terms of lumens/watt? No. So, in terms of that metric, regardless of current, the XM-L is better.
More importantly,
2) Once you have selected a particular model of LED, how do you look at this graph and choose an operating point, i.e. the current you will run the LEDs at? Unfortunately, this is a complicated question. Fortunately, even if you get it wrong, you won't be far off the mark as long as you stay within reasonable bounds, because the curve is fairly flat.

The bad news is that there are many more factors than meet the eye. You can't make the decision on the relative flux vs. current graph alone, because you're not operating the LED in an otherwise-consistent environment. As you increase current, you are almost assuredly increasing die temperature, so you need to factor in the relative flux vs. junction temperature graph as well (on the previous page of the datasheet). We don't really have a way to accurately measure junction temperature, but the implication is clear: efficiency drops off fast as junction temperature increases - in our case, probably faster than the drop from increasing current alone. So running at a high current is WORSE than you'd guess by looking only at the flux vs. current graph; the sketch below shows the combined effect.
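Here's a rough sketch of that combined effect. Every number in it is an assumption standing in for what you'd read off your own datasheet and measure on your own heatsink - the forward voltage, the junction-to-ambient thermal resistance, the linear ~0.3%-per-degree derating, and the relative-flux figures are all placeholders:

```python
# Sketch: why high current is worse than the flux/current graph alone
# suggests. More current -> more power -> hotter junction -> extra flux
# loss from the flux/temperature curve. All numbers are assumptions.

VF = 3.1          # forward voltage, volts (assumed roughly constant)
R_THERMAL = 8.0   # junction-to-ambient thermal resistance, degC/W (assumed)
T_AMBIENT = 30.0  # degC

def flux_vs_temp(t_junction):
    """Assumed linear derating: ~0.3% flux lost per degC above 25 degC."""
    return 1.0 - 0.003 * (t_junction - 25.0)

# (amps, relative flux read off a flux/current graph) - placeholder values
for amps, rel_flux_from_graph in [(0.7, 1.0), (1.5, 2.0), (3.0, 3.5)]:
    t_j = T_AMBIENT + amps * VF * R_THERMAL
    net = rel_flux_from_graph * flux_vs_temp(t_j)
    print(f"{amps} A: Tj ~ {t_j:.0f} degC, net relative flux ~ {net:.2f} "
          f"(the graph alone said {rel_flux_from_graph})")
```

With these assumptions, the 3A point delivers about 2.7x the 700mA flux instead of the 3.5x the current graph promises - and a worse heatsink makes the gap bigger.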
Add to this the fact that lumens/watt isn't the only factor in the decision - you must consider upfront cost, unless you're keeping the fixture for a very, very long time. Also, you need to consider design factors - is it "better" for your design to have lots of light from a single point, or less light from more points? I don't think there's a single answer to that question for everyone. Plus, are there drivers that can run the LED at the current you want, in a string length that's convenient? And so on.
To get back to the "good news" I mentioned above: if your head is spinning, the simple message is that in terms of operating efficiency - i.e. the money you spend to run the LED vs. the light you get from it - lower currents are always better than higher currents, regardless of which LED you're talking about or what power rating it has. Most of us have to factor upfront cost into the decision, though, since we can't simply buy a ton of LEDs and run them all at 100mA. Hence people typically reach a compromise point, running the LEDs at some current lower than their max (to get better efficiency) but still high enough to get good output per emitter. For older LEDs, e.g. the XR-E, this usually meant somewhere around 70% of max, which worked out to the traditional 700mA target current.
As we get LEDs with higher efficiencies and higher max currents, we have to decide: do we keep running them at 700mA, which (compared to the old fixtures) nets much higher efficiency and slightly lower upfront costs, or do we run them at around 70% of max, which nets slightly higher efficiency but much lower upfront costs? People are going to have to decide this on their own, based on how long they intend to keep the fixture - the rough payback sketch below is one way to frame it. Again, if you're going to keep it for longer than a year or two, it makes the most sense to weigh efficiency more heavily than upfront cost.
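If you want to put numbers on that decision, a payback sketch like this is one way to do it. Every input (target output, emitter price, lumen figures, electricity rate, photoperiod) is a made-up placeholder - substitute your own, since the break-even point swings heavily with your electric rate in particular:

```python
# Back-of-the-envelope payback comparison for the 700mA vs. ~70%-of-max
# decision. All inputs are placeholder assumptions - plug in your own.

TARGET_LUMENS = 10_000    # total light the fixture should produce
LED_PRICE = 5.00          # dollars per emitter (assumed)
KWH_PRICE = 0.25          # dollars per kWh (assumed)
HOURS_PER_DAY = 12        # photoperiod

scenarios = {
    # name: (lumens per emitter, lumens per watt) - assumed values
    "700 mA":     (280, 135),
    "70% of max": (700, 100),
}

results = {}
for name, (lm_per_led, lm_per_w) in scenarios.items():
    n_leds = -(-TARGET_LUMENS // lm_per_led)          # ceiling division
    upfront = n_leds * LED_PRICE
    watts = TARGET_LUMENS / lm_per_w
    yearly = watts / 1000 * HOURS_PER_DAY * 365 * KWH_PRICE
    results[name] = (upfront, yearly)
    print(f"{name:>10}: {n_leds} emitters, ${upfront:.0f} upfront, ${yearly:.0f}/yr to run")

# Years until the efficient (more-emitter) build pays for itself:
(up_a, yr_a), (up_b, yr_b) = results["700 mA"], results["70% of max"]
print(f"break-even: ~{(up_a - up_b) / (yr_b - yr_a):.1f} years")
```

With these particular guesses, the low-current build pays for itself in a few years; cheaper electricity or pricier emitters stretch that out, which is exactly why the "how long will you keep the fixture" question matters.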