So this means it's much easier than it used to be to connect rail electrification to lower voltage electricity industry supplies, rather than to the HV 3-phase supply (which was needed for efficiency on return currents). But that kit costs a fair amount - it's much more sophisticated than the basic transformer and switching station that most rail systems have.
Whilst the converter station is substantially more sophisticated and expensive, that is not really reflective of whole-system costs.
Traditional-topology 25kV installations are awful loads for the grid: poor power factor with wildly varying and huge phase imbalances. It gets even worse because phase imbalance mitigation means substations cannot be operated in parallel most of the time, so loads jump between adjacent substations and between phases in a manner that is hard for the grid operator to predict in advance or react to.
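To put a number on that imbalance: a classic single-phase 25kV feeder draws its load across just two grid phases, and the symmetrical components then work out to 100% negative-sequence content - as unbalanced as a load can get. A minimal sketch (the 300 A figure is purely illustrative):

```python
import cmath, math

def seq_components(ia, ib, ic):
    """Symmetrical components: (zero, positive, negative) sequence currents."""
    a = cmath.exp(2j * math.pi / 3)  # 120 degree rotation operator
    i0 = (ia + ib + ic) / 3
    i1 = (ia + a * ib + a * a * ic) / 3
    i2 = (ia + a * a * ib + a * ic) / 3
    return i0, i1, i2

# A traditional 25 kV substation loads two grid phases, e.g. A-B:
# current I into phase A, -I out of phase B, nothing in phase C.
train = 300.0  # amps, illustrative figure only
i0, i1, i2 = seq_components(train, -train, 0.0)

print(f"negative/positive sequence ratio: {abs(i2) / abs(i1):.0%}")  # 100%
```

That negative-sequence current is what forces the phase rotation between adjacent substations, which in turn is why they cannot run in parallel.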
This can all be mitigated through grid reinforcement work and the like, and for decades there was no other option - but it is certainly not cheap.
As a result, in recent years, as the cost of power electronics has fallen, we have seen a trend towards non-traditional 25kV topologies. The dominant one has become AC-DC-AC full power converters, where the AC supply is turned into medium voltage DC, before being inverted for feed to the traction supply system. The grid is then essentially insulated from whatever happens on the rail side, and in theory this will allow all 25kV substations to operate in parallel and make phase breaks a thing of the past.
The cost of the substation is higher but the system cost is far lower.
Also, can HSTEd expand on which advantages of 25kV are being eroded here? It's a pretty good system, with loads of suppliers and experience (and reputed losses of c.2-3%). 25kV can also be taken from lower voltage supplies (e.g. 66kV in the UK) using power electronics, provided the local distributors are happy to take the extra load. Thing is, near the larger cities (typically where the trains are), they mostly are not!
Ultimately, by the time you reach 9kV, losses have reached levels comparable to the 25kV system - at that point losses are driven by things other than the current flow in the trackside conductors.
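As a rough sanity check on that claim: conductor I²R loss for a given power draw scales with 1/V², so it collapses quickly as line voltage rises. A back-of-envelope sketch - the train power and effective feed resistance below are assumed round numbers, not real system data:

```python
# I^2 R loss fraction for a train drawing power P at line voltage V
# through an effective feed resistance R. All figures are illustrative
# assumptions, not measured system data.
def line_loss_fraction(p_w, v, r_ohm):
    i = p_w / v                    # line current drawn by the train
    return i * i * r_ohm / p_w     # loss as a fraction of delivered power

P = 6e6   # 6 MW train (assumption)
R = 0.15  # effective ohms of overhead line + return path (assumption)
for v in (3_000, 9_000, 25_000):
    print(f"{v // 1000:>2} kV: {line_loss_fraction(P, v, R):.1%}")
```

Under those illustrative numbers, legacy 3kV DC is hopeless at around 10%, but the jump from 9kV to 25kV only buys you about a percentage point - small next to the couple of percent lost in transformers and converters, which is the point: by 9kV the trackside conductors are no longer where the losses live.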
And if you are going to use power electronics to convert grid AC power into medium voltage DC, then back into highish voltage AC for transmission to the locomotive, where it will be converted back into medium voltage DC to feed the traction system - why bother with the conversion back to AC and then back to DC again?
You can just feed the medium voltage DC direct to the locomotive, avoiding two sets of converters with their associated losses and cost, and operate a conceptually far simpler system with no need for a sophisticated 25kV traction pack in the train. With 25kV AC you either build a PETT (power electronic traction transformer) with a 25kV-rated converter into the train, or you lug a huge line-frequency transformer around - with 9kV DC you need neither, and it's even practical to build your traction pack out of line-connected MOSFETs.
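The stage-counting argument above can be made concrete. Assuming a round 98.5% efficiency per power-electronic conversion stage (an illustrative figure, not vendor data), the chain up to the train's DC link compares like this:

```python
# Compare conversion chains feeding the same medium-voltage DC link on
# the train. 98.5% per stage is an assumed round number, not real data.
eta_stage = 0.985

# AC-DC-AC 25 kV route: grid AC -> MVDC -> 25 kV AC at the substation,
# then 25 kV AC -> MVDC again on board. Three power-electronic stages.
eta_25kv = eta_stage ** 3

# 9 kV DC route: grid AC -> 9 kV DC at the substation, fed straight to
# the train's DC link. One stage - the two AC round-trips are skipped.
eta_9kv = eta_stage ** 1

print(f"25 kV AC chain: {eta_25kv:.1%}")
print(f"9 kV DC chain:  {eta_9kv:.1%}")
```

A few percent either way per stage, but the direction is the same: every conversion you skip is efficiency and capital cost recovered.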
The traditional advantages of 25kV were high power delivery and low losses.
9kV approaches or matches 25kV in those categories whilst having other advantages relating to a simpler power converter system and simpler control (no need for reactive power nonsense, DC is DC).
It is likely competitive with 25kV for green-field installations, and blows it out of the water economically on DC retrofit, due to the bonding and other problems with conversion to AC.