
1500V DC OHL Electrification - More Robust?

Status
Not open for further replies.

kermit

Member
Joined
2 May 2011
Messages
592
Fascinating recent thread about 1500V DC test tracks at Crewe and Longsight, with some observations about the robust engineering that went into the Class 76 (and presumably Class 77) Manchester to Sheffield locos. So...compared with 25kV AC systems, with some locos and multiple units that had a propensity (for instance) to catch fire, and OHLE that seems to break quite regularly, was the whole Woodhead system "over-engineered" (i.e. reliable)? Were there failures in snow, for instance, which must have been a regular hazard up in the Pennines?
 

AM9

Veteran Member
Joined
13 May 2014
Messages
14,191
Location
St Albans
The UK 1500VDC OLE was built heavily because:
a) it was designed to deliver a higher current (at a lower voltage). For a given power, the current at 1500VDC would be over 16 times higher than 25kV.
b) the two main 1500VDC electrification schemes in the UK (Liverpool St to Shenfield and Manchester to Sheffield) were designed in the 1930s and built without any compensation for temperature change. This meant that the wires and supports had to be much heavier so that the change in wire tension could be constrained.
The modern 25kV equivalents of the original DC schemes are operated at far higher speeds and service frequencies, e.g. most of the trains on the Manchester to Sheffield line had a maximum speed of 65mph, whereas 100mph is typical of most 25kV lines and many operate trains at 125mph with 3-minute headways.
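The "over 16 times" figure can be checked with a line or two of arithmetic. A quick sketch in Python; the power figure is purely illustrative (the ratio is independent of it):

```python
# For a fixed traction power P, current I = P / V, so the ratio of
# currents between 1500 V DC and 25 kV AC is just the inverse voltage
# ratio, whatever power is chosen.
P = 4_000_000                 # watts; illustrative figure, not from the thread
I_dc = P / 1_500              # amps drawn at 1500 V DC
I_ac = P / 25_000             # amps drawn at 25 kV AC
print(f"1500 V DC: {I_dc:,.0f} A")       # 2,667 A
print(f"25 kV AC:  {I_ac:,.0f} A")       # 160 A
print(f"ratio:     {I_dc / I_ac:.1f}x")  # 16.7x
```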
 

edwin_m

Veteran Member
Joined
21 Apr 2013
Messages
24,793
Location
Nottingham
The DC system was well understood by the 1930s when the Woodhead scheme was designed. The 25kV system only appeared in the 1950s and included some new technology such as the rectifiers and (in the UK) the voltage changeover system. By the mid-1960s it all seemed to be working reliably although cost-cutting in design and/or maintenance on some of the later schemes has caused some problems since.
 

Springs Branch

Established Member
Joined
7 Nov 2013
Messages
1,418
Location
Where my keyboard has no £ key
The UK 1500VDC OLE was built heavily because:
a) it was designed to deliver a higher current (at a lower voltage). For a given power, the current at 1500VDC would be over 16 times higher than 25kV
Never really thought about it before, but is that the reason you always saw Class 76s running around with both their pantographs up, while the 25kV AC locos could manage with just one?

Imagine hauling a loaded MGR coal train uphill to Woodhead tunnel with only one panto; it would be glowing red-hot with the current passing through! (Or more likely it wouldn't be able to draw enough current to keep up speed.)
 

Bald Rick

Veteran Member
Joined
28 Sep 2010
Messages
29,070
Never really thought about it before, but is that the reason you always saw Class 76s running around with both their pantographs up, while the 25kV AC locos could manage with just one?

Yes, that was why.

Because of the high currents in the 1500vDC system, the overhead cables were thicker than is now standard for AC. There was also more of it e.g. the auxiliary wire in the compound catenary, but also more current carrying droppers and other connections to make sure the current could get to the contact wire without causing any hot spots (literally). Copper is very dense, and therefore with the added thickness and extra cables what was ‘in the air’ was roughly three times the weight of what is in place for typical AC electrification. Hence the need for much stouter gantries etc.

As @AM9 says, the DC kit was all fixed tension, and therefore suffered in really hot weather (above about 34C). Not a problem in Manchester and the Pennines, but very much a problem between Liverpool St and Shenfield. I remember in the very hot spell of August 2003 receiving a call from a colleague standing at Stratford saying ‘umm, the OLE is sagging quite a bit, indeed it’s resting on the roof of the train in Platform 10.’
 

matchmaker

Established Member
Joined
8 Mar 2009
Messages
1,499
Location
Central Scotland
The DC system was well understood by the 1930s when the Woodhead scheme was designed. The 25kV system only appeared in the 1950s and included some new technology such as the rectifiers and (in the UK) the voltage changeover system. By the mid-1960s it all seemed to be working reliably although cost-cutting in design and/or maintenance on some of the later schemes has caused some problems since.

Indeed. There were very serious problems with the transformers, rectifiers and voltage changeover equipment in the early days of 25kV, especially on the North Clyde lines. Link to the official report on the Railway Archive Accident Report
 

Taunton

Established Member
Joined
1 Aug 2013
Messages
10,018
Never really thought about it before, but is that the reason you always saw Class 76s running around with both their pantographs up, while the 25kV AC locos could manage with just one?

Imagine hauling a loaded MGR coal train uphill to Woodhead tunnel with only one panto; it would be glowing red-hot with the current passing through! (Or more likely it wouldn't be able to draw enough current to keep up speed.)
That doesn't necessarily follow. The French SNCF 200 mph-plus world speed records of the 1950s were done on 1,500v DC south of Bordeaux. Despite the huge power drawn, the locomotives only had one pantograph up. At maximum, when one collector strip started to disintegrate, due to the friction, the driver lowered one and raised the other.

Bear in mind that although the current is much higher, this is only part of the electrical equation; the voltage is correspondingly lower.

Indeed. There were very serious problems with the transformers, rectifiers and voltage changeover equipment in the early days of 25kV, especially on the North Clyde lines. Link to the official report on the Railway Archive Accident Report
It wasn't an inherent issue with 25kV. When the system was introduced, government policy, faithfully followed by BR, was to encourage the several then-existing heavy electrical firms (who later all merged into GEC), so as to give UK companies a substantial boost in the electrical export markets, so the orders were shared between them. As it was a new technology, each company followed its own, somewhat different, approach, with what were effectively prototypes being produced in volume straight off the drawing board. The serious problems which occurred, in both Glasgow and NE London, were confined to just one of the manufacturers; the others involved did not have these issues. Likewise with the initial 25kV locomotives, which were allocated to five separate designers/manufacturers, Classes 81-85, although they all shared the same body and looked identical. Some were better than others. As with other areas of modern traction, North British (Class 84) became the bottom feeder in this.

The official report, linked above, follows the government line, emphasising the assistance to industry while playing down the fact that only one of the manufacturers' designs caused the issues (though it's there if you read the fine detail). Such an official enquiry was driven by the government's desire not to let development of the systems, aimed at export markets, fall behind competing countries' manufacturers; though given that one (of several) of the on-train explosions actually caused passenger fatalities, it was indeed a serious technical failure.
 

O L Leigh

Established Member
Joined
20 Jan 2006
Messages
5,612
Location
In the cab with the paper
About 9 years ago I posted the following, which I reproduce here in full:

"Quite by chance I have a copy of the snappily titled Second Interim Report on the Accidents and Failures which occurred in Multiple Unit Electric Trains in the Scottish Region and Eastern Region British Railways.

"While this largely concentrates on the problems with the Eastern's GEC-equipped units (later Cl305) on the "NE London Suburban Electrified Lines" (or the West Anglia routes, as we call it today), it also makes mention of progress made towards modifications with Scotland's AEI-equipped units (later Cl303). This document is dated 30 May 1961 and refers to "accidents" on 13 and 17 December 1960, the last of which resulted in a withdrawal of the entire Glasgow suburban electric fleet, and to an interim report dated 13 January 1961 in which the causes of the failures were established and the work to be done to rectify them was outlined.

"While the traction equipment was different on each fleet there appears to have been a number of common issues, specifically when operating on the 6.25kV AC OLE. Both fleets suffered transient over-voltages that caused some similar and some different problems. As well as better ventilation to deal with problems with the mercury arc rectifiers and the need to fit better surge suppression to protect the motors, the Glasgow units required a full transformer re-build as it was found that "...excess current conditions had caused the collapse of the original transformer secondary windings".

"As at 3 May 1961, 68 AEI transformers had been rebuilt to the new specifications with 32 modified units having been delivered back from the Dukinfield Railway Works near Manchester for trial running and driver training. A test train comprising two modified units kitted out as a mobile laboratory had been run successfully over the nights of 21, 22 and 23 March 1961 and was planned to run for a total of 20000 miles. Each modified unit was also required to run for 2000 miles before being returned to service.

"In Glasgow there was little alternative but to return to steam operation, but on the NE London suburban lines the troublesome GEC units were relegated to the Chingford line and the Colchester-Clacton line where any failures would not cause massive knock-on delays, while the electric service was propped up by diverting English Electric units intended for the London, Tilbury & Southend electrification scheme and from the Colchester-Clacton line (probably a mix of Cl302's and Cl308's).

"Quite how this story ends I cannot say because I have not seen the promised final report. However, it appears that your visit must have coincided with this saga."

I was then linked to report linked to again above. In response I posted the following additional:

"I've just had a quick read and it is a bit of a "mind f**k" but, whether you understand the technical information or not, it does neatly illustrate the follies of attempting to design something in a laboratory.

"As the report says, all these EMUs coming into service on the Glasgow, Manchester and London Eastern Region suburban networks (including those ER units converted from 1500V DC operation) were effectively a huge experiment in rail electrification, as no other network had at that time attempted to create a dual high voltage system of AC overhead electrification. Therefore unreliability was bound to follow, especially on lines where trains would be expected to use both the dual voltages of 6.25kV and 25kV AC. Components were designed and built based on experimental parameters and did not sufficiently take into account the actual demands made on them in operation. Therefore failures occurred that could not have been predicted, the precise nature of which depended largely on the design and construction of individual components from contractors used to equip the traction for individual routes.

"However, it seems that the causes were fairly common, at least where they weren't down to poor component design. Pan bounce, which could lead to "chopping" of the Air Blast Breaker (we now use Vacuum Breakers which do not "chop"), and incorrect operation of the APC Voltage Selection Equipment which, under certain conditions, could occasionally malfunction and apply 25kV to the 6.25kV circuits imposed large transient over-voltages onto the electrical equipment, especially on the 6.25kV AC lines where they could reach voltages as high as 6 times the normal operational maximum as well as generating very high currents.

"On the AEI-equipped Glasgow units this led to overheating and distortion of the secondary windings due to electro-magnetic force which, in at least two instances, caused the transformer oil to gasify and resulted in an explosion in the guard's compartment. On the EE-equipped London North Eastern units these over-voltages took their toll on the tertiary circuits, blowing out battery chargers as well as traction motors instead of the transformers, which were of a more robust design. The LNE units did also suffer problems associated with the mercury arc rectifiers, some of which were down to "chopping" of the ABB and some due to the cooling arrangements for the anodes and cathodes.

"History shows that they got there in the end, although the removal of the 6.25kV sections helped enormously. The various upgrades and modifications to both the trains and the fixed lineside equipment ironed out all these problems, and lessons learned were applied to later generations of EMUs."

My reading of the situation is that the problem was not associated with the 25kV AC sections as such, but rather with the dual-voltage sections where the supply changed between 25kV AC and 6.25kV AC. Taken together with inadequate transformer protection on these early units, there was scope for some fairly large explosions.
 

edwin_m

Veteran Member
Joined
21 Apr 2013
Messages
24,793
Location
Nottingham
That doesn't necessarily follow. The French SNCF 200 mph-plus world speed records of the 1950s were done on 1,500v DC south of Bordeaux. Despite the huge power drawn, the locomotives only had one pantograph up. At maximum, when one collector strip started to disintegrate, due to the friction, the driver lowered one and raised the other.

Bear in mind that although the current is much higher, this is only part of the electrical equation; the voltage is correspondingly lower.
The voltage isn't relevant in this case. Voltage is always between two places, current flows through one place. The supply voltage is (loosely speaking) between the wire and the rails so most of it is between the pantograph head and the rail with only a small part between the head and the wire. That small part is (again simplifying somewhat) equal to the current passing through multiplied by the resistance of the interface, so the voltage relevant to this situation is actually more for a lower-voltage DC system. Power dissipated depends on the product of the voltage across whatever is dissipating the power and the current through it, so in this case will be proportional to the square of the current (the so called I squared R). So if a 1500V DC and a 25kV AC pantograph have the same interface resistance, the heat being generated electrically in the former will be around 278 times greater. In both cases this will be additional to the heat being generated by friction.
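The "around 278 times" figure is just the square of the voltage ratio; a one-line check in Python (the contact resistance cancels out of the ratio, so no value for it is needed):

```python
# Heating at the contact interface is I^2 * R. With the same contact
# resistance R in both cases, the ratio of heating is (I_dc / I_ac)^2,
# which for equal power equals the inverse voltage ratio squared.
ratio = (25_000 / 1_500) ** 2
print(round(ratio))  # 278
```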

It's a bit more complicated when the pantograph leaves the wire and arcing takes place. The heat generated in the arc still depends on the current but the propensity to arc at a particular separation depends on the voltage. However the heavier DC pantograph is more likely to separate from the wire, but the lighter AC wire is more likely to separate from the pantograph. And an arc is more likely to die out on an AC system when the oscillating voltage crosses zero.

Some of the DC trains that raise two pans on startup lower one of them once they get moving. I think this is because the critical factor is to limit the amount of heat being generated in a particular part of the wire. If there is a lag of a second before the train moves appreciably, then a second's worth of heat is being generated in one pan-width of wire. As the train moves faster but the current stays constant, the heat generated over a second is distributed over a longer piece of wire so the temperature rise in the wire is less.
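The last paragraph's argument can be sketched numerically. The current and contact resistance below are assumptions for illustration only, not figures from the thread:

```python
# At constant current, the power dissipated at the contact point is
# fixed, but a moving train spreads that heat along the wire, so the
# energy deposited per metre of wire falls as 1/speed.
I = 2_000.0                   # amps (assumed)
R_contact = 0.01              # ohms at the pan/wire interface (assumed)
P_contact = I ** 2 * R_contact        # 40,000 W dissipated at the interface
for v in (1, 10, 30):                 # train speed, metres per second
    print(f"{v:>2} m/s: {P_contact / v:,.0f} J per metre of wire")
```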

In the case of the French record breaker, it was a short formation that accelerated rapidly so would probably have got moving too quickly to have any heating effects, or they could have started with two pantographs raised anyway. For any high speed attempt they would want only one pan raised because the vibration in the wire from the first one would have caused very poor contact in the second.
 

36270k

Member
Joined
7 Jan 2015
Messages
210
Location
Trimley
The main weakness of the 1500V DC system over Woodhead was the 11kV or 33kV buried cable connecting the substations together, which would often burn out cable ends.
Strafford substation would be pushed to the limit by an MGR train with two 76s on the front and two on the rear (this was on the 1 in 40 gradient between Worsborough and Penistone).
 

Taunton

Established Member
Joined
1 Aug 2013
Messages
10,018
"In Glasgow there was little alternative but to return to steam operation, but on the NE London suburban lines the troublesome GEC units were relegated to the Chingford line and the Colchester-Clacton line where any failures would not cause massive knock-on delays, while the electric service was propped up by diverting English Electric units intended for the London, Tilbury & Southend electrification scheme and from the Colchester-Clacton line (probably a mix of Cl302's and Cl308's).

On the EE equipped London North Eastern units these over-voltages took their toll on the tertiary circuits, blowing out battery chargers as well as traction motors instead of the transformers, which were of a more robust design.
I think there's some confusion there between the (308) EE and (305) GEC-equipped units in NE London, otherwise identical-looking; as I understand it, the latter were the troublesome type. Relegating the GEC units to Chingford may have avoided "knock-on delays", but was principally because the line was wholly 6.25kV and the voltage changeover was not involved; likewise the whole of the Southend Victoria line, converted from DC, was at the lower voltage without a change point. The only changeovers were beyond Cheshunt, on the Lea Valley, and beyond Shenfield, on the main line. Everything inside that was 6.25kV. I don't know if the trains confined to 6.25kV actually had their changeover equipment disabled.

It seems whoever designed the voltage changeover equipment somehow believed it would operate utterly faultlessly and didn't need failsafe protection. And of course, the original intention to support export markets was quite lost with all of this, as no overseas railways were going to have these automatic AC changeovers.

Gerry Fiennes, in "I Tried to Run a Railway" (he must have been a senior manager through all this), wrote that whoever suggested changing over the totally reliable GE DC lines to AC with changeovers etc. "should be stoned as a false prophet". Which brings us back to the original question here!
 

O L Leigh

Established Member
Joined
20 Jan 2006
Messages
5,612
Location
In the cab with the paper
I think there's some confusion there between the (308) EE and (305) GEC-equipped units in NE London, otherwise identical-looking; as I understand it, the latter were the troublesome type.

To be entirely fair, you are picking me up on a 9 year old proofing error. I have simply reproduced what I posted at the time in full without alteration.

Relegating the GEC units to Chingford may have avoided "knock on delays", but was principally as the line was wholly 6.25Kv, and the voltage changeover was not involved; likewise the whole of the Southend Victoria line, converted from DC, was at the lower voltage without change point. The only changeovers were beyond Cheshunt, on the Lea Valley, and beyond Shenfield, on the main line. Everything inside that was 6.25Kv.

Yes, hence the temporary change of allocation. Having the GEC units kept to the one electrification system reduced the risk of further major failures as a consequence of incorrect voltage selection causing blockages on the main routes out of Liverpool Street. But it also prevented these units from causing any mischief even if they chose to go pop on a Chingford turn, whereas this would not necessarily have been the case if the train was heading to/from Enfield Town. The service pattern at that time meant that all routes north along the Lea Valley could still be accessed and any blockage caused by a GEC unit could be bypassed, limiting any disruption solely to the Chingford branch. Don't forget that these GEC units suffered reliability issues with the mercury arc rectifiers that were nothing to do with the voltage changeovers.

But then, all of this is somewhat off-topic. I simply reproduced my posts from all those years back to illustrate that early EMUs suffering major transformer failures was not as a consequence of the 25kV AC electrification equipment, as was suggested.

It seems whoever designed the voltage changeover equipment somehow believed it would operate utterly faultlessly and didn't need failsafe protection. And of course, the original intention to support export markets was quite lost with all of this, as no overseas railways were going to have these automatic AC changeovers.

The spectacular nature of these failures was not primarily to do with the faulty operation of the voltage changeover equipment but because no-one had thought to fit any failsafe protection to the transformers. My basic traction is 1980s BR EMUs (specifically Cl317) and these units have all sorts of failsafe protections to prevent damage to the transformer and any repeat of these spectacular failures. There was surge protection, primary and secondary overload protection, a Buchholz relay to detect gassing in the transformer oil, and a pressure release valve to vent the oil in the event of over-pressure; all of which would open the VCB on the roof automatically and isolate the train from the power supply. As far as I can tell, almost none of these failsafe protections were applied to the original fleet of EMUs being used across the network, but if they had I'm sure that we wouldn't be talking about exploding transformers.

Gerry Fiennes, in "I Tried to Run a Railway" (he must have been a senior manager through all this), wrote that whoever suggested changing over the totally reliable GE DC lines to AC with changeovers etc. "should be stoned as a false prophet". Which brings us back to the original question here!

With the greatest respect to Gerry, that sounds like hindsight. Yes there were technical issues connected with the voltage changeovers, but clearly not all classes of EMU were struggling with it. Clearly it should have all been done at 25kV AC OLE right from the outset, but you don't always know that something won't work until it's tried and/or tested.
 

AM9

Veteran Member
Joined
13 May 2014
Messages
14,191
Location
St Albans
With the greatest respect to Gerry, that sounds like hindsight. Yes there were technical issues connected with the voltage changeovers, but clearly not all classes of EMU were struggling with it. Clearly it should have all been done at 25kV AC OLE right from the outset, but you don't always know that something won't work until it's tried and/or tested.
Ironically, the classes 306 & 307 which were converted from the original 1500VDC 3-car sliding door stock and 4-car slam door stock, didn't suffer from the same issues as the GEC units and both carried on in full service until the early '80s and early '90s respectively, despite working through the 6.25/25kV changeover points regularly.
 

Bald Rick

Veteran Member
Joined
28 Sep 2010
Messages
29,070
With the greatest respect to Gerry, that sounds like hindsight. Yes there were technical issues connected with the voltage changeovers, but clearly not all classes of EMU were struggling with it.

Not just hindsight, but myopic hindsight at that.
 

36270k

Member
Joined
7 Jan 2015
Messages
210
Location
Trimley
The main problem with the 303 and 305 units was the mercury-arc rectifiers (later replaced by silicon diodes).
Faulty rectifiers would put pulses of dead short across the transformer secondary.
On the Glasgow units, this would cause heating and loosening of the secondary winding, causing shorts that would boil the cooling oil.
 

AM9

Veteran Member
Joined
13 May 2014
Messages
14,191
Location
St Albans
Fascinating recent thread about 1500V DC test tracks at Crewe and Longsight, with some observations about the robust engineering that went into the Class 76 (and presumably Class 77) Manchester to Sheffield locos. So...compared with 25kV AC systems, with some locos and multiple units that had a propensity (for instance) to catch fire, and OHLE that seems to break quite regularly, was the whole Woodhead system "over-engineered" (i.e. reliable)? Were there failures in snow, for instance, which must have been a regular hazard up in the Pennines?
So back on topic, 1500VDC is probably no more reliable than 25kV when taking a holistic view:
1500VDC positives:
slightly lower clearances required: not as much as might be imagined. In dry air, electricity can jump about 1mm per 3kV. Thus for 1500VDC the distance is under 1mm. For 25kV AC (rms) the equivalent distance is the peak voltage divided by 3kV per mm, i.e. 25000 × sqrt(2) / 3000 = under 12mm. Of course dry air is not that common in the UK, so allowance is made for airborne pollution/water, mainly on insulators. The major part of the clearance is because of movement, due partly to wind but mainly to oscillations from pantograph upward pressure and train movement.
Less equipment required on trains: at the most basic end of the scale, AC supplies require a step-down transformer and a rectifier set.
1500VDC negatives:
1500VDC requires 16 times the current that 25kV requires for the same traction power level. To put this into perspective, take a typical modern EMU, say an eight-car Class 700. This has a maximum traction power demand of over 3.3MW. Under a 25kV supply, that means drawing 132 amps on two pantographs. Under a 1500VDC supply the current would be 2,200 amps. This means that the whole supply system needs to be built with heavier OLE conductors, which require heavier catenary wires, heavier registration components and heavier masts/gantries. It doesn't stop there: even with the beefier OLE, the voltage drops are much greater, so to maintain adequate voltage at the pantograph the line needs to be fed more frequently, with DC supply points every 5-10 miles compared to the typical 25kV AC norm of every 40 miles. Given that getting feeds from the national grid at 132kV is a major cost driver in an electrification project, the cost of providing that on a mainline scheme every 5-10 miles would be prohibitive (and not that much cheaper if a high-voltage feed along the formation daisy-chained the DC feed points). There's also the fact that, as with third rail, providing protection against overloads is much less effective. The protection needs to discriminate between a peak load of one or more trains in a feed section, and a fault such as a short circuit or a tree in contact with the conductors. Failure to address the latter will destroy the OLE, or even involve surrounding infrastructure. False tripping will reduce the reliability of the train services.
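The Class 700 figures, plus the feeding-distance point, can be sketched in Python. The conductor resistance and the round feeding distances below are assumptions for illustration only (real OLE resistances differ, and DC schemes used much heavier conductors for exactly this reason):

```python
P = 3_300_000                # watts, eight-car Class 700 peak demand
I_ac = P / 25_000            # 132 A at 25 kV
I_dc = P / 1_500             # 2,200 A at 1500 V DC

R_per_km = 0.02              # ohms per km of OLE (assumed, illustrative)
for name, I, V, feed_km in (("25 kV AC", I_ac, 25_000, 40),
                            ("1500 V DC", I_dc, 1_500, 5)):
    drop = I * R_per_km * feed_km        # volts lost at the far end of a feed
    print(f"{name}: {I:,.0f} A, {drop:,.0f} V drop "
          f"({100 * drop / V:.1f}% of supply) over {feed_km} km")
```

Even with the DC feed eight times closer, the proportional voltage drop comes out far worse, which is the point being made about feed spacing.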
Here is a rather poor video of a South African metro train where the cable has a fault; as it gets hotter it sags and shorts against the roof of the train. That short-circuit current isn't much more than the maximum load the supply expects trains to draw, but it is enough to set the train roof on fire, repeatedly! With 25kV, there would be an initial arc and then the protection would kick in and shut the power down.


Apologies for the quality; there is a better-quality video somewhere but I can't locate it. That was on a 3kV DC system; the problem is worse on a 1.5kV DC installation.
 

O L Leigh

Established Member
Joined
20 Jan 2006
Messages
5,612
Location
In the cab with the paper
Ironically, the classes 306 & 307 which were converted from the original 1500VDC 3-car sliding door stock and 4-car slam door stock, didn't suffer from the same issues as the GEC units and both carried on in full service until the early '80s and early '90s respectively, despite working through the 6.25/25kV changeover points regularly.

I'm not sure that it is ironic, or even particularly telling. Bear in mind that, although these trains were indeed older, the equipment used to convert them from 1500V DC to 6.25kV/25kV AC operation was of the same generation as that being fitted to the newer trains being designed and built specifically for the purpose. Also, while these units will have strayed north of Shenfield towards Chelmsford and Colchester from time to time, their primary use was on the lower 6.25kV AC sections between Liverpool St and Shenfield (Cl306) and Liverpool St and Southend Vic (Cl307). Therefore they were not experiencing the same number of voltage changeovers as those units that routinely operated to destinations north of Shenfield or Cheshunt.

It's also worth remembering that these units (and others) falling outside the scope of this report does not mean that they were not suffering similar issues affecting their reliability. It just means that they weren't trashing their transformers or exploding.
 

AM9

Veteran Member
Joined
13 May 2014
Messages
14,191
Location
St Albans
I'm not sure that it is ironic, or even particularly telling. Bear in mind that, although these trains were indeed older, the equipment used to convert them from 1500V DC to 6.25kV/25kV AC operation was of the same generation as that being fitted to the newer trains being designed and built specifically for the purpose. Also, while these units will have strayed north of Shenfield towards Chelmsford and Colchester from time to time, their primary use was on the lower 6.25kV AC sections between Liverpool St and Shenfield (Cl306) and Liverpool St and Southend Vic (Cl307). Therefore they were not experiencing the same number of voltage changeovers as those units that routinely operated to destinations north of Shenfield or Cheshunt.

It's also worth remembering that the fact that these units (and others) were not caught under the scope of this report does not mean that they were not suffering similar issues affecting their reliability. It just means that they weren't trashing their transformers or exploding.
You could be right of course, but I used to commute on the GEML (after the initial problems), frequently on the 306s and occasionally on the 307s. The main problem that I remember with the 306s was the repeated failure to transition from series to parallel, with the clack-clack of the contactors and the jerking of the motor cars. That fault would have been an issue when they were operating on DC.
Many of those journeys were on 306s from Chelmsford to Ilford or Shenfield to Chelmsford, thereby passing through the Mountnessing changeover point. I was on a Saturday up class 306 when we were delayed after Brentwood, eventually crawling past a smouldering class 305 (No. 518) on the up fasts; that would have been in the early '70s.
 

Whistler40145

Established Member
Joined
30 Apr 2010
Messages
5,911
Location
Lancashire
IIRC a Class 83 AC electric loco was used at Longsight for voltage conversion with the 1500V DC EMUs after Reddish depot closed.
 

Taunton

Established Member
Joined
1 Aug 2013
Messages
10,018
The description above seems to minimise any AC issues and overstate any DC ones.

Fundamentally, until very recent times (much more recent than the stock we are discussing), power was obtained from the grid as AC at very high voltage, and finally fed to the motors as DC at a lower voltage. So along the way there needed to be a transformer and a rectifier. This substation can either be lineside, with DC wiring, or on the train, with AC wiring. So each AC electric locomotive is actually a substation. With emus you need one per unit, so a 12-car train has three substations. And because the public grid supply is not at 25kV, you still need lineside substations anyway.

It is fairly apparent that it is more straightforward to build substations in a lineside building than under the frames of numerous emu cars. Among other things you can provide replacement equipment, which happens reasonably often in an advancing technology, in the lineside building and then change over, rather than under the emu, which takes it out of service while you are doing so. Likewise, 1950s-60s rectifiers used the mercury-arc process, with a tank of mercury. It should surely be apparent that large mercury-filled tanks, which in an emu weigh several tons overall, are more practical in lineside buildings (where they performed flawlessly) than underneath a rolling rail vehicle, all the while needing to maintain the critical gap from the electrodes to the mercury liquid surface. Sure, it ought to work ... and the moment solid-state silicon rectifiers came along (1970?), replacement started. Incidentally, when were the AC unit rectifiers replaced? And were they all?

Yes, DC systems need more lineside substations. Is it the old classic of infrastructure providers sticking their expense across onto the rolling stock providers, something that has occurred (and apparently continues to) with a range of issues over time?

The other thing you can do is use AC motors. This is what the German/Swiss/Scandinavian railways have always done (commonly at 15kV), with a different approach to base frequency, not needing a train-borne rectifier at all. It would be interesting to discuss why BR in the late 1950s did not go for this long-established approach, rather than the then-new French development of 25kV.
 

edwin_m

Veteran Member
Joined
21 Apr 2013
Messages
24,793
Location
Nottingham
I'm not sure that it is ironic, or even particularly telling. Bear in mind that, although these trains were indeed older, the equipment used to convert them from 1500V DC to 6.25kV/25kV AC operation was of the same generation as that being fitted to the newer trains being designed and built specifically for the purpose. Also, while these units will have strayed north of Shenfield towards Chelmsford and Colchester from time to time, their primary use was on the lower 6.25kV AC sections between Liverpool St and Shenfield (Cl306) and Liverpool St and Southend Vic (Cl307). Therefore they were not experiencing the same number of voltage changeovers as those units that routinely operated to destinations north of Shenfield or Cheshunt.

It's also worth remembering that the fact that these units (and others) were not caught under the scope of this report does not mean that they were not suffering similar issues affecting their reliability. It just means that they weren't trashing their transformers or exploding.
Same generation but not the same complexity, as these units retained their DC control equipment so the transformer would only have had to cope with the dual supply voltage, without also having a tap changer to vary the secondary voltage going to the motors. However the transformer and rectifier were retrofitted, which is often a recipe for unreliability.

I've no idea whether the presence of the tap changer was a major contributor either to the explosion problems or to general unreliability.
 

edwin_m

Veteran Member
Joined
21 Apr 2013
Messages
24,793
Location
Nottingham
The description above seems to minimise any AC issues and overstate any DC ones.

Fundamentally, until very recent times (much more recent than the stock we are discussing), power was obtained from the grid as AC at very high voltage, and finally fed to the motors as DC at a lower voltage. So along the way there needed to be a transformer and a rectifier. This substation can either be lineside, with DC wiring, or on the train, with AC wiring. So each AC electric locomotive is actually a substation. With emus you need one per unit, so a 12-car train has three substations. And because the public grid supply is not at 25kV, you still need lineside substations anyway.

It is fairly apparent that it is more straightforward to build substations in a lineside building than under the frames of numerous emu cars. Among other things you can provide replacement equipment, which happens reasonably often in an advancing technology, in the lineside building and then change over, rather than under the emu, which takes it out of service while you are doing so. Likewise, 1950s-60s rectifiers used the mercury-arc process, with a tank of mercury. It should surely be apparent that large mercury-filled tanks, which in an emu weigh several tons overall, are more practical in lineside buildings (where they performed flawlessly) than underneath a rolling rail vehicle, all the while needing to maintain the critical gap from the electrodes to the mercury liquid surface. Sure, it ought to work ... and the moment solid-state silicon rectifiers came along (1970?), replacement started. Incidentally, when were the AC unit rectifiers replaced? And were they all?

Yes, DC systems need more lineside substations. Is it the old classic of infrastructure providers sticking their expense across onto the rolling stock providers, something that has occurred (and apparently continues to) with a range of issues over time?

The other thing you can do is use AC motors. This is what the German/Swiss/Scandinavian railways have always done (commonly at 15kV), with a different approach to base frequency, not needing a train-borne rectifier at all. It would be interesting to discuss why BR in the late 1950s did not go for this long-established approach, rather than the then-new French development of 25kV.
It's horses for courses. The shorter the distance of the operation, the better it suits low-voltage DC. That's universal for trams, with the added benefit that lower voltage reduces hazards in a shared environment. DC is probably still optimum for Metros, and may even have been so for the original LBSC inner-suburban electrifications which were similar in character. But certainly not the optimum system for runs to the coast or as far as Weymouth. At the opposite extreme there are/were even a couple of heavy-haul railways that went to 50kV AC. But since 25kV was invented in the early 1950s I believe every mixed-traffic railway starting from scratch with electrification has adopted that system. Many railways already using lower voltages have decided to go to 25kV for new electrification projects and some are even converting their existing electrified routes.

Yes, every train has to carry a "substation", but that means that if a "substation" fails it's only a question of repairing one unit rather than degrading service or even closing the whole route (I'd guess DC substations have to be spaced closer together in practice than is theoretically necessary because of that issue). If a long formation has three "substations" then if one fails the other two will be enough to get it home.

So countless railway administrations have asked the same question and got the same answer: that the energy losses and other inefficiencies of a DC system outweigh the penalty of extra rolling-stock weight and cost on a higher-voltage AC system.
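The scale of that trade-off is easy to sketch. Here's a rough Python illustration of why the losses argument goes the AC way; the 4 MW power figure and 0.1 Ω feed resistance are round numbers picked purely for illustration, not values from any real scheme.

```python
# Illustrative comparison of line current and resistive feed loss for the
# same train power at 1500 V DC and 25 kV AC. Both figures below are
# assumed round numbers, not data from any actual installation.

POWER_W = 4_000_000        # assumed train power demand: 4 MW
FEED_RESISTANCE_OHM = 0.1  # assumed total loop resistance of the feed

for label, volts in [("1500 V DC", 1_500), ("25 kV AC", 25_000)]:
    current = POWER_W / volts                  # I = P / V
    loss_w = current ** 2 * FEED_RESISTANCE_OHM  # P_loss = I^2 * R
    print(f"{label}: {current:,.0f} A, feed loss {loss_w / 1000:,.1f} kW")
```

Because loss scales with the square of the current, the same resistance costs roughly (25000/1500)² ≈ 278 times more power at 1500V DC than at 25kV, which is why DC needs much heavier conductors and closer substations to stay efficient.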

Germanium diodes were first used on AC traction in the mid-60s I believe, and in view of their obvious advantages I think semiconductors replaced mercury rectifiers pretty quickly.
 

Revaulx

Member
Joined
17 Sep 2019
Messages
476
Location
Saddleworth
Did all the early AC locos and units use Mercury Arc rectifiers? The 84s certainly did and were a disaster; the 83s had them and also spent a long period out of service, despite being built by the normally dependable English Electric.

The AM4s might have been a bit rubbish, but I don't recall them ever being unreliable, let alone blowing up.

It can't just have been the 25/6.25 changeovers that caused the problems, as I don't think the 83s and 84s ever ran under anything but 25kV.
 

Taunton

Established Member
Joined
1 Aug 2013
Messages
10,018
DC is probably still optimum for Metros, and may even have been so for the original LBSC inner-suburban electrifications which were similar in character. But certainly not the optimum system for runs to the coast or as far as Weymouth.
Well there are plenty of systems around with substantial DC operation. Down the length of Italy, across the deserts in South Africa. And then there's the Trans-Siberian in Russia. A bit longer than a trip to Weymouth ...

I agree, it's not a simple yes/no - it's horses for courses.
 

apk55

Member
Joined
7 Jul 2011
Messages
438
Location
Altrincham
Any system can be made reliable if enough effort is spent on getting things right. It may take time with a new system to find out all the problems but most are solvable with experience.

Another problem with DC systems is the return voltage drop. Whatever current goes down the overhead comes back along the rails. This introduces a voltage drop in the rails which must be kept to a few volts for safety reasons, as well as causing corrosion problems. Bonds between rails must be much heavier than with an AC system. Another problem is signalling: heavy impedance bonds are required to separate track-circuit signalling currents from traction currents. In contrast, on AC systems the return voltage drop is lower because of the lower current, and there are techniques available that reduce the voltage drop further, such as booster-transformer and autotransformer feeds. It is also common to use only one rail for traction current and use the other rail for signalling.

Most AC supply systems are also a lot stiffer, with proportionally less voltage drop from substation to train than a DC system. In DC systems voltage drops of 20% are not unknown, but in AC systems it is rarely more than 10%. An AC system is therefore normally more efficient.
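As a rough sketch of where those voltage-drop percentages come from, here is an illustrative Python calculation; the resistances per km, substation distances and the 2 MW load are invented round numbers chosen only to show the shape of the comparison, not measurements from any real installation.

```python
# Rough sketch of why low-voltage DC sees much larger percentage voltage
# drops. All figures (ohms per km, distance to substation, train power)
# are assumed for illustration only.

def drop_percent(supply_v, current_a, ohms_per_km, km_to_substation):
    """Percentage voltage drop over a simple resistive feed."""
    drop_v = current_a * ohms_per_km * km_to_substation
    return 100 * drop_v / supply_v

POWER_W = 2_000_000  # same assumed 2 MW train in both cases

# DC: heavy current even with close substations and a low-resistance feed.
dc = drop_percent(1_500, POWER_W / 1_500, ohms_per_km=0.05, km_to_substation=2)
# AC: light current tolerates a higher-impedance feed and wide spacing.
ac = drop_percent(25_000, POWER_W / 25_000, ohms_per_km=0.15, km_to_substation=10)

print(f"DC drop: {dc:.1f}%   AC drop: {ac:.1f}%")
```

Even with the DC substation five times closer and a feed with a third of the impedance per km, the percentage drop at the train ends up an order of magnitude worse, which matches the 20% vs 10% figures quoted above in spirit.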
 

Richard Scott

Established Member
Joined
13 Dec 2018
Messages
3,673
The other thing you can do is use AC motors. This is what the German/Swiss/Scandinavian railways have always done (commonly at 15Kv), with a different approach to base frequency, not needing a train-borne rectifier at all. It would be interesting to discuss why BR in the late 1950s did not go for this long-established approach, rather than a new-at-the-time French development of 25Kv.
It was my understanding that the German/Swiss etc. railways used DC motors; this was possible because when the current changed direction it did so through both armature and field, hence the motor still rotated the same way. The reason for the 16 2/3 Hz supply was to reduce arcing on the commutator, which would have been a serious problem at 50Hz. Can anyone clarify?
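For what it's worth, the series-motor argument can be sketched in a couple of lines: with the field winding in series with the armature, the flux reverses together with the armature current, so the torque (proportional to flux times armature current, i.e. to the square of the current) never changes sign. An idealised sketch, ignoring saturation and commutation effects, with an arbitrary constant rather than any real machine parameter:

```python
# Idealised series-wound (universal) motor: field flux is proportional to
# the same current that flows through the armature, so torque ~ k * I * I.
# Reversing the current therefore leaves the torque direction unchanged.
# k is an arbitrary illustrative constant, not a real machine parameter.

def series_motor_torque(current, k=1.0):
    flux = k * current      # series field: flux follows the line current
    return flux * current   # torque = flux * armature current ~ I^2

print(series_motor_torque(10.0), series_motor_torque(-10.0))  # prints 100.0 100.0
```

Both half-cycles of an AC supply produce torque in the same direction, which is why a "DC-type" commutator motor can run on AC at all; the low supply frequency then limits the commutator arcing, as the post says.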
 

O L Leigh

Established Member
Joined
20 Jan 2006
Messages
5,612
Location
In the cab with the paper
You could be right of course, but I used to commute on the GEML (after the initial problems), frequently on the 306s and occasionally on the 307s. The main problem that I remember with the 306s was the repeated failure to transition from series to parallel, with the clack-clack of the contactors and the jerking of the motor cars. That fault would have been an issue when they were operating on DC.
Many of those journeys were on 306s from Chelmsford to Ilford or Shenfield to Chelmsford, thereby passing through the Mountnessing changeover point. I was on a Saturday up class 306 when we were delayed after Brentwood, eventually crawling past a smouldering class 305 (No. 518) on the up fasts; that would have been in the early '70s.

I'll just retract my original assertion about the converted units not blowing up their transformers, as I have now spent the afternoon reading the final report into the various misadventures of these early EMUs.

It seems that there were five major transformer failures affecting these particular units shortly after conversion to AC operation, caused by the automatic power control (APC) voltage changeover switch not operating and the 6.25kV AC circuits being exposed to 25kV AC. The first four resulted in explosions (the fourth also caused a fire); the fifth was not so destructive only because the air-blast breaker (ABB) opened in time. Those first four were so destructive that they set up a sustained short circuit to the rectifier frame which was only dealt with once the line breakers had operated, but by that time the transformers were toast.

1950s-60s rectifiers used the Mercury Arc process, with a tank of mercury. This should surely be apparent that large mercury-filled tanks, which overall in an emu weigh several tons, are more practical in lineside buildings (where they performed flawlessly) rather than underneath a rolling rail vehicle, all the while needing to maintain the critical gap from the electrodes to the mercury liquid surface. Sure, it ought to work ... and the moment solid state silicon rectifiers came along (1970?), they started replacement. Incidentally, when were the AC unit rectifiers replaced? And were they all?

As you say, mercury-arc rectifiers were the state of the art at the time, but technology was moving forwards quickly. The problems connected with their use in railway traction were not the amount of mercury required nor its sloshing around, but how to maintain the correct temperature differentials between anode and cathode, and what happened if you failed to keep them all excited when there were interruptions to the power supply. They were certainly finicky things to set up and operate and could easily cause faults which would affect reliability, particularly if you lost excitation.

For the Cl303s this was achieved through modification of the cooling arrangements and they proved to work tolerably well, but the GEC Cl305s were a harder nut to crack and the final report recommends that they be quickly changed to silicon rectifiers. The Cl302s which also had mercury arc rectifiers seemed to have worked fine straight out of the box with the only explosion being a capacitor divider on the APC equipment which blew the mounting plate into the guard's compartment below and injured the guard. However, this was found to be a one-off as no repeats occurred (although they did fit stronger mounting plates together with an explosion diaphragm).

Same generation but not the same complexity, as these units retained their DC control equipment so the transformer would only have had to cope with the dual supply voltage, without also having a tap changer to vary the secondary voltage going to the motors. However the transformer and rectifier were retrofitted, which is often a recipe for unreliability.

I've no idea whether the presence of the tap changer was a major contributor either to the explosion problems or to general unreliability.

No bearing at all. Tap-changers were damaged due to arcs inside the transformer, but they were not the cause.

Did all the early AC locos and units use Mercury Arc rectifiers? The 84s certainly did and were a disaster; the 83s had them and also spent a long period out of service, despite being built by the normally dependable English Electric.

The AM4s might have been a bit rubbish, but I don't recall them ever being unreliable, let alone blowing up.

It can't just have been the 25/6.25 changeovers that caused the problems, as I don't think the 83s and 84s ever ran under anything but 25kV.

AM4s, together with the AM8s and the converted AM6s and AM7s, never had mercury arc rectifiers. The AM4s, AM6s and AM7s were fitted with germanium rectifiers while the AM8s and subsequent EMU builds had silicon diode rectifiers. Although built for dual-voltage operation, all the electrification along the WCML through the Midlands and North-West was energised at 25kV AC, so the only time that AM4s ever operated on 6.25kV was when seconded to the Eastern Region to cover for the side-lined GEC AM5s.

The voltage changeovers were largely responsible for setting the environment in which these failures could occur, and some were directly to blame. The design of the modified APC voltage changeover switch meant that it would only operate if the ABB was open (or the pantograph lowered), as it should do automatically at a neutral section. For various reasons this didn't always happen, and the five major transformer failures affecting the converted DC units I mentioned above were all directly attributable to this. All five of these units had just passed the changeover point at Shenfield and, for various reasons, the ABB failed to open, meaning that the APC voltage changeover switch didn't operate and they entered the 25kV AC section still set for 6.25kV AC. In these incidents almost every type of dual-voltage unit suffered a failure of some sort resulting in the 6.25kV AC circuits being exposed to four times the designed voltage, but not all suffered catastrophic damage.

The reason for the modification to the APC voltage changeover switch was that it could be hoodwinked into selecting the wrong setting by induced currents or by fault conditions in the OLE. The investigations in Glasgow with the AEI-equipped Cl303s identified a number of incidents where this happened. Experiments showed that an unearthed but otherwise dead section of OLE next to an energised line at 25kV AC would have in it an induced voltage of 5kV, which was well within the range that the APC voltage changeover switch would be looking for to identify a line at 6.25kV AC. Also, if there was an OLE fault, the falling voltage gradient from the end of the section towards the fault could cause the train to misidentify the supply voltage. In each case when the power was restored the train would be exposed to 25kV AC while still set for 6.25kV AC. Normally this would cause the Primary Overload Relay to trip and open the ABB, but by that time the transformer would have already been exposed to strong electromagnetic forces which could cause shifting of the windings, abrasion of the insulation and even arcs caused by short circuits.
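To illustrate the misidentification being described (with invented numbers — the real relay thresholds aren't given here), imagine the changeover switch treating any line voltage within some band around 6.25kV as the low-voltage system:

```python
# Hypothetical sketch of the APC misidentification described above.
# The detection band (4-10 kV read as "a 6.25 kV line") is invented purely
# for illustration; the actual relay thresholds are not stated in the thread.

def classify_line_voltage(kv):
    if 4.0 <= kv < 10.0:
        return "6.25 kV"   # assume the low-voltage system
    if kv >= 10.0:
        return "25 kV"
    return "dead"

print(classify_line_voltage(5.0))   # induced voltage on a dead, unearthed section
print(classify_line_voltage(25.0))  # the genuinely live adjacent line
```

On that logic an induced 5kV on a dead-but-unearthed section is indistinguishable from a genuine 6.25kV supply, so when full 25kV power returns the equipment is still set for the lower voltage.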

Admittedly, some of the problems were caused not by the voltage changeover but by running on the 6.25kV AC system itself. Not being an electrical engineer I can't explain the reason, but fault conditions caused by things like pan bounce, ABB "chopping" and the associated knock-on effect on the mercury-arc rectifiers produced transient over-voltages that were four times higher on 6.25kV AC than they would be on 25kV AC.
 
Last edited:

O L Leigh

Established Member
Joined
20 Jan 2006
Messages
5,612
Location
In the cab with the paper
Not just hindsight, but myopic hindsight at that.

I'm most of the way through Simon Bradley's excellent The Railways: Nation, Network & People and he makes mention of the former General Manager of BR's Eastern Region, Gerard Twisleton-Wykeham-Fiennes, noting that he was sacked for being "...excessively candid about the shortcomings of railway management...". He may have had some good ideas, but he does seem to have been a "my way or the highway" kind of chap.
 

43096

On Moderation
Joined
23 Nov 2015
Messages
15,162
I'm most of the way through Simon Bradley's excellent The Railways: Nation, Network & People and he makes mention of the former General Manager of BR's Eastern Region, Gerard Twisleton-Wykeham-Fiennes, noting that he was sacked for being "...excessively candid about the shortcomings of railway management...". He may have had some good ideas, but he does seem to have been a "my way or the highway" kind of chap.
He was sacked after extracts from his autobiography (https://www.amazon.co.uk/Tried-Run-Railway-Gerard-Fiennes/dp/1784977365) were published in Modern Railways and came to the attention of the BRB.
 

Taunton

Established Member
Joined
1 Aug 2013
Messages
10,018
I'm most of the way through Simon Bradley's excellent The Railways: Nation, Network & People and he makes mention of the former General Manager of BR's Eastern Region, Gerard Twisleton-Wykeham-Fiennes, noting that he was sacked for being "...excessively candid about the shortcomings of railway management...". He may have had some good ideas, but he does seem to have been a "my way or the highway" kind of chap.
He was sacked for two lines in his book, describing how they put the plan together, as instructed, to merge the ER and NER in about March of the year, saying that they could do it from the following January. They had done all the work, all the Ministry had to do now was say yes. They then waited. It took the Minister until about December 10th to do this simple task, adding that they still expected everyone to be in place and making the savings from January 1. Fiennes made a comment ...

He had written candid but fascinating articles in Modern Railways for years, right back to Trains Illustrated days, a number with criticism of his own role in things. But you mustn't write one single word which criticises the Minister (Barbara Castle, but more likely the Permanent Secretary was the timewaster) in any way. So the word was passed down from Babs.

Regarding good ideas, he had many that Beeching authorised and were implemented. Merry-go-Round coal; Freightliner; Inter-City high speed frequent expresses (rather than heavy once a day ones) etc. All his. Thought up in his bath apparently.

Now, back to emus. One noticeable thing is that despite all these happenings, no emu vehicles were ever written off because of the failures, they were all repairable. More than can be said for fires on the LMR AC locomotives, where several were damaged beyond repair.
 
Last edited:
Status
Not open for further replies.

Top