
Signal failure in the Croydon area (18/12)


Mcr Warrior

Veteran Member
Joined
8 Jan 2009
Messages
11,861
Victoria to Horsham via Sutton.
Horsham to Three Bridges
Three Bridges to Brighton

Might be quicker to use a bicycle.

That roundabout routing was offered about 15 years ago during a period of planned engineering work on the Brighton Main Line north of Gatwick Airport between Christmas and the New Year.

Journey times were significantly extended (over two and a half hours between London and Brighton, if I recall rightly!).
 

mirodo

Member
Joined
7 Nov 2011
Messages
644
Not entirely sure why the Sutton route had all its trains cancelled as well. As far as I know it was not within the affected area where everything was stopped, yet all of the services on that route were halted as well?

East Croydon, Selhurst and Norwood Junction areas were all affected. There’s a map showing exactly where was affected in the link in the post AlexNL made just before yours.
 

Jona26

Member
Joined
2 Jan 2013
Messages
273
Location
West Sussex
There is still a huge flatbed-mounted generator parked up at Gatwick on the A23, where the South Terminal building crosses the road, with the road down to one lane in the southbound direction.
 

London Trains

Member
Joined
9 Oct 2017
Messages
912
East Croydon, Selhurst and Norwood Junction areas were all affected. There’s a map showing exactly where was affected in the link in the post AlexNL made just before yours.

That diagram proves my point: the Victoria to Horsham/Dorking/Epsom services should not have been so affected by this, since they did not come anywhere near the failure area; they turn off the main line after Balham, well before Norbury. I understand some are formed off other stock and there may have been delays caused by the failure, but the service on that line was crippled, with trains being halted at stations mid-journey (just like the trains in the failure area) when they could have completed their journeys fine. The line from Epsom to Dorking has also had the total withdrawal of SWR services due to the strike and landslip, providing another reason why they should have run at least some trains this way.
 

Sunset route

Established Member
Joined
27 Oct 2015
Messages
1,189
That diagram proves my point: the Victoria to Horsham/Dorking/Epsom services should not have been so affected by this, since they did not come anywhere near the failure area; they turn off the main line after Balham, well before Norbury. I understand some are formed off other stock and there may have been delays caused by the failure, but the service on that line was crippled, with trains being halted at stations mid-journey (just like the trains in the failure area) when they could have completed their journeys fine. The line from Epsom to Dorking has also had the total withdrawal of SWR services due to the strike and landslip, providing another reason why they should have run at least some trains this way.

But in the detail of the NR text it says:


“Three areas of signal control shut themselves down:

  • East Croydon
  • Selhurst
  • Norwood Junction
The signalling power supply at Streatham also shut itself down.”

So that rules out the Streatham Junctions and the route via Mitcham Junction for a time at least.
 

flitwickbeds

Member
Joined
19 Apr 2017
Messages
529
My wife was waiting at London Bridge for the 1641 to Bedford when the power surge happened. Her train hadn't quite got to East Croydon - Traksy showed it being the next train waiting to enter ECR. She decided to head to King's Cross to see if she could get to somewhere like Arlesey instead, but all of those trains also needed to come through the Thameslink core so were also all delayed or cancelled. She then ran over to Euston and got on a Bletchley-bound train around 1750, finally arriving home (after a taxi from Millbrook) just before 8pm.

My journey home was also interesting. I left work around 7.15 and headed to Farringdon. There was barely anyone there and all the screens were showing the holding message "DISRUPTION! Screens will only show trains we know are running" - but with zero trains listed.

I looked on the National Rail app and found a northbound train that was 45 minutes away, and a southbound train that was 5 minutes away. I got on the southbound to Blackfriars (nicer place to wait than Farringdon!) but by that time the northbound train had been cancelled, and the next one according to the platform screens (which weren't in disruption mode) and NR app was something like 50 minutes away.

Out of interest I looked on RTT and could see a VSTP train slowly - VERY slowly! - making its way into London Bridge with stops listed to Bedford.

Every few minutes the announcer on the platform was apologising and specifically saying that the boards were accurate and the next train was at least 40 minutes away. Meanwhile this train on RTT was still crawling its way into LBG.

It then arrived into LBG and sat there for a few minutes. As it was 107 late I thought maybe they were going to turn it around and send it back southbound and that the NR app and screens were correct and RTT was wrong. But then it showed as passing some more signals and junctions between LBG and BFF and I did a little whoop to myself when I saw its lights coming round the curve into the station! The announcer was totally stumped and said something like "ladies and gentlemen, I have no idea where this train has come from or where it's going to or stopping at" to which the driver announced it was all stops to STP, then SAC and all stops to BDM. Result!

At Farringdon the screens were still showing the disruption message and their announcer was equally taken aback by the sudden presence of a train. At that point the screens changed to "this train is not for public use".

When I got to Flitwick (105L I believe) the train wasn't shown on the departure board either.

My question is: how can what is one of the most technologically up-to-date sections of railway (the Thameslink core) not know about a train about to pass through it? And how can the control team seemingly not know about a journey taking more than an hour and update the departure screens accordingly?

The train itself was empty in my carriage, maybe 10 people total. There would have been hundreds of spare seats, let alone standing room, for the thousands of people displaced that evening who were constantly being told there were no trains.
 

Bald Rick

Veteran Member
Joined
28 Sep 2010
Messages
29,218
Where are the UPS in the system? I'm not aware of any powering relay interlockings, and even if you backed up the interlocking, unless the main 650v supply feeding the lineside kit is also backed up, there'd be no external kit (signals/points/track circuits) working anyway. Would take a seriously big UPS to do all that.

Certainly one UPS went offline, but as you say it's not in front of the Relay Interlockings.
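For a sense of scale on the "seriously big UPS" point, here is a rough, entirely hypothetical back-of-envelope sketch in Python of what backing up a whole signalling power feeder might involve. The load and autonomy figures are illustrative assumptions, not Network Rail data.

```python
# Rough, hypothetical sizing of a UPS that would back up an entire
# signalling feeder (interlocking plus lineside signals, points and
# track circuits). All numbers below are illustrative assumptions.

ASSUMED_FEEDER_LOAD_KW = 80.0      # assumed continuous load on a 650V signalling feeder
ASSUMED_AUTONOMY_HOURS = 2.0       # assumed ride-through time wanted
INVERTER_EFFICIENCY = 0.9          # typical inverter efficiency
BATTERY_DEPTH_OF_DISCHARGE = 0.8   # usable fraction of battery capacity

energy_needed_kwh = ASSUMED_FEEDER_LOAD_KW * ASSUMED_AUTONOMY_HOURS
battery_kwh = energy_needed_kwh / (INVERTER_EFFICIENCY * BATTERY_DEPTH_OF_DISCHARGE)
inverter_kva = ASSUMED_FEEDER_LOAD_KW / INVERTER_EFFICIENCY

print(f"Battery capacity needed: ~{battery_kwh:.0f} kWh")
print(f"Inverter rating needed:  ~{inverter_kva:.0f} kVA")
# With these assumptions: roughly 220 kWh of battery and a ~90 kVA inverter
# per feeder, i.e. container-sized plant - a "seriously big UPS" indeed.
```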
 

aleggatta

Member
Joined
28 Sep 2015
Messages
545
It appears to me that an over-volt protection trip needs to be placed on the incoming supplies (with relevant resilience to transient spikes), to trigger a failover to the redundant supplies when an over-volt occurs on one supply (although that relies on only one supply over-volting at a time!)
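As a very rough illustration of the logic being proposed (not any real Network Rail scheme), here is a minimal Python sketch of an over-volt trip with a transient-spike delay and failover to a standby supply. The voltage limits and timings are made-up assumptions.

```python
# Minimal sketch of the proposed over-volt trip with failover.
# Thresholds and timings are illustrative assumptions, not real settings.

NOMINAL_V = 650.0          # nominal signalling supply voltage (volts)
OVERVOLT_LIMIT_V = 715.0   # assumed trip threshold (~+10%)
SPIKE_RIDE_THROUGH_S = 0.5 # ignore transients shorter than this

def select_supply(samples, dt=0.1):
    """Walk through voltage samples from the primary supply (one per dt
    seconds) and return, for each sample, which supply should feed the load."""
    active = "primary"
    over_time = 0.0
    feeds = []
    for v in samples:
        if active == "primary" and v > OVERVOLT_LIMIT_V:
            over_time += dt
            if over_time >= SPIKE_RIDE_THROUGH_S:
                active = "standby"   # sustained over-volt: fail over
        else:
            over_time = 0.0
        feeds.append(active)
    return feeds

# A short spike is ridden through; a sustained over-volt trips the changeover.
spike = [650, 730, 650, 650]                 # 0.1 s spike - tolerated
sustained = [650] + [740] * 8 + [650]        # 0.8 s over-volt - fails over
print(select_supply(spike))
print(select_supply(sustained))
# Note: this only helps if the standby supply is NOT over-volting at the
# same time - exactly the caveat in the post above.
```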
 

Belperpete

Established Member
Joined
17 Aug 2018
Messages
1,650
It appears to me that an over-volt protection trip needs to be placed on the incoming supplies (with relevant resilience to transient spikes), to trigger a failover to the redundant supplies when an over-volt occurs on one supply (although that relies on only one supply over-volting at a time!)
A traditional relay interlocking is relatively immune to voltage variations. What I think caused the problems was the electronic remote control systems that connect the interlocking to the control centre - their power supplies would have tripped and needed resetting.

The kind of over-volt trip on the incoming power supplies you propose would need to be faster and more sensitive than the remote controls' internal power supplies - there's no point in switching after the remote controls have shut down. This could be technically very difficult, to say the least. It would also mean that it would operate far more often, which would bring its own problems, as the brief power loss to the interlocking caused by the switching would cause all signals to revert to red whenever it happened.

Before someone suggests that could be solved by putting a UPS on the supply, remember that we are talking about 440 or 650 volt supplies, which the interlocking then feeds out trackside to power the point machines, signals and track circuits.
 

infobleep

Veteran Member
Joined
27 Feb 2011
Messages
12,669
My wife was waiting at London Bridge for the 1641 to Bedford when the power surge happened. Her train hadn't quite got to East Croydon - Traksy showed it being the next train waiting to enter ECR. She decided to head to King's Cross to see if she could get to somewhere like Arlesey instead, but all of those trains also needed to come through the Thameslink core so were also all delayed or cancelled. She then ran over to Euston and got on a Bletchley-bound train around 1750, finally arriving home (after a taxi from Millbrook) just before 8pm.

My journey home was also interesting. I left work around 7.15 and headed to Farringdon. There was barely anyone there and all the screens were showing the holding message "DISRUPTION! Screens will only show trains we know are running" - but with zero trains listed.

I looked on the National Rail app and found a northbound train that was 45 minutes away, and a southbound train that was 5 minutes away. I got on the southbound to Blackfriars (nicer place to wait than Farringdon!) but by that time the northbound train had been cancelled, and the next one according to the platform screens (which weren't in disruption mode) and NR app was something like 50 minutes away.

Out of interest I looked on RTT and could see a VSTP train slowly - VERY slowly! - making its way into London Bridge with stops listed to Bedford.

Every few minutes the announcer on the platform was apologising and specifically saying that the boards were accurate and the next train was at least 40 minutes away. Meanwhile this train on RTT was still crawling its way into LBG.

It then arrived into LBG and sat there for a few minutes. As it was 107 late I thought maybe they were going to turn it around and send it back southbound and that the NR app and screens were correct and RTT was wrong. But then it showed as passing some more signals and junctions between LBG and BFF and I did a little whoop to myself when I saw its lights coming round the curve into the station! The announcer was totally stumped and said something like "ladies and gentlemen, I have no idea where this train has come from or where it's going to or stopping at" to which the driver announced it was all stops to STP, then SAC and all stops to BDM. Result!

At Farringdon the screens were still showing the disruption message and their announcer was equally taken aback by the sudden presence of a train. At that point the screens changed to "this train is not for public use".

When I got to Flitwick (105L I believe) the train wasn't shown on the departure board either.

My question is: how can what is one of the most technologically up-to-date sections of railway (the Thameslink core) not know about a train about to pass through it? And how can the control team seemingly not know about a journey taking more than an hour and update the departure screens accordingly?

The train itself was empty in my carriage, maybe 10 people total. There would have been hundreds of spare seats, let alone standing room, for the thousands of people displaced that evening who were constantly being told there were no trains.
In addition to that, how is it that one station has its screens in disruption mode but the other doesn't?
 

MarkyT

Established Member
Joined
20 May 2012
Messages
6,257
Location
Torbay
A traditional relay interlocking is relatively immune to voltage variations. What I think caused the problems was the electronic remote control systems that connect the interlocking to the control centre - their power supplies would have tripped and needed resetting.
In a power switch-over scenario, relay interlockings and basic TDM field units usually happily restart fully functional after signals drop to red or go dark. It is complex processor-based systems that sometimes don't properly reboot automatically, and in the case of SSIs and similar they sometimes blow their software fuses, so UPS systems are necessary to keep these going through a problem. SSIs are usually all grouped together at the control centre though, so there's usually a tech nearby to intervene, unlike relay installations, which are distributed around the network at their respective junctions and stations.
Before someone suggests that could be solved by putting a UPS on the supply, remember that we are talking about 440 or 650 volt supplies, which the interlocking then feeds out trackside to power the point machines, signals and track circuits.
Ideally some kind of power conditioning is required somewhere in the system that could isolate critical complex systems (including the UPS) from the incoming overvoltage supply before switching to the UPS. I doubt if that could be truly continuous though, which could be a problem.

I have never experienced or heard of this kind of incident in my railway career. I'll be very interested to learn how overvoltage occurred in such a huge capacity supply, assuming the equipment was being fed from one of the two traction derived supplies at the time. Did the traction rail voltage also go up significantly, and how did on board systems cope with that? I suppose at least on a train there's always a driver present who can reset such systems if they trip. Overvoltage occurs on smaller scale domestic supplies apparently where there's a lot of small scale feed in; my sister had an appliance failure blamed on this a few years ago.
 

Parham Wood

Member
Joined
13 Jun 2011
Messages
331
Well, the failsafe systems on NR equipment seem to have worked as they should. It would have been a major disaster if the over-voltage had fried a lot of signalling equipment. The main cause for concern, as stated in the NR briefing, is why there was such a prolonged over-voltage from the power company. The power network does seem to be less stable these days; was it an under-voltage that tripped trains in North London a few months back? Perhaps it is the mix of different methods of generation, with their different times to come on or go off line, that causes problems.

Whilst on the subject of UPSs (which do not seem appropriate in this situation), I know from experience of using small 240V UPSs to support communication equipment that they caused more outages than any mains failure did on some sites. The sites had generators and these had to be tested monthly. There was something in the switch-over to or from the generators that caused the UPSs to lock up in a power-off state from time to time, and they needed a manual reset (not good if they were on an unmanned site 40 miles away). It could have been something to do with the power distribution on the site combined with the voltage variations due to the generators switching in or out. They were top-of-the-range UPSs as well. Hopefully any NR UPSs, if deployed, will be in a well-designed network.
 

Deepgreen

Established Member
Joined
12 Jun 2013
Messages
6,394
Location
Betchworth, Surrey
In a power switch-over scenario, relay interlockings and basic TDM field units usually happily restart fully functional after signals drop to red or go dark. It is complex processor-based systems that sometimes don't properly reboot automatically, and in the case of SSIs and similar they sometimes blow their software fuses, so UPS systems are necessary to keep these going through a problem. SSIs are usually all grouped together at the control centre though, so there's usually a tech nearby to intervene, unlike relay installations, which are distributed around the network at their respective junctions and stations.

Ideally some kind of power conditioning is required somewhere in the system that could isolate critical complex systems (including the UPS) from the incoming overvoltage supply before switching to the UPS. I doubt if that could be truly continuous though, which could be a problem.

I have never experienced or heard of this kind of incident in my railway career. I'll be very interested to learn how overvoltage occurred in such a huge capacity supply, assuming the equipment was being fed from one of the two traction derived supplies at the time. Did the traction rail voltage also go up significantly, and how did on board systems cope with that? I suppose at least on a train there's always a driver present who can reset such systems if they trip. Overvoltage occurs on smaller scale domestic supplies apparently where there's a lot of small scale feed in; my sister had an appliance failure blamed on this a few years ago.
Interestingly (perhaps!) our freezer went into alarm mode just before midnight a few days ago, which I believe was the result of a major voltage fluctuation (according to the information I could glean online about the fault code). It was the first time in the nine years we've had it that it's done this, and it's been fine again since. We live near Dorking, Surrey. I wonder if there have been some quite widespread voltage blips across Surrey/London during the last week, one of which caused the signal system failure(s)? Probably completely unrelated, but...
 

MarkyT

Established Member
Joined
20 May 2012
Messages
6,257
Location
Torbay
Ideally some kind of power conditioning is required somewhere in the system that could isolate critical complex systems (including the UPS) from the incoming overvoltage supply before switching to the UPS. I doubt if that could be truly continuous though, which could be a problem.
Sorry to reply to my own post, but I've thought about that some more. All the extra voltage-monitoring device needs to do is switch off the supply downstream, which would look just like a regular power-off event as far as the UPS is concerned, so it should automatically cut over to battery as usual, subject to its own dependability of course. The excess-voltage-over-time threshold of the new device needs to be set to trip at a lower volt-time function than the systems protected downstream (like fuse discrimination), and the input side needs to be significantly over-rated voltage-wise so as not to be damaged itself by extreme events. One has to wonder why the supply from the grid itself doesn't come with such protection built in.
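The "lower volt-time function" point is essentially fuse-style discrimination applied to over-voltage. Here is a hedged sketch, using made-up curves, of how one could check that a hypothetical upstream device always trips before the downstream power supplies it is protecting:

```python
# Discrimination check between a proposed upstream over-volt trip and the
# downstream electronic power supplies it protects. Both curves are made-up
# assumptions: each returns "trips if the overvoltage persists this long".

def upstream_trip_time(overvolt_pct):
    """Assumed volt-time curve of the new monitoring device: the bigger the
    overvoltage, the faster it trips."""
    return 2.0 / overvolt_pct          # e.g. 10% over -> 0.2 s

def downstream_psu_trip_time(overvolt_pct):
    """Assumed withstand curve of the remote-control PSUs downstream."""
    return 5.0 / overvolt_pct          # e.g. 10% over -> 0.5 s

# For discrimination, the upstream device must trip first at every
# plausible level of overvoltage (like grading fuses).
for pct in (5, 10, 20, 50):
    up = upstream_trip_time(pct)
    down = downstream_psu_trip_time(pct)
    ok = up < down
    print(f"{pct:>3}% over: upstream {up:.2f}s, PSU {down:.2f}s, "
          f"{'OK' if ok else 'NO DISCRIMINATION'}")
```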
 

FOH

Member
Joined
17 Oct 2013
Messages
712
In addition to that, how is it that one station has its screens in disruption mode but the other doesn't?
Yep, as I said, all trains were showing as On Time. As their departure time arrived they just disappeared off the screen one by one.
 

Belperpete

Established Member
Joined
17 Aug 2018
Messages
1,650
I have never experienced or heard of this kind of incident in my railway career.
Ditto. 20 seconds is an incredibly long duration for such an event.

Unfortunately, the move away from traditional fossil fuels and towards greener alternatives is making the grid far more unreliable than it used to be. The large rotating generators associated with fossil-fuel-burning plants gave the grid an inertia that solar and similar sources don't. And the plethora of local feed-ins from local solar and wind generation makes it harder to control localised voltage fluctuations. As the move toward greener energy continues, I think things are only going to get worse. Don't get me wrong, I am not advocating a move back to fossil fuels, but the disadvantages of the alternatives do need to be recognised, and money spent on mitigating them.

What I hope is that the level of disruption this "once in a lifetime" event caused doesn't put undue pressure on to "do something" to ensure it can't happen again, if that "something" causes even more disruption both to implement and in its continued operation.
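The inertia point above can be made quantitative: for a sudden generation/load imbalance, the initial rate of change of frequency is inversely proportional to the system inertia constant H. A small illustrative sketch follows; the imbalance and system size are assumed figures, not actual GB grid data.

```python
# Initial rate of change of frequency (RoCoF) after a sudden power
# imbalance, from the swing equation: RoCoF = dP * f0 / (2 * H * S).
# Numbers below are illustrative assumptions, not actual grid data.

F0_HZ = 50.0                # nominal grid frequency
IMBALANCE_MW = 1000.0       # assumed sudden loss of generation
SYSTEM_MVA = 30000.0        # assumed total rating of synchronous plant

def rocof(h_seconds):
    """Initial df/dt in Hz/s for a system inertia constant H (seconds)."""
    return IMBALANCE_MW * F0_HZ / (2.0 * h_seconds * SYSTEM_MVA)

for h in (6.0, 4.0, 2.0):   # high, medium and low inertia systems
    print(f"H = {h} s  ->  initial RoCoF ~ {rocof(h):.2f} Hz/s")
# The same disturbance moves the frequency much faster on a low-inertia
# grid, which is the loss-of-inertia effect described in the post above.
```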
 

Belperpete

Established Member
Joined
17 Aug 2018
Messages
1,650
Sorry to reply to my own post, but I've thought about that some more. All the extra voltage-monitoring device needs to do is switch off the supply downstream, which would look just like a regular power-off event as far as the UPS is concerned, so it should automatically cut over to battery as usual, subject to its own dependability of course. The excess-voltage-over-time threshold of the new device needs to be set to trip at a lower volt-time function than the systems protected downstream (like fuse discrimination), and the input side needs to be significantly over-rated voltage-wise so as not to be damaged itself by extreme events. One has to wonder why the supply from the grid itself doesn't come with such protection built in.
You are effectively proposing what aleggatta proposed previously. As you say, this kind of over-volt trip on the incoming main high-voltage power supplies would need to be faster and more sensitive than that on all the downstream internal electronic power supplies - no point in switching after those have shut down. This would be technically very difficult, to say the least. Worse, such a sensitive trip would likely operate quite frequently, leading to the supply being far less reliable than it is now.

So, you would prevent the "once in a lifetime" shutdown that was experienced. But what you would have replaced it with is something that causes far more frequent momentary power-losses. As you say, the relay interlocking and remote control systems will restart by themselves after these momentary power-losses, but they will all still have led to the relay interlocking replacing all signals to red. You have to be careful that the cure isn't worse than what it is curing.

It is also worth bearing in mind that there are a plethora of other things that could stop all trains in the Gloucester Road Junction area, some far more likely to happen. In relay interlockings of that era, there are some key relays and circuits that if they fail, will bring the whole interlocking to a halt. For example, I seem to recall a button fault on the East Croydon panel ring that stopped all routes being set for a couple of hours.
 

MarkyT

Established Member
Joined
20 May 2012
Messages
6,257
Location
Torbay
You are effectively proposing what aleggatta proposed previously. As you say, this kind of over-volt trip on the incoming main high-voltage power supplies would need to be faster and more sensitive than that on all the downstream internal electronic power supplies - no point in switching after those have shut down. This would be technically very difficult, to say the least. Worse, such a sensitive trip would likely operate quite frequently, leading to the supply being far less reliable than it is now.

So, you would prevent the "once in a lifetime" shutdown that was experienced. But what you would have replaced it with is something that causes far more frequent momentary power-losses. As you say, the relay interlocking and remote control systems will restart by themselves after these momentary power-losses, but they will all still have led to the relay interlocking replacing all signals to red. You have to be careful that the cure isn't worse than what it is curing.

It is also worth bearing in mind that there are a plethora of other things that could stop all trains in the Gloucester Road Junction area, some far more likely to happen. In relay interlockings of that era, there are some key relays and circuits that if they fail, will bring the whole interlocking to a halt. For example, I seem to recall a button fault on the East Croydon panel ring that stopped all routes being set for a couple of hours.

You're quite right: adding extra complexity, even with the very best intentions, often leads to lower overall availability and new failure modes. As you suggest, ultimately the power grid operators need to solve the underlying problem. Perhaps the grid needs heavy rotary converters built in that can replace the inertia of the old generators, and flywheel storage at various levels in the hierarchy that can absorb any oversupply.
 

aleggatta

Member
Joined
28 Sep 2015
Messages
545
A traditional relay interlocking is relatively immune to voltage variations. What I think caused the problems was the electronic remote control systems that connect the interlocking to the control centre - their power supplies would have tripped and needed resetting.
Just another thought - would the remote control systems have redundant power supplies (like in a server, for example)? And could both power supplies be fed from separate supplies, whether that be two DNO supplies or a traction/signalling supply, with over-volt protection on the primary side shutting down the individual PSU rather than losing the whole piece of kit? (I am aware that this might already be the case, and that both power supplies might have shut down reacting to the same fault on the same DNO circuit.)
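As a thought experiment only (nothing here reflects real signalling equipment), a small Python sketch of why dual PSUs only help if their feeds don't share the fault:

```python
# Thought experiment: a remote-control unit with two PSUs, each fed from a
# different incoming supply, where each PSU shuts down if its own feed
# over-volts. Entirely hypothetical - not a model of real signalling kit.

OVERVOLT_LIMIT_V = 715.0   # assumed PSU shutdown threshold

def unit_stays_up(feed_a_volts, feed_b_volts):
    """The unit keeps running as long as at least one PSU has a healthy feed."""
    psu_a_ok = feed_a_volts <= OVERVOLT_LIMIT_V
    psu_b_ok = feed_b_volts <= OVERVOLT_LIMIT_V
    return psu_a_ok or psu_b_ok

# Independent supplies: only one feed over-volts, the unit rides through.
print(unit_stays_up(760.0, 650.0))   # True

# Common-mode fault: both feeds come from the same faulty DNO circuit, so
# both PSUs see the over-volt and shut down together - the caveat
# acknowledged in the post above.
print(unit_stays_up(760.0, 760.0))   # False
```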
 

MarkyT

Established Member
Joined
20 May 2012
Messages
6,257
Location
Torbay
Just another thought - would the remote control systems have redundant power supplies (like in a server, for example)? And could both power supplies be fed from separate supplies, whether that be two DNO supplies or a traction/signalling supply, with over-volt protection on the primary side shutting down the individual PSU rather than losing the whole piece of kit? (I am aware that this might already be the case, and that both power supplies might have shut down reacting to the same fault on the same DNO circuit.)
In my experience a whole equipment room can be switched from one supply to another, but all systems within are connected only to the active supply. If an individual system had a second power module, I suspect the over-voltage would simply be switched across to it on failure, and of course that would trip as well. If it was already 'hot' it would have blown at the same time anyway. Also, we don't know whether the over-voltage was present on only one or all of the available supplies.
 

Jona26

Member
Joined
2 Jan 2013
Messages
273
Location
West Sussex
Power went off again at Gatwick tonight. Trains were non-stopping for a while. NRE said that the generators used previously were not available. Still parked outside though.
 

Hugo Smith

New Member
Joined
25 Sep 2019
Messages
3
Location
Dorking
But in the detail of the NR text it says:


“Three areas of signal control shut themselves down:

  • East Croydon
  • Selhurst
  • Norwood Junction
The signalling power supply at Streatham also shut itself down.”

So that rules out the Streatham Junctions and the route via Mitcham Junction for a time at least.

They should’ve run an Epsom to Horsham (possibly only to Dorking) shuttle, as they have done on the very rare occasion.
 

tsr

Established Member
Joined
15 Nov 2011
Messages
7,400
Location
Between the parallel lines
Power went off again at Gatwick tonight. Trains were non-stopping for a while. NRE said that the generators used previously were not available. Still parked outside though.

Basically, the generator is occasionally shutting down (to be investigated...) but can't restart due to the immediate load from the escalators and lifts around the station. It's still physically there, it just won't boot up, and it's not quite as simple as just unplugging various things and hoping for the best, as only very very few contractor staff have competency for the switching system.

This is not specifically to do with the Croydon issues but does relate to UK Power Networks still being unable to supply power to the station for some considerable time now - basically several days. This is virtually unprecedented for a major GTR station of this level of importance. UKPN also appear to have been involved in a few other massive railway mishaps recently, as well as some smaller ones, some not mentioned on here. I can imagine they are facing serious questions from Network Rail at director / CEO level.
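On the restart problem: motor loads such as escalators and lifts draw several times their running current while starting, so a generator that carries the station comfortably once everything is running may still be unable to pick the whole load up at once. A purely illustrative sketch with assumed figures (not the actual Gatwick installation):

```python
# Why a generator that can run a station may not be able to restart it:
# motor loads (lifts, escalators) draw several times their running current
# at start-up. All figures are illustrative assumptions about a station load.

GENERATOR_KVA = 500.0        # assumed generator rating
INRUSH_FACTOR = 6.0          # typical direct-on-line motor starting multiple

running_loads_kva = {
    "escalators": 150.0,     # assumed combined running load
    "lifts": 60.0,
    "lighting_and_it": 120.0,
}
motor_loads = {"escalators", "lifts"}

running_total = sum(running_loads_kva.values())
start_total = sum(
    kva * (INRUSH_FACTOR if name in motor_loads else 1.0)
    for name, kva in running_loads_kva.items()
)

print(f"Running load: {running_total:.0f} kVA  (generator {GENERATOR_KVA:.0f} kVA: OK)")
print(f"Simultaneous restart demand: {start_total:.0f} kVA  (generator overloaded)")
# Hence the need for controlled, staged reconnection by staff competent on
# the switching system, rather than simply powering everything up at once.
```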
 

TrainGeekUK

Member
Joined
15 Jun 2019
Messages
109
Absolute chaos at Gatwick again... platforms 3-6 are out of commission and only a handful of trains stopping. Just got back from there and it’s a nightmare. I feel sorry for the people trying to start their Xmas getaways, with everything that’s gone on today with regard to the weather in Sussex.
 

Jona26

Member
Joined
2 Jan 2013
Messages
273
Location
West Sussex
Basically, the generator is occasionally shutting down (to be investigated...) but can't restart due to the immediate load from the escalators and lifts around the station. It's still physically there, it just won't boot up, and it's not quite as simple as just unplugging various things and hoping for the best, as only very very few contractor staff have competency for the switching system.

This is not specifically to do with the Croydon issues but does relate to UK Power Networks still being unable to supply power to the station for some considerable time now - basically several days. This is virtually unprecedented for a major GTR station of this level of importance. UKPN also appear to have been involved in a few other massive railway mishaps recently, as well as some smaller ones, some not mentioned on here. I can imagine they are facing serious questions from Network Rail at director / CEO level.

Thanks for the explanation- appreciated.
 

Belperpete

Established Member
Joined
17 Aug 2018
Messages
1,650
Just another thought - would the remote control systems have redundant power supplies (like in a server, for example)? And could both power supplies be fed from separate supplies, whether that be two DNO supplies or a traction/signalling supply, with over-volt protection on the primary side shutting down the individual PSU rather than losing the whole piece of kit? (I am aware that this might already be the case, and that both power supplies might have shut down reacting to the same fault on the same DNO circuit.)
Most modern remote control systems do have a second PSU, but this is in case the first one fails. As MarkyT says, they are both fed from the interlocking power supply, on the basis that if the interlocking hasn't got any power then there is no point in having any remote control.

The other thing to bear in mind is that, on the old Southern Region, the switching between main power supplies is not done in the interlocking; it is usually done in the electrical substation under the control of M&E (or whatever they are called these days - they have a SCADA system for monitoring and remote safe-switching of power supplies; S&T don't). This means there is usually only one power supply present in the interlocking, so it would not be a simple matter to provide two separate supplies for the remote control.

Note: there are usually two feeder cables from the sub-station to the interlocking, but the second cable is only there in case of a fault on the first cable. Both cables are usually fed from the same supply at the sub-station. Switching from one incoming cable to the other in the interlocking is done manually, by pulling and inserting fuses. Again, S&T don't have a system for remote safe-switching of power supplies.

The above applies in the third-rail areas on the Southern. Obviously things are different where there is no traction supply, and so no sub-stations to get power from.
 

Deepgreen

Established Member
Joined
12 Jun 2013
Messages
6,394
Location
Betchworth, Surrey
I dared to venture out today and the signals promptly fell over again - Redhill trains were diverted via Quarry for a time. I had to think on my feet and catch a reversing Southern train (turned short of Reigate), but others, confronted with screens simply saying everything was cancelled, appeared to be giving up. Apparently, things ran through again after ten or fifteen minutes! I have to say, for the extraordinary amount of engineering work and money spent on the BML, it still seems disturbingly fragile.
 

Bald Rick

Veteran Member
Joined
28 Sep 2010
Messages
29,218
Basically, the generator is occasionally shutting down (to be investigated...) but can't restart due to the immediate load from the escalators and lifts around the station. It's still physically there, it just won't boot up, and it's not quite as simple as just unplugging various things and hoping for the best, as only very very few contractor staff have competency for the switching system.

This is not specifically to do with the Croydon issues but does relate to UK Power Networks still being unable to supply power to the station for some considerable time now - basically several days. This is virtually unprecedented for a major GTR station of this level of importance. UKPN also appear to have been involved in a few other massive railway mishaps recently, as well as some smaller ones, some not mentioned on here. I can imagine they are facing serious questions from Network Rail at director / CEO level.

I’ve heard it suggested that the main fault at Gatwick is in a cable directly underneath the flight path; repairing it is a bit tricky, one imagines!
 