Overrun of planned works CRQ 65698 causing a loss of service::WO0000000054310
MINOR Closed TTB-Outages
STATUS
Closed
TYPE
TalkTalk Outage
STARTED
Sep 15, 05:38 AM (7 days ago)
CLOSED
Sep 15, 03:40 PM (6½ days ago)
REFERENCE
37301 / INC13556157
INFORMATION
  • INITIAL
    7 days ago

    Summary
    It has been reported that the planned works CRQ 65698 upgrade to the layer 2 DSL platform at Telehouse North is delayed. This is causing a loss of service to a number of large Partner estates.
    
    N/A

  • UPDATE
    7 days ago

    Latest Update A bridge call is ongoing between our NOC, Core Network Operations & Network Service Ops Management. The NOC are collating a detailed list of impacted customers by Partner. Senior engineers remain on site and are working to resolve the issue.

  • UPDATE
    6¾ days ago

    Latest Update A bridge call is ongoing between our NOC, Core Network Operations & Network Service Ops Management. It has been agreed to roll back the software upgrade; the roll back started at 08:25hrs and is expected to take approximately 50 minutes to complete. Further updates on the progress of the roll back and confirmation of restoration will continue to be provided.

  • UPDATE
    6¾ days ago

    Latest Update The bridge call is ongoing between our NOC, Core Network Operations & Network Service Ops Management. The roll back of the software upgrade has been completed on 4 switches, with 1 switch remaining. Further updates on the progress of the roll back on the final switch, along with the next steps for restoring and testing service, will continue to be provided.

  • UPDATE
    6¾ days ago

    Latest Update The bridge call is ongoing between our NOC, Core Network Operations & Network Service Ops Management. The roll back of the software upgrade has been completed on all 5 switches. Network Support have brought up some internal services (the IS-IS routing protocol) and are currently completing the remainder. Once these are confirmed as stable, work will begin to bring up the internal and Partner-facing interfaces in a controlled manner. No confirmed ETA can be provided at this time; however, further updates on the progress of restoring and testing service will continue to be provided.

  • UPDATE
    6¾ days ago

    Latest Update The bridge call is ongoing between our NOC, Core Network Operations & Network Service Ops Management. Several Partner services have been brought back up and confirmed as working since approximately 11:00hrs. The remaining Partner services are continuing to be brought up in a controlled manner while the associated network equipment is being monitored. Further updates on the progress of the restoration work for the remaining Partners will continue to be provided.

  • UPDATE
    6¾ days ago

    Latest Update The bridge call is still open between our NOC & Assurance teams. The majority of Partner services have now been brought back up and traffic levels across our Wholesale platform have been confirmed as BAU. Investigations are ongoing with the Assurance and NOC teams into 3 individual ports impacting 3 Partners. Updates on the progress of these investigations will continue to be provided.

  • UPDATE
    6¾ days ago

    Latest Update The bridge call is still open between our NOC & Assurance teams. All Partner services have now been brought back up and traffic levels across our Wholesale platform have been confirmed as BAU. Investigations are ongoing with the Assurance and NOC teams into 1 individual port impacting 1 Partner. This may be caused by an unrelated issue, but this needs to be confirmed before restoration. Updates on this final aspect will continue to be provided.

  • RESOLUTION
    6½ days ago

    Technical / Suspected Root Cause Due to planned works, the TTB Wholesale DSL platform in Telehouse North was out of service. This impacted 30 Wholesale Partners who route solely via Telehouse North. Partners with resilient services were not impacted and will have rerouted during the planned work. Across the Wholesale base (~190k customers), around 9,000 customers were without data services. Partner services have been brought back up and traffic levels across our Wholesale platform have been confirmed as BAU since approximately 13:19hrs. This incident was caused by planned works CRQ 65698; the associated software upgrades made as part of the change were rolled back to restore service. The incident will now be resolved, as all Partner services are available and BAU with no further issues reported or identified.

  • Closed