Latency still a significant hurdle for remote production

By Joy Tang

When data takes too long to travel across the network, the broadcast of a remotely produced event can suffer. Viewers accustomed to high-speed connections are all the more unforgiving of delays.

“The biggest hurdle to widespread adoption of remote production is the challenge of latency – especially in the context of premium live sports,” observed Greg de Bressac, Grass Valley’s Vice President Sales, APAC.

“With most fans also following along on social media, even a single-second delay is unacceptable.

“Audiences want to feel as close to the action – if not closer – as they would in the venue, so there’s a strong drive towards higher resolutions and more camera angles to deliver an ever-richer fan experience.

“In turn, this will put more significant stress on the network and available bandwidth.”

Better encoding

One workaround is to use technology that sends less data over the network to achieve the same outcome. According to de Bressac, more efficient encoding technologies such as JPEG 2000, JPEG XS and MPEG offer an attractive alternative to transporting uncompressed feeds.

Such technology can deliver ultra-low delays, comparable to transporting the signal over fibre, at lower cost, while still maintaining the viewers’ quality of experience.
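
To put rough numbers on that, the short calculation below compares an uncompressed HD camera feed with the same feed compressed at a ratio typical of JPEG XS. The figures – 1080p60 4:2:2 10-bit, a 10:1 ratio and a ten-camera event – are illustrative assumptions rather than numbers from Grass Valley.

# Back-of-the-envelope contribution bandwidth, per camera feed.
# All figures are illustrative assumptions, not quoted in the article.

def uncompressed_bitrate(width, height, fps, bits_per_pixel):
    """Active-video bitrate in bits per second."""
    return width * height * fps * bits_per_pixel

# 1080p60, 4:2:2 10-bit averages roughly 20 bits per pixel
hd = uncompressed_bitrate(1920, 1080, 60, 20)   # ~2.5 Gbps
jpeg_xs = hd / 10                               # assume a ~10:1 visually lossless ratio

print(f"Uncompressed: {hd / 1e9:.2f} Gbps per camera")
print(f"JPEG XS (~10:1): {jpeg_xs / 1e6:.0f} Mbps per camera")
print(f"Ten cameras over JPEG XS: {10 * jpeg_xs / 1e9:.1f} Gbps total")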

Grass Valley’s own Direct IP technology, which delivers reliable live signals across distances of up to 20,000 kilometres, is another solution to latency concerns.
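
For a sense of scale, the sketch below works out the bare propagation delay over a 20,000-kilometre fibre path. The refractive index of about 1.47 is a typical assumed value, and real links add switching, encoding and buffering delay on top of this physical floor.

# Physical propagation floor for a 20,000 km fibre path.
# The refractive index and the distance-only model are assumptions;
# real paths add switching, encoding and buffering delay on top.

SPEED_OF_LIGHT_KM_S = 299_792      # km/s in vacuum
FIBRE_REFRACTIVE_INDEX = 1.47      # typical single-mode fibre

def propagation_delay_ms(distance_km):
    speed_in_fibre = SPEED_OF_LIGHT_KM_S / FIBRE_REFRACTIVE_INDEX
    return distance_km / speed_in_fibre * 1000

print(f"{propagation_delay_ms(20_000):.0f} ms one-way over 20,000 km")   # ~98 ms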

“The use cases we have for Direct IP clearly demonstrate that at-home production is a viable and highly-effective alternative to a traditional outside broadcasting (OB) approach,” de Bressac said.

“We have seen a lot of momentum towards the adoption of IP infrastructures and workflows in the region, which underpin and open the way for remote production. Outside broadcast companies and production services providers are also gearing up to support a summer of sports that will involve a significant remote component in order to ensure staff safety.”

Distributed workflows

Grass Valley is already working with several customers in the region to introduce workflows that support remote and centralised work.

“We are working with a customer to set up a 100% remote production solution for live sports production in the Oceania region with only cameras and camera operators at the venues,” de Bressac shared.

“The production infrastructure will be at a different location and the control surfaces and operators will be at yet another location or locations. This distributed workflow will change the game.”

Trade-offs needed

Kevin Mockford, MediaKind’s Director, Contribution Processing & Product Management, also identified latency as an obstacle to remote production, but emphasised that remote production has many aspects, each with its own challenges and success factors.

“For example, the remote production of camera pan and zoom shots requires ultra-low-latency network infrastructures. This can then have a significant knock-on impact on the compression technology used (if used at all), the required connectivity bandwidth and, therefore, the cost,” he pointed out.

At the same time, rules of thumb may not apply, Mockford added. “There is often an assumption that sub-100 millisecond transmission latency is required for a remote production. However, this is not always the case,” he noted.

"In
fact, remote production has even been successfully carried out using satellite
links, which typically provides transmission latency of 500 milliseconds or
greater. So, broadcasters should carefully consider the trade-offs in terms of
what is handled remotely and what is not to ensure they’re getting the best
compromise of cost and performance."
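
That half-second figure is roughly what geostationary geometry dictates before any processing is added. The sketch below assumes a slant range of about 38,000 kilometres (the exact path length depends on where the ground stations sit); modulation, forward error correction and any return hop push the end-to-end figure higher.

# Why a geostationary satellite hop costs hundreds of milliseconds.
# The slant range is an assumption; it varies with the ground stations' positions.

SPEED_OF_LIGHT_KM_S = 299_792
SLANT_RANGE_KM = 38_000            # rough uplink or downlink path length

one_hop_ms = 2 * SLANT_RANGE_KM / SPEED_OF_LIGHT_KM_S * 1000
print(f"Single up/down hop: ~{one_hop_ms:.0f} ms")   # ~250 ms before any processing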

Working in tandem

Synchronisation for cloud-based workflows is another consideration.

“Some of the main challenges we’ve seen involve complexities surrounding the move from ‘traditional’ remote production, where all the video and audio feeds are transmitted from the venue to a central production facility, to a ‘distributed production’,” Mockford said.

“In this instance, the production team is not co-located, and the production tools are likely to be software running in a public cloud instance. The challenge here is getting the live video, audio and data into the public cloud reliably, with sufficiently low delay, while maintaining synchronisation between them all.”

Mockford further elaborated that reliability covers both error-free transmission and protection against complete failure of the link. “Protocols such as SRT (Secure Reliable Transport) can provide error correction, and the use of separate network connections can protect against total failure of the network connection.
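
The receive latency a broadcaster configures for an SRT link directly bounds how much loss can be repaired, because a lost packet can only be retransmitted if a full round trip still fits inside the remaining budget. The toy function below illustrates that trade-off; it is not the libsrt API, and the example figures (a 120 ms budget against 30 ms and 200 ms round-trip times) are assumptions.

# Toy illustration (not the libsrt API) of the latency-budget idea behind
# retransmission-based protocols such as SRT: a lost packet is recoverable
# only if a retransmit round trip still fits before its playout deadline.

def recoverable(latency_budget_ms: float, rtt_ms: float, elapsed_ms: float) -> bool:
    """True if a retransmission requested now would still arrive in time."""
    return elapsed_ms + rtt_ms <= latency_budget_ms

# A 120 ms budget over a 30 ms-RTT path leaves room for recovery attempts;
# the same budget over a 200 ms path leaves room for none.
print(recoverable(latency_budget_ms=120, rtt_ms=30, elapsed_ms=40))    # True
print(recoverable(latency_budget_ms=120, rtt_ms=200, elapsed_ms=0))    # False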

"Timing
synchronisation challenges can include having the same system clock reference
across the whole production workflow and maintaining frame alignment across all
the video feeds."

New solutions needed

Mockford said the SMPTE ST 2110 suite of standards, with its use of PTP (Precision Time Protocol) as profiled in SMPTE ST 2059, is one approach to dealing with these challenges. “It is perhaps a very traditional approach, taking the established practices from the world of SDI workflows and adapting them for IP-based workflows.

“However, it is not very easy to implement in a public cloud environment. So, an alternative approach that uses less onerous real-time timing requirements while still delivering the needed synchronisation is key. MediaKind, among others, is developing techniques to do this,” he shared.
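
PTP itself rests on a two-way exchange of timestamps between a grandmaster clock and each device. The sketch below shows the textbook offset and mean-path-delay arithmetic, assuming a symmetric network path; it is not an implementation of SMPTE ST 2059, and the example timestamps are invented.

# The two-way timestamp exchange at the heart of PTP (IEEE 1588), which
# SMPTE ST 2059 profiles for media. Assumes a symmetric network path.

def ptp_offset_and_delay(t1, t2, t3, t4):
    """t1: master sends Sync, t2: slave receives it,
    t3: slave sends Delay_Req, t4: master receives it (all in seconds)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2          # slave clock minus master clock
    mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, mean_path_delay

# Invented example: the slave clock runs 1.5 ms ahead over a 0.5 ms path.
offset, delay = ptp_offset_and_delay(t1=10.000, t2=10.002, t3=10.010, t4=10.009)
print(f"offset = {offset*1000:.1f} ms, path delay = {delay*1000:.1f} ms")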
