
TM Broadcast International #150, February 2026



SHOWS

The strategic role of ISE in today’s broadcast market

INDUSTRY VOICES

EAMONN CURTIN, CCO of Gravity Media

EXPLAINER

OB units broaden their scope

TEST AREA

Blackmagic’s Collaborative Environment

EDITORIAL

ISE offers insight into the direction of broadcast

The latest edition of ISE once again broke records in terms of visitors, exhibitors and exhibition space. TM BROADCAST was present, as every year, engaging in conversations across stands, corridors and conference rooms with some of the leading players in the professional audiovisual ecosystem.

One of the underlying questions we reflected on was: what strategic relevance does ISE hold today for the broadcast market? The message we gathered was virtually unanimous. ISE has consolidated its position as a space where broadcast is no longer viewed in isolation, but rather as part of a broader, hybrid and global audiovisual ecosystem. This, precisely, is its distinctive appeal.

If it had to be summarised in a single word, that word would be convergence. On the one hand, sectors traditionally removed from broadcast — corporate, live events, institutional environments — are now demanding production values comparable to those of television. Multicamera streaming, cinematic aesthetics, remote workflows… what was once exclusive to the broadcast domain has become increasingly accessible to other verticals.

Editor in chief

Javier de Martín editor@tmbroadcast.com

Creative Direction

Daniel Esparza press@tmbroadcast.com

Editorial Staff

Bárbara Ausín

Carlos Serrano

Key account manager

Patricia Pérez ppt@tmbroadcast.com

On the other hand, IT logic is establishing itself as a common language. IP, software-defined architectures, cloud and virtualisation are no longer the exclusive domain of large production centres. Their influence now extends across the entire professional audiovisual landscape, albeit adapted to environments that require greater operational simplicity.

And here a third key element emerges: the need for more intuitive and accessible solutions. The market is no longer composed solely of highly specialised broadcast engineers. New professional profiles are operating advanced technologies and demand tools that are powerful yet user-friendly, secure yet agile. Artificial intelligence — from autofocus to automatic tracking and process automation — is accelerating this democratisation.

Ultimately, content production is expanding in volume, quality and diversity of stakeholders. More content is being produced, from more locations and for more platforms. And what once required complex infrastructures can now, in certain contexts, be deployed under different models — more flexible and scalable. ISE thus stands as the space where broadcast can anticipate where the industry’s centre of gravity is shifting. At TM BROADCAST, we will continue to follow this evolution closely.

Graphic Design and Layout

Sorex Media

Ana Guijarro

Administration

Laura de Diego

administration@tmbroadcast.com

Published in Spain ISSN: 2659-5966


SUMMARY

6 News

20 SHOWS

Post-ISE special: What strategic relevance does the show hold today for the broadcast market?

We gathered insights from a selection of broadcast exhibitors, representing different profiles and company sizes, to analyse the role ISE currently plays in their strategies and the main themes that shaped the conversation at this year’s edition.

30 TRENDS

C2PA: When credibility is more than a value — is it possible to certify truthfulness?

In the face of the flood of AV content created with AI — and not always with good intentions — is there any formula to distinguish what is authentic from what is not?

INDUSTRY VOICES

Eamonn Curtin, CCO of Gravity Media

“We need to look beyond traditional markets because of new players entering the space”

EXPLAINER

Audiovisual production on wheels: OB units broaden their scope

Mobile production units are increasingly present in new production environments beyond traditional sporting events: from television series to corporate presentations or concerts.

TEST AREA

Blackmagic’s Collaborative Environment:

Much more than a cloud with DaVinci Resolve

Versatile enough to adapt to a multitude of environments and situations. Let’s see what functionalities it offers and how to maximise the benefits of this ecosystem

ISE 2026 wraps up a record edition with 92,170 visitors, up 8% on 2025

Integrated Systems Europe (ISE) 2026 has officially closed its doors following four days dedicated to the AV and systems integration industry. ISE 2026 broke all records, welcoming 92,170 visitors from around the world. A total of 1,751 exhibitors, including 323 making their debut at the event, contributed to the largest total show floor space in its history at 101,000 sqm, as it has claimed in a statement.

By Tuesday, the event welcomed 55,156 unique attendees, a 10% increase compared with Tuesday 2025 (49,981). By Wednesday, total unique attendance across all show days reached 76,035, while the show recorded its largest single day ever, with 64,198 visitors onsite, almost equalling the pre-pandemic record attendance of an entire ISE show (64,908 at ISE 2019 in Amsterdam). Momentum continued through Thursday, as total unique attendance climbed to 87,648, officially marking the highest attendance in ISE history. Overall, ISE 2026 achieved an 8% increase in total attendance compared with 2025. The total number of registrations reached 120,914 with over 212,000 visits across the four days.

“As ISE 2026 comes to a close, I’m truly inspired by the passion and strength of our community”, shares Mike Blackman, Managing Director of Integrated Systems Events. “Over four extraordinary days, we celebrated groundbreaking technology, ignited bold ideas, forged lasting connections, and set new benchmarks for our industry. What excites me most is the creativity, energy, and diversity of our exhibitors and partners, and the unwavering dedication of the ISE team that makes it all possible”.

ISE Community

At ISE 2026, Spark debuted as a show for creativity and technology. The new event format brought together professionals from Broadcast, Live Events, Marketing, Design, and Gaming into one immersive experience. Spark was designed as a hub to connect, share ideas, and explore the future of creativity. With partners and speakers, Spark 2026 showcased how technology transformed the audience experience, on screen, in physical spaces, and live.

In another first, ISE announced the ISE Foundation, an initiative spearheaded by ISE, and backed by co-owners AVIXA and CEDIA, with support from the City of Barcelona and the Government of Catalonia. The ISE Foundation aims to empower the AV and systems integration community, with an emphasis on teamwork, innovation, and lasting impact under the tagline “Powering On, Together”. The foundation launched at a press conference, with distinguished speakers David Labuskes, CEO of AVIXA; Daryl Friedman, Global President & CEO of CEDIA; Raquel Gil, Deputy Mayor of Barcelona; and Miquel Sàmper, Minister for Business and Labour, Government of Catalonia.

From Landmarks to AI Insights

The keynote sessions at ISE 2026 were delivered by Matt Clark and Sol Rashidi.

Clark took attendees “Behind the Façade: Building a Performance-led Mapping at Casa Batlló, from Concept to Implementation”, revealing the creative and technical mastery behind the mapping at one of Barcelona’s most iconic landmarks. Rashidi’s “The AI Reality Check: What It Takes to Scale and the Future of Leadership” offered a compelling look at how AI is reshaping industries and the leadership needed to navigate this transformation.

International Officials

The 2026 show featured a delegation of VIPs and government officials, including representatives from Latin American countries. Attendees included leaders in national and regional governments, as well as experts in digitalisation and AI policy.

“ISE jumpstarts every year as the largest gathering of AV professionals. But beyond the numbers is a more remarkable story: people connecting with one another from different corners of the world and exchanging ideas that will live on past the convention center walls”, explains David Labuskes, CTS, CAE, RCDD, Chief Executive Officer of AVIXA.

“The week was full of brilliant minds that push this exciting industry forward. AVIXA is proud to collaborate with the ISE and CEDIA teams to deliver such a vibrant forum”.

“ISE 2026 once again demonstrated the extraordinary power of collaboration across our global technology ecosystem”, adds Daryl Friedman, Global President and CEO of CEDIA.

“As co-owners of ISE, we are proud to see the event continue to expand its influence as a platform where innovation, education, and partnership converge. The energy on the show floor and throughout the conference programme reflects a thriving industry that is not only embracing emerging technologies, but shaping how they enhance the spaces where people live, work, and connect”.

ISE 2027

ISE will return for ISE 2027 from 2–5 February 2027 at Fira de Barcelona Gran Via.

NATO modernises its main studio with full camera system upgrade

Grass Valley has won a NATO-wide tender to provide the new camera system for NATO’s main broadcast studio at its Brussels headquarters. The project, delivered in partnership with VP Media Solutions, aims to provide a full upgrade of the facility’s incumbent camera system, as the company has claimed in a statement.

The deployment includes five Grass Valley LDX 135 studio camera channels, with the intention of supporting daily world-wide press conferences and broadcast operations from NATO’s headquarters. The system has already been fully installed and is now in live operation.

Grass Valley’s Belgian integration partner VP Media Solutions led the tender response and was responsible for system design, installation, commissioning, training coordination and ongoing support. VP Media Solutions also managed logistics and on-site delivery, including security screening prior to equipment entering NATO’s broadcast facilities.

The tender specified requirements for a 4K/HD portable studio camera system with SMPTE fiber connectivity and compatibility with existing baseband SDI workflows.

In cooperation with Bart Vandendorpe, Head Broadcast at NATO, Grass Valley’s LDX 135 cameras were selected as they met the technical requirements. The set-up, including Creative Grading, the company’s graphical camera shading solution, is designed to enhance operation and integration into the existing studio environment.

“This project required precise coordination and careful planning due to the operational and security constraints of the site”, affirms Emmanuel Charlet, CEO of VP Media Solutions. “By working closely with NATO and Grass Valley, we were able to deliver the full upgrade within the required timeframe, while ensuring a smooth handover to the broadcast team”.

The system is currently operating in a baseband SDI architecture, with a pathway to IP-based workflows through optional NativeIP licensing, supporting NATO’s future infrastructure plans.

“Establishing a new partnership with NATO following a formal NATO-wide tender is a significant milestone for Grass Valley, and we’re delighted to be supporting Bart Vandendorpe and the team with their production capabilities”, concludes Rene Hueber, Director of Global Channel Sales at Grass Valley. “Working alongside VP Media Solutions, this project demonstrates the flexibility and efficiency of the LDX platform in meeting strict technical specifications while delivering clear operational and image-quality benefits”. 

Germany will require streamers and broadcasters to invest in local production

Germany will introduce investment requirements for streaming platforms and TV broadcasters and nearly double government funding for the domestic film industry. Culture Minister Wolfram Weimer explains that the objective is to boost the country’s appeal as a film-making hub, as Reuters has reported.

“This is not a symbol, but a real investment stimulus: for jobs, value creation, and creative excellence”, Weimer affirms a week before the German capital welcomes industry players for the Berlin Film Festival (12–22 February 2026).

Streaming services such as Netflix and Amazon, as well as major broadcasters, will be obliged to reinvest at least 8% of the annual revenue they earn in Germany back into the local industry.

However, the measures state that if the streamers and broadcasters opt to invest 12% or more, they will be exempt from complex regulations obliging them to, for example, produce films in the German language.

With the measures, Germany will join at least a dozen other European countries, including France and Italy, that now obligate streamers to invest in domestic production.

In addition, the government has agreed to increase funding for film production to 250 million euros ($295.2 million) a year, nearly double the previous level.

Sky News Australia inaugurates new studios and announces rebranding

Sky News Australia has marked its 30-year anniversary with the official opening of its new studios in Surry Hills, after journalists, news bosses, and TV and production crew moved into the building at the end of last year. It has also confirmed it will rebrand as News24 this year, as it has claimed in a statement.

Officially opened by Prime Minister Anthony Albanese and NSW Premier Chris Minns, the facility brings together broadcast and publishing teams with the objective of creating one of Australia’s largest multimedia journalism hubs.

Featuring eight studios, six control rooms, expanded edit suites, voiceover booths and an expansive newsroom, the centre supports more than 1,500 hours of content each week across broadcast and digital platforms.

CEO Paul said the new studios marked the “largest transformation project” in the company’s three-decade history.

“This new facility sets us up to deliver on our next phase of growth, expanding our commitment to Australian journalism while taking our offering to larger audiences in more markets”, he affirms.

News Corp Australasia Executive Chairman Michael Miller adds: “This studio isn’t simply about broadcasting. It’s about connection, collaboration and the relentless pursuit of stories that inform, challenge and inspire”.

UK could fully transition to internet-delivered TV in the 2030s, Sky research suggests

Sky has published new independent research, carried out by Oliver & Ohlbaum Associates (O&O), which explores the future of TV in an increasingly internet-based world. ‘Stream On: The Future of UK TV’ suggests that the UK can move fully to internet-delivered TV in the 2030s, with only around 330,000 (2.2%) households left to help over the line – if Government sets a clear timetable and invests in targeted help for those most at risk of digital exclusion, as Sky has claimed in a statement.

Drawing on a nationally representative survey of 1,000 UK TV viewers, in-depth consumer workshops and expert interviews, the report concludes that:

› 93% of users – rising to 99% among those aged 70+ – find the features of internet-delivered TV useful.

› The latest generation of voice control has improved accessibility for older or disabled audiences. Future developments are expected to include interactions like natural conversations or real-time audio-description and live captions.

› Consumers are excited about innovations that enhance the TV experience and make TV different from other activities. Developments that simplify the experience were also welcome, such as bringing all subscriptions and services together in one TV interface.

› While many viewers have already made the switch – 94% of UK adults have internet at home and 92% use a video-on-demand (VOD) service – by the mid-2030s, most viewers will have fully adopted internet-delivered TV.

› A ‘nightlight’ DTT or satellite service would be costly and little used, with minimal audience demand.

› Consumers would prefer Government to focus on digital inclusion – skills and affordability – rather than maintaining a legacy broadcast system.

The publication comes as DCMS and Ofcom consider options for the future of TV distribution and the potential retirement of Digital Terrestrial Television (DTT) during the 2030s.

Viewers back a connected future – if everyone can come with them

The research finds that internet-delivered TV is already transforming the viewing experience, particularly for older and disabled audiences:

› 93% of connected TV users value features such as pause, rewind and watch from the beginning.

› Among over-70s this rises to 99%, and they are often more likely than younger viewers to rate features as “very useful”.

› Accessibility tools – including voice control and improved subtitling and audio description – are described by experts as “transformative” for some older and disabled viewers.

Audiences are equally clear about what they want next: better content discovery, strong safety features and a trusted, family-friendly environment. It’s important too, they highlight, that the increase in choice comes with simplification – audiences want simple interfaces with a variety of content shown in one place, akin to the traditional TV guide, to ensure they can easily discover their next watch.

Nick Herm, Group Chief Operating Officer at Sky, explains: “This research shows that modern TV and social inclusion can go hand in hand. A full move to internet-delivered TV in the 2030s is achievable – and it can help close the digital divide rather than deepen it. With most people already streaming, an investment from the Government in skills and affordable connectivity for the relatively small number of households who still need help to get online will have benefits far beyond TV, while saving hundreds of millions on maintaining legacy systems”.

Other findings

Examining the technological, cultural and economic factors that are reshaping how UK audiences are consuming content, findings from ‘Stream On: The Future of UK TV’ include:

Only 330,000 households left by 2034 – if action is taken on digital inclusion

Building on DCMS-commissioned forecasts, O&O model the impact of a clear Government decision to move towards an “IP TV switchover”:

› In 2023, there were 3.9 million households not using internet-delivered TV.

› Without action, DCMS modelling suggests 1.8 million could remain unconnected in 2035.

› But evidence from the 2012 digital switchover and international experience shows that a clear announcement around 2027, coupled with effective public communication, could reduce this to around 330,000 households by 2034.

These remaining households are more likely to be older, lower-income or disabled – groups who already experience digital exclusion across public services, work, healthcare and banking. The report argues that helping them to connect should be seen as part of the UK’s wider digital inclusion agenda, not a TV-specific fix.

Audiences do not want an interim solution

While projections suggest almost all audiences will have naturally transitioned to IPTV by 2035, the industry, Government and regulators all have an important responsibility in ensuring everyone can embrace IPTV.

The research indicates that audiences would prioritise support for vulnerable groups to transition to internet-enabled TV over investment in a ‘nightlight’ broadcast service, with 72% of workshop participants preferring support with connected TV skills and affordability for those who need it.

Older audiences are embracing the switch

Audiences across demographics are positive about the future of TV, with 93% of IPTV users valuing at least one of the features it provides, such as ‘watch from the beginning’ and ‘pause and rewind’.

While younger demographics have traditionally been seen to lead the shift to IPTV, older users find it even more useful. 99% of IPTV users aged over 70 value at least one of the features it provides.

Choice and personalisation at the core

The greatest drivers towards IPTV were found to be its ability to provide greater choice and a wider range of content and services. Audiences enjoy features that enhance the core TV viewing experience, with flexibility-enhancing features, such as the ability to watch when they want, scoring highly.

Going forward, audiences are most excited about new features that allow them to curate their experience, such as subscription bundling; a single, aggregated user interface; and greater personalisation, along with simplification to reduce the overwhelm of multiple apps.

AI will play a transformative role

Expert insight reveals that AI and visual rendering will increasingly change how audiences interact with their TVs. Advancements in technology will provide opportunities for hyper-personalised viewing – from tracking a favourite footballer’s positioning and in-match interactions to taking a virtual player’s seat in a favourite game show, content consumption will increasingly transition to co-creation.

SMPTE launches call for technical papers for 2026 Media Technology Summit

SMPTE has announced its call for technical papers for the SMPTE 2026 Media Technology Summit. The Summit will be held Nov. 16-19 at the Pasadena Convention Center, 300 E Green St, Pasadena, CA, as it has claimed in a statement.

The Summit will aim to present the latest innovative processes, tools, workflows, interoperability solutions, standards, and other initiatives designed with the intention of driving the industry forward. Original, unpublished manuscripts aligning with these goals will undergo multi-peer review when submitted as abstracts. Abstracts should fall within the range of 300 to 400 words and will be accepted until May 31.

Furthermore, SMPTE has included a journal track, where interested parties can submit full, publication-ready manuscripts that will automatically be considered for publication in the SMPTE Motion Imaging Journal, subject to a separate post-event peer-review process. Manuscripts must be 3,000 to 5,000 words and will be accepted until May 31.

“The SMPTE Media Technology Summit is where the brightest minds in our industry come together to shape the future of media and entertainment”, shares Juan Reyes, co-chair of the 2026 Media Technology Summit and President of Tech Align Group. “As co-chair of the event, I encourage experts from across the media and entertainment ecosystem to submit new ideas, breakthrough research, and practical insights that will educate and inspire a global audience of industry professionals”.

The conference program committee will notify authors of decisions by June 30. Authors of selected papers will be granted the opportunity to present at the world’s premier peer-reviewed forum dedicated to the exploration of media and entertainment technology.

Paper topics

Paper topics can include, but are not limited to:

Color Science

› Color Vision and Adaptation for Motion Imaging

› Color Capture and Reproduction

› Color Correction, Encoding and Image Processing

› Emerging Color Pipelines for Motion Imaging

Emerging & Disruptive Technologies

› Artificial Intelligence & Automation

› 5G and Next-Generation Connectivity

› Blockchain and Decentralized Media Systems

› Robotics, Control Systems & Machine Vision

Content Creation & Production Innovation

› Virtual, Augmented & Mixed Reality Production

› Immersive and Interactive Media Experiences

› Next-Generation Content Creation Workflows

Media Infrastructure & Distribution

› Cloud and Edge Computing Solutions

› IP-Based Media Systems & Networking Innovations

› Evolution of Streaming, OTT, and Hybrid Delivery Models

Compression, Processing & Optimization

› Advanced Codecs, Compression Techniques & Standards

› AI-Powered Signal Processing & Media Optimization

› HDR, Color Science, and Imaging Technologies

Hardware, Display & Capture Technologies

› Cutting-Edge Display, Projection & Viewing Systems

› Photonics, Sensors, and Optical Technologies

Industry, Security & Sustainability

› Sustainable Media Production & Green Workflows

› Cybersecurity, Privacy & Infrastructure Resilience

› Standards Development & Industry Evolution

› Cloud Workflows and Their Role in Sustainability

Other topics relevant to the Media Tech industry

Previously published, product-specific, commercial, sales, or promotional papers will not be considered for this conference. SMPTE strongly encourages the submission of student papers.

For more information and to submit paper proposals online, visit the website.

WorldDAB publishes definitive guide to launching DAB+ digital radio

WorldDAB has launched a fully updated second edition of its ebook, “Establishing DAB+ Digital Broadcast Radio”, to coincide with World Radio Day 2026 (13 February). The book is designed as an in-depth guide to the regulatory, technical and commercial aspects of establishing a successful DAB+ digital radio service, as it has claimed in a statement.

The second edition comes three years on from the original publication, and now includes additional expert guidance on:

› New Automatic Safety Alert (ASA) and hybrid radio features.

› Regulation and licensing models.

› Guidance on transmission site design for cost minimisation.

There are also updates on country examples, the digital adoption process and analogue switchover.

The book aims to provide guidance to new adopters, as well as offering advice for countries which have already started the process, and those who are nearing permanent service status.

The book covers the complete DAB+ establishment process from initial interest through to analogue switch-off. The topics covered include the major stages in establishment: from initial interest and technical demonstrations through to operations, including ongoing content development and sustained marketing campaigns; and finally, analogue switch-off.

WorldDAB Project Director, Bernie O’Neill, explains: “UNESCO has designated World Radio Day as an official day to thank broadcasters for the news they deliver, the voices they amplify and the stories they share. WorldDAB is committed to supporting radio’s vital work in these areas. Our updated ebook aims to help broadcasters adopt DAB+ and ensure the medium can thrive in today’s digital world”.

The ebook is available to read and download for free on WorldDAB’s website, along with many other factsheets and expert guides on all aspects of deploying DAB+.

The French-language version of the original ebook: “Mise en place du DAB+ – la radiodiffusion numérique” also remains available. 

NBC expands NBA live coverage with Sportradar’s data-driven tools

Sportradar has announced a multi-year agreement with NBC Sports Regional Sports Networks (RSNs). The objective of this collaboration is to enhance the NBA viewing experience through real-time broadcast solutions, as it has claimed in a statement.

NBC Sports Regional Networks will leverage Sportradar’s NBA Advanced Data and GameFrame across live NBA game broadcasts during the 2025-26 and 2026-27 NBA seasons. The agreement supports hundreds of NBA telecasts across NBC Sports’ regional networks, aiming to deliver more dynamic coverage to fans in multiple NBA markets nationwide.

At the center of the partnership is GameFrame, which uses AI to transform live NBA player-tracking data into on-air graphics, animated replays, shot charts, and customized digital assets. GameFrame supports in-game analysis and storytelling by helping on-air talent explain plays, positioning, and outcomes as the action unfolds.

“This agreement builds upon our long-standing relationship with NBC and reflects how we continue to expand the ways we support their live sports coverage”, explains Brian Josephs, VP, The Americas, Sportradar.

“As NBC continues to evolve how it serves fans across platforms, Sportradar is helping deliver the data-driven tools that bring greater clarity and context to live games, creating more engaging NBA viewing experiences for fans”.

“Enhancing the viewing experience is essential to our NBA coverage across regional sports networks”, adds Jon Slobotkin, SVP, Content & Live Programming, NBC Sports Regional Networks. “Sportradar’s GameFrame offers a new way to add data-driven insights directly into live coverage, bringing visually stunning stories that resonate with today’s fans”. 

Gravity Media provides Australian Open 2026 with broadcast and IP production infrastructure

Gravity Media provided the Australian Open 2026 with a bespoke international broadcast centre and IP production environment. Together, these formed the technical backbone for Tennis Australia’s creative, editorial and operational delivery of the tournament to audiences, as the company has claimed in a statement.

A team of 75 specialists from Australia, the UK, the Netherlands, Belgium and Germany designed, deployed, and operated the Melbourne Park broadcast centre. The team managed over 100 tonnes of fly-away infrastructure and more than 150 broadcast cameras across the precinct.

Across Melbourne Park, 172 camera sources were deployed, including:

› Spidercam above centre court.

› Sony HDC 3500 / 5500 cameras, with super slow-motion and ultra-motion camera channels.

› Robotic camera systems.

› 22 roving RF cameras and two remote RF robotic cameras delivering mobile and panoramic views across the venue.

› RF and communications network supporting operations throughout the site, with 48 Bolero antenna positions enabling 150 belt packs for presenters, production staff, and rightsholders.

The production galleries and audio control rooms were housed at Tennis Australia’s headquarters and were connected via 500 metres of fibre and dual 700-gig data links to the central equipment room at the broadcast compound.

The facilities comprised 10 production galleries, nine audio control rooms, and 11 ViBox systems covering the outside courts. In total, the environment supported over 130 operating positions and more than 150 multiviewers displaying approximately 1,700 picture-in-picture sources.

Signal routing was managed through an IP and baseband system providing a 4,000 x 4,000 routing environment, while over 2,500 audio signals were handled through a hybrid Audio Live and Calrec solution.

Across the 15-day tournament, Gravity Media delivered more than 200 hours of coverage to international and domestic rights holders, including Nine Network in Australia, ESPN, Eurosport, CCTV, and WOWOW.

As part of the 2026 deployment, Gravity Media also refreshed the technical furniture across the production facilities, replacing workstations that had been in service for more than a decade.

Additionally, Gravity Media delivered broadcast and technical facilities for the United Cup in Perth and Sydney, as well as the Brisbane International and Adelaide International.

For the United Cup, it provided outside broadcast production trucks and facilities in each city, accessing 28 cameras, multiple EVS replay and edit suites and Livetools equipment for use by teams and players. Additional bespoke fly-away production systems were implemented in Sydney so that Tennis Australia could integrate coverage from multiple cities into a “world feed” for delivery across international broadcast and subscription platforms.

The Adelaide International and Brisbane International were supported with outside broadcast trucks, using 22 cameras across each site to deliver bespoke coverage for Tennis Australia and its domestic and international broadcast partners.

Eamonn Curtin, Chief Commercial Officer of Gravity Media, shares: “The Australian Open continues to be one of the most complex and rewarding productions in the world of live sport. Delivering this coverage is a true demonstration of the collaboration and expertise of our teams across Australia and Europe. 2026 builds on our legacy of collaboration and we’re proud to support Tennis Australia in bringing this iconic tournament to audiences everywhere”.

Warner Bros Discovery reports best-ever streaming Olympic Winter Games

The return of the Olympic Winter Games to Europe has driven significant growth in viewership across Warner Bros. Discovery’s (WBD) European services. Streaming coverage featured more than 1,000 hours on HBO Max and discovery+, along with new viewing features such as Olympics Multiview, according to a WBD statement.

During the seventeen days of competition (6-22 February 2026), as well as two days prior to the Opening Ceremony featuring select sports events (4-5 February 2026), the company’s highlights include:

Streaming (HBO Max and discovery+)

› The best streaming Olympic Winter Games ever on Warner Bros. Discovery services (vs Beijing 2022 and PyeongChang 2018).

– Triple-digit percentage growth in total hours viewed (+103% vs Beijing 2022) – including triple-digit growth in France, Germany, Italy (on HBO Max) and the UK (discovery+).

– Streaming across HBO Max and discovery+ saw more than three times as many viewers (+234%) vs Beijing 2022.

– As reported, total subscribers streaming Milano-Cortina 2026 content exceeded those who viewed the entire Beijing 2022 Games after only three days (6-8 February 2026; the first three days of full competition).

– Social video views on Eurosport (Europe) and TNT Sports (UK & Ireland) accounts exceeded 4 billion.

› Innovative streaming features, available within the Olympics viewing experience on HBO Max and discovery+, helped maximise engagement and boost viewership, including:

– Presenting every event live: all 246 live sessions (peaking at 11 concurrent events), including 116 medal contests, with 2,900 athletes competing across 19 days.

– Approximately a third (32%) of users utilised Olympics Multiview, which for the first time allowed streaming viewers to watch up to four Olympics events concurrently on a single screen with their choice of audio.

– Users could navigate the concurrent live action through Gold Medal Alerts and Timeline Markers, build Personalised Watch Lists (utilised by the majority of users), and select from up to 21 commentary languages for their live coverage.

Commenting on Milano-Cortina 2026, Andrew Georgiou, President & Managing Director, WBD Sports Europe, shares: “The success we witnessed in the opening days has translated into an outstanding Olympic Winter Games for Warner Bros. Discovery with substantial streaming viewership and engagement growth in addition to highly robust linear audiences”.

“Watching the Olympic Games on HBO Max and discovery+ clearly resonated with audiences in the UK and Europe with three times as many people choosing to stream the Games with us compared to Beijing 2022. Viewers being able to curate their own Olympics, selecting from all 116 live events and using innovative features such as our one-screen Multiview, drove significant increases in time spent by fans watching on our streaming services.

This Olympics has set an incredibly strong foundation as we look to Los Angeles 2028 and the Olympic Winter Games returning to Europe again for French Alps 2030”.

Across linear (Eurosport in Europe; TNT Sports in the UK & Ireland)

› Linear TV audiences are measured up to 18/02/2026 across the following territories: Denmark, France, Finland, Germany, Netherlands, Norway, Poland, Romania, Sweden, UK

› Linear saw a +3% increase in viewers vs Beijing 2022, reversing declines in total TV consumption over this period and driving the bulk of overall live viewing.

› More than 50% linear growth in hours viewed overall vs Beijing 2022 (+51%).

– Eurosport channels vs Beijing 2022:

• France +47% in hours viewed

• Germany +50% hours viewed

• Poland +32% hours viewed

– TNT Sports channels (UK & Ireland only) vs Beijing 2022:

• UK & Ireland +60% hours viewed

WBD’s streaming platforms HBO Max (Europe) and discovery+ (UK) were the only destinations where every event was broadcast live, whilst its linear channels Eurosport (Europe) and TNT Sports (UK and Ireland) delivered Games-time coverage. 

AIMS announces certification of first 48 IPMX products

The Alliance for IP Media Solutions (AIMS), together with the Video Services Forum (VSF), the Advanced Media Workflow Association (AMWA), and the European Broadcasting Union (EBU), has announced that 48 products were officially certified to the Internet Protocol Media Experience (IPMX) standard. This validation took place at the recent IPMX Product Testing and Certification Event in Geneva, Switzerland, according to an AIMS statement.

The IPMX-compliant solutions were revealed at a cocktail reception that took place at ISE 2026 in Barcelona, in booth 5K880.

The IPMX Product Testing and Certification Event represented the first opportunity for manufacturers to formally certify products against the IPMX set of open specifications for professional media over IP. Following testing, solutions from Bridge Technologies, Matrox, Adeas / Nextera, Panasonic, Cobalt, intoPIX, plexusAV, Megapixel, Novastar, and Evertz passed certification. These products are now certified and will carry the IPMX branding, signaling verified compliance with published transport, control, and interoperability requirements.

“We couldn’t be more excited to be unveiling the first IPMX-compliant products at ISE 2026”, shares Sam Recine, IPMX Pro AV Working Group Chair at AIMS. “The certification of these solutions represents a major milestone in IPMX’s transition from specification to a certifiable and deployable technology, validating years of collaborative technical work and confirming that IPMX is ready to move from development into active deployment”.

“AIMS extends its sincere thanks to Packetstorm and Meinberg for their critical technical contributions, test infrastructure, and timing and network expertise that were essential to the success of IPMX interoperability and certification efforts”, adds Recine. “We also recognize the EBU for hosting the event and for providing independent administration of the test process, ensuring a rigorous and neutral certification environment”. 

Brutal Güet puts its new OB van into live operation at the Women’s Winter Classic

The Women’s Winter Classics in Gstaad marked the first live outing for OB truck S12, the new mobile production unit operated by Swiss broadcast service provider brutal güet. Produced for RED+, the open-air event — a flagship fixture of Switzerland’s top-tier women’s ice hockey league — also served as the inaugural live deployment of the vehicle’s fully IP-based Lawo audio infrastructure. At the core of the audio control room is a Lawo mc²56 MkIII console with 48 faders, paired with a redundant A__UHD Core and an AoIP environment based on SMPTE ST 2110, AES67 and RAVENNA. The production demonstrated how flexible system configurations, efficient workflows and high audio quality can be reliably implemented in a demanding live environment.

The Winter Classics in Gstaad rank among the premier events in Swiss women’s hockey. Staged in the center of the village, and framed by an alpine backdrop, the matchup saw reigning champions SC Bern face league leaders EV Zug. The production was realized in a demanding setup given the lack of conventional stadium infrastructure on site. For brutal güet, the event provided an ideal opportunity to put the production concept of the new OB van into operation for the first time under live broadcast conditions.

For Christian Maier, Senior Broadcast Audio Engineer at brutal güet, the first production with the new system marked a technological step forward: “Overall, it initially felt quite unspectacular because I was already familiar with the system. At the same time, working with AES and RAVENNA streams introduces new capabilities and advantages that still require a certain degree of adaptation.”

Multilingual operation is a central requirement in Swiss broadcast productions. Although only one language feed was actively used in Gstaad, the audio setup was designed for trilingual operation from the outset. “Productions in Switzerland are inherently more complex. We always have to be able to deliver three language versions,” explains Maier. “That’s why I structured the setup so that all three languages are prepared — not only for this production, but for all future ones as well.”

The latest mc² generation provides advanced DSP capabilities that streamline daily workflows. “Being able to store and recall EQs, compressors and presets, as well as using dynamic EQ processing, makes multilingual productions much easier. I don’t have to build the same signal chains multiple times.” DSP resources can also be allocated selectively. “Not every output requires full processing. If a signal path is mainly used for information delivery, I can reduce processing there and allocate resources where they deliver the greatest sonic benefit.”

Speed is another critical factor in live broadcast operation — both in day-to-day use and when troubleshooting. “I can customize the console to suit my workflow in every detail, but I don’t have to. I can also pragmatically route a signal from A to B quickly and easily — and it still works. At the same time, I can create an interface that lets me work extremely efficiently”, says Maier. The system’s visual clarity proves particularly valuable when problems arise: “If a camera operator reports they can’t hear anything, I can immediately check a second or third meter to see whether the signal is leaving my system. That tells me right away where to look — or where not to look.”

The open-air setting introduced additional acoustic challenges for the audio design. With no plexiglass walls behind the goals, the typical ice hockey reflections were absent, and the number of spectators was unpredictable. “It makes a significant difference whether 500 or 5,000 people are standing around the rink. In some cases, spectators may even be just a meter behind my ambience microphones,” notes Maier. To address this, the team implemented a customized microphone setup using directional shotgun microphones, ensuring precise capture of the on-ice action while minimizing unwanted noise.

Maier’s mixing approach focuses on balancing atmosphere and speech intelligibility. “I like to use a strong ambience mix to create an immersive experience, but the commentator always needs a clearly defined space.” This was implemented using dynamic EQ processing, sidechain filtering and automix functions. “The ambience is dynamically shaped in the relevant frequency ranges to leave room for speech without reducing overall level. This is an area where the latest Lawo console generation offers significantly more creative and technical flexibility.”

Precise timing and latency management is another critical aspect of modern broadcast production. Wireless cameras, remote feeds and distributed IP signal paths require flexible delay strategies. “Different signal paths operate in different time domains, and these have to be aligned again at the output stage. With Lawo, delays can be applied exactly where they make sense — at the input stage, within groups or on direct outputs. This level of clarity and flexibility remains a key advantage, especially in hybrid IP and remote production environments.”

For brutal güet, the S12 premiere represents both a technical milestone and a strategic foundation. The vehicle was built by Broadcast Solutions based on the Streamline concept and designed for a wide range of production scenarios, including sports, cultural events, festivals and live entertainment. “From the outset, our goal was not to focus exclusively on traditional sports productions,” says Maier. “If we produce an opera, we need this surface structure and channel capacity as well. The concept is to use the OB van as a fully equipped mobile control room and minimize on-site setups.”

The successful Winter Classics production validated this approach, showing how Lawo AoIP technology, flexible DSP architectures, and carefully designed workflows enable modern live broadcasts to be executed efficiently with consistently high quality, even under challenging conditions. For brutal güet, the first production with the S12 OB van marks the beginning of a new generation of mobile broadcast operations — scalable, versatile, and fully IP-based. 

Post-ISE special:

What strategic relevance does the show hold today for the broadcast market?

We gathered insights from a selection of broadcast exhibitors, representing different profiles and company sizes, to analyse the role ISE currently plays in their strategies and the main themes that shaped the conversation at this year’s edition.

The latest edition of ISE (February 3–6, 2026, Barcelona) once again broke records in terms of visitors, exhibitors and exhibition space. However, beyond the figures, what was truly significant for the broadcast ecosystem was the confirmation that the sector is no longer understood as an isolated domain, but rather as part of a broader, hybrid and global audiovisual landscape. This was, at least, one of the key conclusions we drew during the event.

TM BROADCAST was present, as every year, walking the stands, corridors and conference rooms and speaking with some of the market’s leading players. One of the underlying questions we reflected on was: what strategic relevance does the show hold today for the broadcast market?

The answer was repeated with nuances, but with a clear consensus: ISE has consolidated its position as a transversal meeting point where integrators, AV installers, manufacturers, broadcast operators, corporate studios and institutional teams from around the world converge.

“We got the chance to meet with systems integrators and AV installers, as well as broadcast and production professionals from all over the globe. Through these interactions, we’re able to better understand the unique challenges they face and innovate with them in mind”, explained Andy Bellamy, Technical Director at AJA Video Systems.

For Grass Valley, the value lies precisely in this intersection of worlds and in the opportunity to engage with players who are redefining the use of video beyond traditional broadcast: “It brings together corporate studios and public sector teams, along with the integrators who design and support these systems over the long term”, noted Jonathan Lyth, Product Director, Enterprise Media, Grass Valley.

This direct contact also makes it possible to detect how rapidly market expectations are evolving.

“This was our second ISE as an exhibitor following our debut last year, and one of the most notable differences has been how quickly expectations are moving. Teams are treating video as part of their day-to-day operations, with real demands around consistency and secure, repeatable workflows”, added Jonathan Lyth.

From Panasonic, Álvaro Ortiz, Product Marketing Manager, points to a structural transformation of the trade show landscape itself: “ISE has successfully adapted to other traditional sectors that even had their own tradeshows… ISE has managed to incorporate integrators, rental houses, dealers… also opened ‘channels’ that are attractive to audience: if you are a corporative video filmmaker, maybe ISE is a more proper fair than a purely broadcast-oriented fair or a cinema event”.

This transversal reading of the market is echoed by other manufacturers. Sony underlines that the show reflects the expansion of broadcast into a more interconnected AV ecosystem: “Our customers today operate across cloud, IP, corporate, education, and live event environments, and ISE brings all these worlds together in one place”, stated Patricia Andrés, PR Sony Pro Spain.

From the professional audio field, the perception is equally clear: the diversity of profiles coexisting at ISE reflects the real convergence of applications and environments. “What unites them is the need for reliable, high-quality audio, whether this is theatres, live audio touring, places of worship, field recording, film and streaming, corporate AV or education. At ISE, we can welcome them all: broadcasters, sound engineers, rental owners, integrators, theatre A1s, you name it. This is what makes this show so special for us”, said Tobias von Allwoerden, Manager Broadcast and Film, Sennheiser.

In the field of critical infrastructure, the conversation shifts towards systemic integration and control architecture.

Jochen Bauer, EVP Sales EMEA at G&D, highlights the strategic dimension of the event: “ISE is valuable because it brings together the full ecosystem that broadcast operations depend on - AV, IP infrastructure, control, security, and systems integration - in one place”.

For global integrators such as Qvest, the opportunity lies in applying broadcast transformation expertise to AV use cases: “The show is particularly useful because it provides a platform to apply our long-standing broadcast transformation expertise to AV-centric use cases and to design practical architectures, integration models, and service offerings that meet these market requirements”, explained Christian Felder, PR Manager, Qvest.

Along the same lines, Manifold highlights the opportunity to expand its ecosystem into adjacent applications: “ISE offers an opportunity to deepen our engagement with the ProAV market, meet existing technology partners, connect with and train new resellers, and support our ecosystem in expanding into adjacent application areas such as enterprise, events, studios, theatres and live entertainment”, noted Erling Hedkvist, Broadcast Technology Sales & Business Development at Manifold.

Marshall Electronics, for its part, emphasises the international and relational dimension of the event: “ISE provides a valuable opportunity to connect directly with systems integrators, distributors, and end users from across the global broadcast and AV markets. The scale and international reach of ISE make it a key event for strengthening partnerships and showcasing innovation”, explained Bernie Keach, Broadcast AV Consultant at Marshall Electronics.

Convergence as the starting point

If the dominant message had to be summarised in a single word, it would be convergence. Sectors traditionally unrelated to broadcast, such as corporate environments, live events or institutional settings, are now demanding production standards comparable to those of television. Multi-camera streaming, cinematic look, remote workflows… What was once exclusive to broadcast has become accessible to other verticals.

Christian Felder (Qvest) identifies the convergence of broadcast, AV and IT as the dominant theme of this year’s edition: “The main topic was the continued convergence of broadcast, AV and IT. Broadcast and AV domains are increasingly relying on the same underlying technologies and architectures. This development is driven by the growth of live streaming, rising expectations for video and audio quality, and the widespread adoption of IP-based workflows. As a result, the boundaries between traditional broadcast production and Pro AV applications are steadily dissolving, fundamentally changing how content is created, managed and delivered”.

From ROE Visual, its Director of Virtual Production, XR and Broadcast, EMEA, Olaf Sperwer, summarised it as follows: “Broadcast is no longer confined to traditional TV studios. It is converging with corporate communications, hybrid events, and in-house content production facilities. ISE provides a platform where this convergence becomes visible”.

Convergence is not only technological; it is also cultural and organisational. Jonathan Lyth (Grass Valley) emphasised that this is no longer an emerging trend, but the new starting point for the market: “For me, the biggest theme at ISE 2026 was that convergence is the default now. Many teams we spoke with are producing content far more regularly than they were a few years ago, often with small crews, and they want setups they can rely on without constant reinvention. Longevity was another key takeaway from conversations. In corporate and government environments, installations are expected to last, so decisions are shaped by governance expectations and how systems integrate into the wider infrastructure. That’s why software-defined approaches attracted so much attention, because they give organisations more control over how workflows evolve as requirements change”.

WHAT MAKES ISE ATTRACTIVE

“ISE has successfully adapted to other traditional sectors that even had their own tradeshows”, Álvaro Ortiz (Panasonic)

“ISE is valuable because it brings together the full ecosystem that broadcast operations depend on - AV, IP infrastructure, control, security, and systems integration - in one place”, Jochen Bauer (G&D)

“ISE offers an opportunity to deepen our engagement with the ProAV market, meet existing technology partners, connect with and train new resellers, and support our ecosystem in expanding into adjacent application areas”, Erling Hedkvist (Manifold)

“This was our second ISE as an exhibitor following our debut last year, and one of the most notable differences has been how quickly expectations are moving”, Jonathan Lyth (Grass Valley)

The sustained growth of streaming acts as an accelerator of this process. Quality expectations are no longer limited to linear broadcast, but extend to any environment where video forms part of day-to-day operations: “Demand for streaming gear across markets is surging, making streaming and remote production tools a continued focus for us. But it’s no longer just about high-quality, low latency streaming productions that can operate across geographies; audiences expect streams to look cinematic”, stated Andy Bellamy (AJA).

This aesthetic ambition also extends to the live events sector, where the boundary between television production and stage production is becoming increasingly blurred: “For venues and live event teams, the ability to introduce a more cinematic look through a Super 35 global shutter camera, while still working within familiar live production workflows, was another point of discussion”, explained Jonathan Lyth (Grass Valley).

IT logic as a common language

The common ground where broadcast, ProAV and IT converge is infrastructure. IP, software-defined architectures, cloud and virtualisation are no longer distinct layers, but rather the shared foundation upon which new production models are being built.

In this context, the conversation at ISE no longer revolves around whether to adopt IP, but how to do so efficiently and sustainably. Andy Bellamy (AJA) summarised it as follows: “As broadcast and proAV continue to converge, we’re taking note of several technologies that have captured the attention of both industries. One thing was clear at ISE: more facilities, productions, and installs are actively investing in IP technology, but need intuitive tools that let them bridge between baseband and all the different IP video standards, protocols, and formats”.

The issue, therefore, is not only adoption, but transition. The coexistence of baseband and IP remains a reality in many environments, making it necessary to develop bridging tools that enable this coexistence without increasing operational complexity.

“I see broader adoption of IP and SMPTE ST 2110 in particular, driving new scalability and efficiency across broadcast and AV environments while ensuring premium quality video experiences for audiences. We’re going to see new tools emerge that solve IP workflow gaps, and more education will become available industry wide. With more implementations, the knowledge base and talent pool will grow, and facilities and workflows will mature. In the near term, many of these pipelines will leverage bridge technology to make moving between baseband and IP more intuitive”, added Andy Bellamy.

At the same time, security is consolidating its position as a decisive factor in the design of architectures and workflows: “We also spent time discussing the security side of the architecture, because many enterprise and government environments want the flexibility of software workflows while still keeping media processing and content inside their own network”, noted Jonathan Lyth (Grass Valley).

More intuitive solutions

The third axis identified at ISE relates to usability. The market is no longer composed solely of highly specialised broadcast engineers. New profiles — corporate, institutional, small production teams — are working with increasingly sophisticated technologies and demand tools that reduce friction without sacrificing performance.

In this context, technical complexity tends to move “behind the screen”. Álvaro Ortiz (Panasonic) illustrated this with two specific developments: “I would dare to say that the main one [distinctive technology] is Advanced Auto Framing: the complexity that it involves behind the screen, but transformed in a very friendly, easy-to-use solution when you are in front of it, makes the multicamera production outstanding even if the production staff is shortly manned or with not specialist operators”.

He added: “The second one could be the autofocus system in a 2/3” lens: it has been long time since we have been using autofocus with other kind of lenses, especially with those ones coming from still photography or video production with large format sensor, but, paradoxically, in a standard like the 2/3” lenses, the mostly used in broadcast, we have been surviving assuming ‘everything was focused’, independently of the nature of the glass and the sensor, and it wasn’t necessarily true”.

KEY THEMES AND INDUSTRY TRENDS

“The main topic was the continued convergence of broadcast, AV and IT”, Christian Felder (Qvest)

“The dominant theme at ISE 2026 is undoubtedly the acceleration of AI driven workflows across the entire professional AV ecosystem”, Patricia Andrés (Sony)

“Broadcast is no longer confined to traditional TV studios. It is converging with corporate communications, hybrid events, and in-house content production facilities”, Olaf Sperwer (ROE Visual)

“Demand for streaming gear across markets is surging, making streaming and remote production tools a continued focus for us”, Andy Bellamy (AJA)

“We also see the importance of brand-agnostic tools grow, and generally a new level of user-friendliness being achieved in the industry”, Tobias von Allwoerden (Sennheiser)

“Rather than focusing on a single ‘best’ camera, the key conversation this year is about which camera performs the best depending on the application”, Bernie Keach (Marshall Electronics)

In audio, the evolution points in the same direction: “From a pro audio perspective, the No. 1 topic is certainly wideband wireless audio. The continued interest this topic has met with at ISE reflects its importance for professional audio applications and workflows. We also see the importance of brand-agnostic tools grow, and generally a new level of user-friendliness being achieved in the industry”, explained Tobias von Allwoerden (Sennheiser).

At the same time, artificial intelligence is emerging as a structural layer that enables this operational simplification. Sony underlined this point: “The dominant theme at ISE 2026 is undoubtedly the acceleration of AI driven workflows across the entire professional AV ecosystem. AI is no longer an add on or a future concept; it is becoming a foundational layer that enhances image quality, automates operations, and enables more efficient, scalable production environments”, stated Patricia Andrés.

ROE, however, introduced a relevant nuance: “AI will certainly dominate discussions, particularly in content generation, virtual production, and workflow automation. However, beyond AI, the fundamentals of studio production remain critical. Broadcast environments still require display technologies that meet strict standards in color accuracy, contrast performance, surface reflectivity, and mechanical stability”, added Olaf Sperwer.

This pursuit of greater intuitiveness also extends to the heart of critical infrastructures: the control room. From G&D, Jochen Bauer describes a modernisation centred on the operator, where interface coherence and system predictability are as important as technical power:

“In our field, the most significant trend is the push toward operator-centric control room modernization: improving usability and speed without compromising security or reliability. We see strong attention on secure, software-defined control that still behaves predictably under pressure - especially when multiple systems and stakeholders converge in one room. Another recurring theme is simplifying operation across heterogeneous sources (IT, security, broadcast, OT) while keeping the operator interface consistent, so training effort and human error are reduced. Finally, there’s growing interest in integrated, desk-level interfaces that reduce peripheral clutter and make complex actions repeatable - because in critical environments, workflow clarity is often the difference between quick response and confusion”.

Finally, this adaptation to new profiles and environments also translates into a more contextual approach to equipment. Marshall Electronics summarised it from the camera perspective: “Rather than focusing on a single ‘best’ camera, the key conversation this year is about which camera performs the best depending on the application. No two environments are exactly alike, and customers are prioritizing factors such as sensor size, lens and zoom options, form factor and integration flexibility”, stated Bernie Keach. 

C2PA: When credibility is more than a value — is it possible to certify truthfulness?

In the face of the flood of AV content created with AI — and not always with good intentions — is there any formula to distinguish what is authentic from what is not?

Just as artificial intelligence tools have been massively deployed, and their results in the audiovisual field have improved at a dizzying pace, the creation of false news, or “fake news”, for all kinds of purposes and intentions has also spiraled out of control.

This very concept, which was never a problem in cinema because we always knew that our blue-caped superhero did not really fly (sorry to reveal the secret if anyone did not know…), now represents an extremely complex challenge in the world of journalism and news.

Although the trained eye is still capable of distinguishing at first glance between real content and “artificially created” content, a significant portion of the audience can no longer tell the difference. And at the current pace of development, we predict that in a very short time — a few years at best — it will be practically impossible to distinguish, even for the most experienced eyes.

The first issue we face, speaking rigorously and as mere spectators, is the paradigm shift this change of scenario represents. We come from a world in which an image was a document in itself, worth more than a thousand words, only to plunge abruptly into the confusion of being unable to trust, a priori, any of the content before our eyes.

And this brings to the surface what we believe to be the next major challenge for all broadcasters: how to maintain editorial prestige and preserve audience trust. Let us not lose sight of the fact that a single lie can call into question the credibility earned over many years of serious and committed work.

For this reason, we believe that the next major challenge no longer lies in mere technological evolution. It is no longer about HD, UHD, HFR, BT2020, IP… but about ensuring that the information we deliver to our audience is sufficiently trustworthy to maintain our position as credible informers. This challenge is far from easy, nor is it something that can be developed as an individual initiative. Fortunately, and with this objective in mind, the C2PA consortium was created.

The acronym stands for Coalition for Content Provenance and Authenticity. Its purpose is to develop an open technical standard that makes it possible to create and manage content credentials that allow the origin to be identified and all modifications undergone by digital audiovisual content to be tracked.

The overlay that we will likely begin to see soon — and increasingly frequently — in validated content will be “content credentials” or “cr” in its abbreviated version.

It is a technology very similar to what has been available in still photography for some time, but in this case applied to content of much greater length and complexity, such as video and audio. Making it open facilitates the participation of the greatest possible number of stakeholders across all communication and distribution sectors, although it also means that certain developments and implementations require more time for agreements and consensus.

The next major challenge no longer lies in mere technological evolution, but in ensuring that the information we deliver to our audience is sufficiently trustworthy

Although it is not yet fully developed and finalized, we are already in a sufficiently advanced phase of development and demonstration to have it accessible and to assess the change it may represent, both in creative processes and especially in those related to news production. The path has been laid out.

Conceptually it seems simple: it consists of accrediting the origin by stamping a digital signature or certificate in the non-editable metadata of the content during its creation, and progressively adding signatures or certificates that identify what changes have been made to that content until it reaches us.
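The chaining idea described above, in which each stage binds a hash of the content and a record of its action to the previous signature, can be sketched in a few lines of Python. This is a conceptual illustration only, not the C2PA specification: the real standard relies on X.509 certificates and COSE signatures rather than the shared-key HMAC used here, and all the field names and action labels below are hypothetical.

```python
import hashlib
import hmac
import json

def sign_step(key: bytes, content: bytes, action: str, prev_sig: str = "") -> dict:
    """Bind a hash of the content, the action performed and the previous
    signature into one signed record, forming a tamper-evident chain."""
    payload = json.dumps(
        {"hash": hashlib.sha256(content).hexdigest(),
         "action": action,
         "prev": prev_sig},
        sort_keys=True,
    )
    sig = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_step(key: bytes, record: dict) -> bool:
    """Recompute the signature over the stored payload and compare."""
    expected = hmac.new(key, record["payload"].encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])

# The camera stamps the origin of the raw essence...
camera_key = b"camera-owner-key"  # stands in for a CA-issued certificate
raw = b"raw video essence"
origin = sign_step(camera_key, raw, "created")

# ...and the editor chains its own record onto the camera's signature.
editor_key = b"editor-key"
edited = b"raw video essence, cropped"
edit = sign_step(editor_key, edited, "cropped", prev_sig=origin["sig"])

print(verify_step(camera_key, origin))  # True
print(verify_step(editor_key, edit))    # True
```

Verifying the chain end to end would additionally mean checking that each record's `prev` field matches the signature of the preceding stage, and that the signing keys trace back to a trusted Certification Authority: precisely the roles the consortium's standard assigns to certificates.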

And this, which sounds very simple when stated this way, is extremely complex to put into practice. It must cover the entire creative process, from content capture to viewing by the final audience, passing along the way through all the editing processes the material may have undergone in intermediate stages, and determining the type and amount of information to be included.

At this point we see a very specific difficulty beyond mere shooting: the legitimacy of computer-generated content inserted into our program — illustrations or graphics, for example. In other words, we must be able to identify and show the viewer the provenance and manipulations practically at frame level.

And all this is intended to be achieved through consensus among all stakeholders, who have their own product ranges, services, commercial objectives and technical and business development projects. In this regard, it is worth mentioning and appreciating that virtually all major players are already involved in this initiative.

The purpose of the C2PA consortium is to develop an open technical standard that makes it possible to create and manage content credentials that allow the origin to be identified and all modifications undergone by digital audiovisual content to be tracked

In this sense, we invite you to look closely at the consortium’s website (links at the end) to see what types of companies are highlighted as the 11 steering members, participating with greater involvement and in some way leading the more than 500 already affiliated at the time of writing. We will only offer an interesting clue: there are more service companies than product companies.

Undoubtedly, the project is far more ambitious and complex than it may appear at first glance. Let us therefore move forward by examining the achievements reached, the current situation and the foreseeable future.

To simplify the explanation, we will turn to something we are all already very familiar with: our personal certificates in our Internet browsers. These are certificates issued by a Certification Authority that guarantees their origin and security and that we currently use, for example, to identify ourselves before official bodies or to sign documents.

What is important is that these certificates are not sent when we sign a document with them; rather, the legitimacy of that signature is recorded and can later be verified through various means, such as certification and signature verification platforms, making it a convenient and secure procedure.

Something similar is what the C2PA coalition seeks to achieve: that there may be all kinds of devices — cameras, editors, distribution channels, players, etc. — capable of supporting and progressing these certificates of provenance and content processing, maintaining that information embedded and immovable within our audiovisual files. The ultimate purpose is that this information can later be verified at any stage of the production chain and even at final viewing.

Just as these certificates are secure in our browsers, they must be equally secure in our equipment, issued by a Certification Authority that makes them reliable and operational across all compatible devices, according to a standard that is already defined and whose functionality has been verified.

But that is not all: in addition to being issued by accredited entities, they must be reflected in every piece of content and every file recorded with that camera and edited with that editor. We will return to this point shortly when discussing the camera and the “real world.”

Returning to our production processes, let us see what happens once the file leaves the camera with the certificate discreetly stamped as a collection of metadata that must coexist with our content and remain unaltered in subsequent processes. For example, once those materials enter our editing process, our certificate must be maintained, and the editing software must recognize and progress it, adding information about the modifications that content has undergone along with its corresponding editor certificate.

The same must occur at every single stage in which the material is modified in any way, until it reaches the final viewer. That is, the broadcaster, as the final party responsible for the content, may — or must — add its own certification.
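At each of these stages, the weakest link is the binding between the credential metadata and the actual essence: if the pixels change but the stamped hash does not, validation must fail. A minimal sketch of that check follows; the manifest structure and field names are ours for illustration, not the standard's.

```python
import hashlib

def hash_matches(content: bytes, manifest: dict) -> bool:
    """The seal only holds if the essence hash recomputed at playback
    equals the value stamped in the manifest metadata."""
    return hashlib.sha256(content).hexdigest() == manifest["content_hash"]

programme = b"final broadcast programme"
manifest = {
    "content_hash": hashlib.sha256(programme).hexdigest(),
    # illustrative action labels, not official C2PA identifiers
    "actions": ["created", "cropped", "broadcaster-certified"],
}

print(hash_matches(programme, manifest))         # True: untouched content
print(hash_matches(programme + b"!", manifest))  # False: any alteration breaks the seal
```

This is why the metadata must remain embedded and immovable: a player that cannot recompute and match the hash has no basis for displaying the “cr” indicator.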

Of course, we are now all thinking about what kind of image we will see if what appears before us is this enormous amount of overlaid information. Let us remain calm, because initially we will see practically nothing. The idea is that this information may be displayed or not at the viewer’s discretion. In other words, the image will remain clean and will only show a small “cr” logo when validation metadata exists, at which point we may choose whether or not to view the details of those certifications.

EXAMPLE OF BBC USAGE | YOUTUBE
EXAMPLE OF USING CONTENT CREDENTIALS | YOUTUBE

We insist that all this has a much broader scope than it may appear at first glance, since it would be very easy for any editor or broadcaster to create an overlay imitating this certification. That is why it is important that content merely indicates whether it contains this information, that it be the viewer who can activate or deactivate it at will to guarantee the credibility of the procedure, and that all the media involved, such as playback applications, manage it properly.

We are already in a sufficiently advanced phase of development and demonstration to have it accessible and to assess the change it may represent

Although technically the solutions are relatively complex — with the added challenge of achieving agreement among all companies across all sectors beyond their commercial programs — the final objective is that for the user it should be as simple as activating or deactivating subtitles in any player, whether online, local, OTT, etc.

And this requires multiple contributions at all levels. But it is not a utopia. In fact, at the last IBC (Amsterdam, September 2025) it could already be seen in action at different stands. It is true that there are currently very few resources capable of supporting this protocol and these licenses, but what seems truly important and significant to us is that the first ones already exist and they are not prototypes, but commercially available equipment and applications.

Regarding cameras, it seems that for now (verified as of September 2025) there is only one: the new Sony PXW-Z300, which already supports it and has been used in all demonstrations, both inside and outside the IBC exhibition. As for software, the latest version of Adobe Premiere Pro also supports it. Both were demonstrated and driven by the first tests carried out by the British broadcaster BBC.

It is public knowledge that many companies are doing what is necessary for their products and solutions to participate in this initiative. In fact, many are already part of the consortium, at different stages of progress.

We dare to predict that it will only be a matter of time before everyone — at least all those involved in news production — has products, services and solutions compatible with this standard.

However, it should also be noted that we are still at a fairly early stage. For example, at this moment the certificate is only embedded and managed in proxy files, but not yet in full-quality MXF files. The reason appears to be related to the need for more elaborate agreements among more participants in order to maintain all the compatibilities that the different formats currently have.

Going even deeper into the details, and following the sequence of a production, let us see what the camera does. First, we must install a certificate that will be linked to the author or owner. Because even at this stage there are different possible actors: a camera may contain the certificate of its owner if it belongs to a freelancer, or the certificate of a production company if it is equipment handled by several operators.

So, is the certificate stamped on all content? Absolutely not. And it is not simply a matter of activating or deactivating it. The camera must use its processing and AI functions to recognize “artificially flat” scenes — such as shooting from a screen — to prevent incorrect stamping and thus help avoid confusion between “the truth of the real world” and “artificial worlds.”

Although we do not have all the technical details of what the discrimination threshold would be, it seems that in such cases the certificate would not be stamped, or would be stamped with indications related to this particularity.

We dare to predict that it will only be a matter of time before everyone — at least those involved in news production — has products, services and solutions compatible with this standard

For the next step, we will need an editing system with its own certificate. As with the camera, in September there was only one software solution available: Premiere Pro. What will its role be? Presumably, to maintain validation in those shots where it should be maintained — for example, cropping from a longer shot — adding its own certification when recording the action performed. If we stop at this point to count the number of things that can be done in editing, we would never finish listing examples of “validable” and “non-validable” actions.

As an example, we leave just one question to stimulate debate: is color correction an alteration of reality?

The fact is that after the editing process we will have new content that will contain, at least, the certification of the camera “owner” and those of the different possible “responsible parties” for the edits, which will differ for each fragment (even at frame level) of the finished content. That content will be a compilation of images from multiple sources; not all of them necessarily come from a camera, nor are they necessarily “forgeries” when they do not.

And this is where the broadcaster comes into play, who, as planned, may also have its own certificate to stamp — added to all previous ones — with which it shows and records its actions, since depending on the type of content it will have made its own elaborations. Without losing sight of the fact that throughout the entire process we are not only talking about images, but also all the audio associated with them.

To complete the scenario, the broadcaster could display key parts of the veracity or recreated image information as an overlay in the broadcast. In the research and development section of the BBC website there is an interesting report containing a couple of sample videos. We strongly recommend watching the first one, just under three minutes long, paying special attention from minute 1:40 onward. You will also find the link at the end.

Despite the vast number of positive contributions that artificial intelligence brings, what is truly disturbing today is its impact on content credibility

With everything we have seen so far — which barely represents a brief outline of what lies behind the initials “C2PA” and “cr” — it is already possible to grasp the magnitude of the initiative they represent, the enormous scope of this project, and the impact it will have in the short term on the way we capture and disseminate credible content.

Because despite the vast number of positive contributions that artificial intelligence offers, what is truly disturbing today is its impact on content credibility. A very powerful tool can be used for multiple purposes. And not being able to distinguish which side of the delicate line separating information from disinformation — and consequently manipulation — we are on would leave us in a worrying situation of vulnerability if we did not have initiatives such as this one.

Finally, so as not to end on a tragic note, we leave you with a new question: will C2PA be the predecessor of some kind of “protocol droid” designed to interact with humans?

After this small joke, here are the links mentioned so that you can explore this initiative first-hand:

C2PA Consortium: https://c2pa.org

“Content credentials” watermark or “cr”: https://contentcredentials.org

BBC report mentioned: https://www.bbc.com/rd/articles/2025-09-newscontent-verificationcredentials-trust

Eamonn Curtin,

CCO of Gravity Media:

“We need to look beyond traditional markets because of new players entering the space”

We invited the company’s executive to Industry Voices to share key insights into the new strategic phase of Gravity Media, shaped by its merger with EMG

Technological progress has made it possible, on the one hand, to cover a broader range of events and, at the same time, to produce competitions that were traditionally delivered under conventional models using more efficient approaches. Gravity Media is fully aware of this shift, particularly following its merger with EMG.

While maintaining its strategic focus on major events, the company recognises that the pace of technological change also requires it to pay attention to competitions that, for budgetary reasons, demand alternative production models.

“You need to operate in both worlds. Major events will still require traditional infrastructure. But many clients are moving towards software-defined production. It’s about aligning with each client’s journey”, explains Eamonn Curtin, CCO of Gravity Media, whom we invited to the Industry Voices section to provide an overview of the phase the company is currently undergoing.

“We don’t want to be seen simply as a facilities company providing OB or flyaway services. We have a much broader range of capabilities and we’re also exploring markets we wouldn’t traditionally have entered before”, he adds, pointing, for example, to the creative agency space. The company is focusing much of its activity on this effort to engage with clients and convey a clear message about the true scope of its offering.

In this conversation, we review some of its most emblematic projects of 2025, both top-tier and mid-sized — and beyond sport — and we also ask for his perspective on the evolution of the market. Among the major changes is software-defined production, a transformation with which our readers will be very familiar, as it has been widely referenced by previous Industry Voices guests: “It changes the investment model. We were traditionally CapEx-driven — trucks, cameras, routers. Now there’s a shift toward licence-based, server-based models — more OpEx”.

The interview follows below:

How do you assess the recent evolution and the current position of Gravity Media?

Since the Gravity Media and EMG merger in January 2024, we have become a global powerhouse. We’ve strengthened our global position, not only through the merger itself but also through key partnerships.

We’ve been able to align and deliver a truly global service in a consistent way.

Regarding your merger with EMG, which is, of course, one of your main recent milestones, what does this integration represent for Gravity Media?

Strategically, it solidified our presence within an ever-changing market. It broadened our portfolio of services and strengthened what we look at as two verticals: media services and facilities, and production and content.

The merger enabled us to expand our production centre capabilities globally. We now have two in Australia, one in London, one in the Netherlands, one in Belgium and one in Italy. That supports the drive towards remote production and enables our clients to have greater flexibility based on the portfolio of services we can now deliver.

And is the integration process now fully completed, or not yet?

We’ve now moved to one global brand. We are known as Gravity Media. There is still work in progress in terms of rebranding some areas in certain business units. For example, most recently in Australia, we have started rebranding all the OB vehicles and fleet with the new Gravity Media logo. That process is being executed across the business, so very soon everything will simply be known and branded as Gravity Media.

“The main objective is to ensure that all our clients are fully aware of the complete suite of services we can deliver”

What would you say are your main objectives for the coming years?

The main objective is to ensure that all our clients are fully aware of the complete suite of services we can deliver. It’s important that the message is clear, because our offering is quite broad.

We’ve had to diversify, given how the market continues to evolve.

We need to look beyond traditional markets because of new players entering the space — streamers, digital platforms, federations seeking one-stop solutions. It’s about making sure the message is clear.

We don’t want to be seen simply as a facilities company providing OB or flyaway services. We have a much broader range of capabilities and we’re also exploring markets we wouldn’t traditionally have entered before.

And regarding your clients, what are their main needs? What are they looking for when they approach you — cost efficiency, flexibility…?

It’s all of the above. Every client is different, but we need to ensure both they and we operate sustainably. They still need to deliver content, sometimes in new or unique ways, and we see ourselves as a true partner — enabling them to choose the solutions that best suit their needs.

That can mean everything from remote production to AI, to full turnkey delivery — production, graphics, specialty cameras, RF, everything. We did that on some major projects last year, where we delivered the entire production chain, and it was very successful.

KINGS LEAGUE
“We need to look beyond traditional markets because of new players entering the space — streamers, digital platforms, federations seeking one-stop solutions”
Could you mention one of these projects that really reflects your current strategy or highlights your capabilities?

There are several examples. In 2025, we delivered the Women’s Euros. We supplied the production teams, graphics, OB trucks, RF, specialty cameras — everything — a true one-stop shop. That was very successful.

We did something similar in Australia for the Tour Down Under [the country’s premier cycling race]. Traditionally, many of our clients are production companies, but in some cases, we’ve delivered directly for sports federations internationally.

Following the merger, we’ve continued to deliver worldwide events. Last year we covered the World Athletics in Tokyo. We also opened an office in Riyadh and delivered the Six Kings Slam. That’s a good example of new streamers entering the live sports space.

But we’re not just about sport. We delivered the World Economic Forum in Davos, and what was unique there was that it was produced remotely from Vilvoorde in Belgium due to security restrictions.

WOMEN'S EUROS

All switching was done remotely. That approach is now being adopted more widely in similar events.

We also have Gravity House in London, where films such as ‘Hamnet’ were edited. And we are entering the creative agency space. Recently, we announced a strategic partnership with a creative company called Green Couch to develop light entertainment formats for broadcasters. So, it’s about stepping outside our traditional lanes and exploring additional growth areas.

Beyond top-tier events, do you see increasing demand for sports broadcasting in mid-sized leagues?

Absolutely. For example, we’ve delivered the UK Basketball League and several leagues across Italy, Belgium and the Netherlands. We’ve also worked with the Kings League. In Italy, we built the arena next to our Milan headquarters and deliver the full production. That’s a completely new format.

There’s growth in new leagues, in influencer-driven competitions, and in e-sports. And women’s sport has seen enormous growth. We deliver more Women’s Champions League football and recently covered the Women’s Rugby World Cup. We also produce the Women’s Super League remotely for a client whose coverage airs on the BBC.

“We delivered the World Economic Forum in Davos, and what was unique there was that it was produced remotely from Vilvoorde in Belgium due to security restrictions”

The volume is increasing, but the key is how you deliver it. These competitions don’t always have Tier 1 budgets. That’s where simplified production models come in — cloud, on-premise solutions like Simply Live or Live OS.

Gravity Media has traditionally been strong in flyaway and remote centres. For example, here at our Production Centre in White City, London, we produce ATP Tennis and Formula E remotely. Major courts are done traditionally on site, while others are done remotely. Lower-tier tournaments are handled via simplified solutions. It’s about offering the full suite of services.

I’d also like to get your perspective on the market. What do you see as the most critical challenges facing the industry?

The speed of technological change — particularly around software-defined production. It changes the investment model. We were traditionally CapEx-driven — trucks, cameras, routers. Now there’s a shift toward licence-based, server-based models — more OpEx.

You need to operate in both worlds. Major events will still require traditional infrastructure. But many clients are moving towards software-defined production. So, it’s about aligning with each client’s journey.

All clients are different. You need to provide choice — like in a supermarket.

Taking into account all the topics we have discussed, what would you say sets Gravity Media apart from other companies in the global market?

What truly sets us apart in the global market is our people. Across regions and markets, we are united under one brand identity with a shared commitment to excellence. Our teams are deeply customer-centric, consistently placing client needs at the heart of every decision we make.

We combine that focus with flexibility, adapting quickly to changing market dynamics and tailoring solutions to meet unique customer requirements. This agility, paired with our drive to deliver best-in-class services, ensures we not only meet expectations but exceed them.

It’s this powerful combination, our people, our unity, our customer-first mindset, and our commitment to excellence, that differentiates us on the global stage.

It is also about offering clients real choice, whether creatively or financially. We continue to invest in new technologies like 5G for RF cost efficiencies. For example, we conducted private 5G tests at Roland Garros and we’re exploring 5G solutions for smaller cycling events that can’t afford traditional aerial RF setups.

“We are entering the creative agency space. It’s about stepping outside our traditional lanes and exploring additional growth areas”

We also have our own in-house RF development team. We use proprietary tools like Live Tools, used on the Tour de France, and we’re integrating 5G into that roadmap. It’s about diversification and flexibility, while maintaining strength in media services and facilities, and continuing to serve major global clients.

UK BASKETBALL LEAGUE
SIX KINGS SLAM

Audiovisual production on wheels:

OB units broaden their scope

Production mobile units are increasingly present in new production environments beyond traditional sporting events: from television series to corporate presentations or concerts. This article offers a look at their historical evolution and analyzes the technical parameters that determine their quality, reliability, and operational performance

Audiovisual production has greatly benefited from the possibility of carrying out multi-camera productions outside fixed facilities such as TV studios. In English, this is known as OB (Outside Broadcasting). It is not only about taking technical equipment outside these facilities — that is, simply transporting the equipment in a truck (as if it were a move) — but about being able to rely on an effective, convenient, professional, and high-quality solution for both production and transmission.

In many cases, the sets of a TV series have been built in industrial warehouses where the audiovisual infrastructure had to be resolved by resorting to a mobile unit

These circumstances have driven the industry to search for and find content, wherever it may be, in order to fill more hours of programming with audiovisual material across the different windows present in today’s market (TV, social media, live streams…). In English-speaking environments, the term EFP (Electronic Field Production) is used generically to designate any type of production carried out outside the studio or stage, which includes the work performed with mobile units.

Many of our readers will surely associate mobile units with sporting events (national and international, such as a Football World Cup or the Olympic Games), but they are increasingly present in other production environments: TV series, galas and tributes, commercial, institutional and social presentations, the field of music and live concerts, and a wide range of event types. This is because technology and manufacturers have made the mobile unit offering more cost-efficient, adapting both to different types of content and to the variety of budget levels.

In recent years, we have witnessed a progressive increase in the variety of situations in which the presence of a mobile unit has been required and, consequently, in the number of companies specifically dedicated to providing this service.

Television fiction production, for example, has not always been located in suitable facilities.

In many cases, the sets of a series have been built in industrial warehouses where the audiovisual infrastructure had to be resolved by resorting to a mobile unit.

The success of the expanded use of mobile units lies in offering a solution built around a production mindset, with specific production resources that allow the capture of a “live” output (whether a true live broadcast — transmission — a false live or a delayed recording) while working with different video sources (multi-camera) at the location where the content takes place.

Luis Sanz, a Telecommunications Engineer with more than 50 years of experience in the design, planning, management and operation of Spain’s TVE production centers and a regular contributor to TM BROADCAST, describes the origins of mobile units as follows:

“The first television mobile unit of which there is record was developed in Germany in the early 1930s and was used for the first time at the inauguration of the Berlin television center of the Reichs Rundfunk.”

In an article published in TM BROADCAST, the same author continues:

“In England, the first unit was built on two Regal trucks and with a Marconi’s Wireless Telegraph transmission system, which was used for the first time at the coronation ceremony of George VI on May 12, 1937. In both systems, as in others of the time, the recording of transmitted images was carried out on film, since videotape recorders did not exist. It was from 1950 onwards that mobile units equipped with a videotape recorder for recording were developed.”

In Spain, in 1977, the first color mobile unit was inaugurated in Barcelona for the public broadcaster RTVE, equipped with four cameras and a 24-channel audio console.

All of us who work in the audiovisual sector have, at some point, been asked to “cover an event” — a theatre play, a friend’s concert, a daughter’s Sunday match, a wedding or a promotional event — and have thought about setting up three cameras, with live sound, with the possibility of including captions and graphics, and an endless number of technical resources in favor of the content. After renting the equipment separately (paying attention to the smallest detail), loading it into a van, assembling and dismantling everything, and encountering the difficulties of something failing or not working, we have always reached the same conclusion: everything would be much easier if this equipment were perfectly installed and ready to be used.

The first television mobile unit of which there is record was developed in Germany in the early 1930s and was used for the first time at the inauguration of the Berlin television center of the Reichs Rundfunk

That, ultimately, is a Mobile Unit (MU): audiovisual equipment integrated into a transport vehicle prepared to provide an audiovisual solution when covering an event or content that takes place outside the facilities of a television broadcaster, using multi-camera production techniques. There are mobile units specialized in production (PMU)/contribution, and others dedicated to transmission/link (TMU) or communication.

In England, the first unit was built on two Regal trucks and with a Marconi’s Wireless Telegraph transmission system, which was used for the first time at the coronation ceremony of George VI on May 12, 1937

In this article we focus on production mobile units (PMU). Depending on the country of origin, they receive different names: mobile production control room (PCR); production truck; scanner (a BBC term); mobile unit (MU); remote truck; live truck; OB van / OB truck; or live eye.

All these denominations share one main characteristic: mobility. This is obvious, since having a dedicated transport vehicle allows access to a multitude of places and locations. Their flexibility is also noteworthy: the ability to adapt to different working situations, types of content, audiovisual market regulations and/or client requirements. We are referring to the decision-making process carried out from the outset, which encompasses the construction and design of the mobile unit together with its technical configuration and equipment. This even allows us to request a “made-to-measure” mobile unit (customized solutions), enabling decisions ranging from the type, model and size of the vehicle to technical details, access points, finishes and the technology to be used.

The mobile unit market already offers a catalog of finished mobile unit types in order to facilitate selection (purchase/rental), adapting to different budgets and to more standardized technical and creative needs when covering an event.

Finally, quality control and reliability are two differentiating elements compared to any other solution considered for multi-camera production in outdoor environments. From a production standpoint, a mobile unit reaches broadcast category status by complying with professional audiovisual sector standards (SMPTE 292M and 259M) and fully meeting occupational risk prevention requirements that may arise in this working environment (access, thermal and acoustic insulation, electrical protection, load and overweight control, driving, etc.).

Both PMUs and transmission mobile units (TMUs) evolve at the pace of the technological transformation experienced by the television and live events sector itself. In terms of image, we therefore find workflows based on SD (increasingly less common), FHD, UHD, 4K and even 8K, naturally adapted to HDR content production. As for sound, there are stereo solutions, Dolby encoding, 5.1 audio and more. Mobile units offer both cable-based connectivity (BNC-SDI, IP, fiber optic) and fully wireless solutions.
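To put the jump between these formats into numbers, a rough calculation of uncompressed 4:2:2 payload rates shows why FHD still fits on a 3G-SDI link while UHD pushes units toward 12G-SDI or IP. The figures are indicative only: the sketch ignores blanking, audio and ancillary data.

```python
def video_data_rate_gbps(width, height, fps, bit_depth, samples_per_pixel=2):
    # 4:2:2 sampling carries two samples per pixel (luma plus alternating chroma);
    # active-picture payload only, blanking and audio are ignored.
    return width * height * fps * bit_depth * samples_per_pixel / 1e9

fhd = video_data_rate_gbps(1920, 1080, 50, 10)   # ~2.07 Gbps, fits a 3G-SDI link
uhd = video_data_rate_gbps(3840, 2160, 50, 10)   # ~8.29 Gbps, needs 12G-SDI or ST 2110
```

Quadrupling the pixel count quadruples the payload, which is exactly why the cabling choice (12G-SDI versus an IP fabric) is one of the first design decisions for a new unit.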

Since each PMU is a unique, adapted and integrative solution combining different audiovisual technical systems and equipment, this article recommends becoming generally familiar with some of the spaces and equipment to be taken into account:

› Physical dimensions of the vehicle body (length, height and width excluding the driver’s cab). This is known as the chassis or body. There are major vehicle manufacturers such as Iveco, Mercedes, MAN, Volvo, Scania or Renault in the case of large chassis.

Dimensional data are important to understand the space occupied in common situations such as parking or passing through different areas (streets, squares, roads, paths, bridges…).

And, in relation to the interior, they are relevant for knowing the capacity for technical/artistic personnel and achieving an adapted and comfortable working environment.

These dimensions, specifically the width, may vary if the mobile unit features a retractable side pod or expandable area (single, double or triple). It is also important to know whether the roof of the body is accessible for possible use — that is, whether it can support additional weight.

› Work areas or spaces: each designed with technical ceiling/floor systems and panels for the proper connection and distribution of cabling. The most notable areas include: production control (the largest space), audio control, camera control, VTR and EVS area (usually located in the expandable section of the mobile units), production room, technical control area, mini TV studio + chroma (also placed in the retractable side section of the body), among others.

› Storage compartments: located either at the rear or perimetrically within the vehicle body, allowing storage of cable reels, complementary material and camera accessories.

› Power/consumption: complete electrical installation, including an uninterruptible power supply (UPS), main service and backup systems — for example: 380/400 V + N + T with UPS, dual input, connection type CEE (Cetac) 125 A, plus a three-phase 9 kW generator.
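As an illustration of what such an inlet can deliver, the standard three-phase power formula applies. The 0.9 power factor below is our own assumption for the sketch, not a figure taken from any particular unit.

```python
import math

def three_phase_power_kw(v_line_line, current_a, power_factor=0.9):
    # P = sqrt(3) * V_LL * I * PF, converted to kilowatts
    return math.sqrt(3) * v_line_line * current_a * power_factor / 1000

# Maximum draw through a 125 A CEE three-phase inlet at 400 V
mains_capacity = three_phase_power_kw(400, 125)  # ~77.9 kW
```

Seen this way, the 9 kW generator is clearly a backup for essential loads, not a substitute for the mains feed.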

Any production mobile unit that aims to be competitive in the audiovisual sector must offer Broadcast / Events equipment across the different work areas it may include:

› Camera equipment: this refers to the number of EFP, compact, POV, robotic and/or PTZ cameras; different lenses; camera cabling runs (triax, IP and/or fiber optic); tripods or pedestals…

› Production control equipment: video mixer/switcher, number of VTRs/decks, number of EVS servers, character generator/graphics system, monitor wall…

› Audio equipment: audio console, microphones (handheld, wireless, headset, lavalier), monitors, returns, encoders-decoders, various players, telephone hybrid, communications system 4H link 6 lines, ISDN codec…

› Quality control equipment: video matrix, monitoring, CCU, patch panel, MFO, vectorscope, signal converters…

› Communication equipment: transmitters/ receivers and intercom matrix, D-Link network, wireless router…


› TV studio + Chroma equipment: chroma background, LED lighting, PTZ cameras…

It is very important to know the type and model of vehicle (brand, engine, speed, consumption…) and the reliability of the hydraulic stabilization and leveling system, the independent air-conditioning system for both environment and equipment, custom technical furniture and all other elements necessary for the proper operation of the mobile unit, enabling comfortable facilities for technical personnel in accordance with the occupational risk prevention and health plan.

In this regard, it is mandatory that every mobile unit has CE/IEC certifications for its different components, UPS with bypass and surge protection, effective grounding and current detectors/circuit breakers; fire prevention measures such as smoke and temperature detectors together with appropriate extinguishers for electrical/energy risks; emergency signage and exits in accordance with the evacuation plan; clearly marked circulation areas; proper ergonomic standards for technical personnel; cable-free floor areas and controlled access signage restricted to authorized technical staff only.

No less important are working conditions in terms of lighting (uniform, glare-free light, intensity control and production/live status indication – tally); the type of furniture and finishing of the spaces (ergonomic chairs, anti-vibration floors, adjustable desks and acoustic panels on walls and ceilings, soft and warm colors); and environmental conditions in terms of climate/temperature and humidity control, which require professional ventilation solutions (20–24 °C, 40–60% RH).

In Spain, the first color mobile unit was inaugurated in 1977 in Barcelona for the public broadcaster RTVE, equipped with four cameras and a 24-channel audio console.

With regard to the standards and regulations that must be applied in the design and equipment of a mobile unit, these are framed within the regulations established by organizations such as SMPTE, EBU, ITU and AES, ensuring interoperability, operational stability, electrical safety and signal quality.

› Video area: SMPTE ST 292 (HD-SDI); SMPTE ST 424 (3G-SDI); SMPTE ST 2082 (12G-SDI – UHD); SMPTE ST 2110 (IP-based production); BT.709 (HD); BT.2020 (UHD); BT.2100 (HDR – HLG / PQ); and EBU Tech 3320.

› Audio area: AES3 (digital audio); AES67 (audio over IP); ITU-R BS.1770 (loudness measurement); EBU R128 (loudness normalization); ITU-R BS.2051 (immersive audio).

› Quality control and synchronization: SMPTE ST 2059; PTP (IEEE 1588) for IP systems; Black Burst / Tri-Level Sync for SDI systems; EBU Tech 3276.
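The idea behind ST 2059 genlock over PTP can be sketched in a few lines: every video frame is counted from a common epoch, so any device that knows the PTP time can derive where the next frame boundary falls. This is a simplified model that ignores the TAI/UTC offset and the standard's full alignment rules.

```python
from fractions import Fraction

def frame_phase(ptp_seconds, rate=Fraction(30000, 1001)):
    # In the ST 2059 model, frame n's alignment point falls at t = n / rate
    # measured from the epoch. Return the current frame index and how far
    # into that frame the given time lies (0 means exactly on a boundary).
    total = Fraction(ptp_seconds) * rate
    n = total.numerator // total.denominator
    return n, total - n

n, phase = frame_phase(1001)  # 1001 s x 30000/1001 = frame 30000, phase 0
```

Using exact fractions rather than floats mirrors how broadcast rates like 29.97 fps are defined, avoiding rounding drift over long time spans.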

We can include in this text an updated classification of the types of mobile units available in the audiovisual market. Each company that provides mobile unit services usually assigns a number to each unit it operates: MU1, MU2, MU3… and sometimes adds a figure indicating the number of cameras incorporated into the unit, for example MU2-10.

The history of mobile units has traditionally used the following classification:

› Type A: Mobile units that feature a retractable side pod or expandable area (single, double or triple) added to the dimensions of the vehicle body.

› Type B: Mobile units whose working space consists solely of what is offered by the size of the vehicle body itself.

› Type C: Mobile units that can be joined in parallel, forming a set of two twin units, generating large internal working spaces and greater technical equipment potential. For this reason, they are the largest.

They have also been referred to as large, medium and lightweight (PEL) mobile units, and even compact units, integral units and large trailers. Other non-Spanish companies use the designation Basic, Bronze, Silver, Gold and Platinum mobile units.


In 2018, I had the opportunity to study the national and international mobile unit market offering, which allowed me to develop a more universal typology, independent of the manufacturer or service provider:

Production mobile units (PMU) are often accompanied by communication/transmission mobile units (TMU), where modulation, antenna and amplification equipment for the corresponding connections/repeaters is installed (microwave – DENG –; satellite – SNG/DSNG –, using two devices to uplink the signal to the satellite: Flyaway and DSNG; LiveU systems; or RF systems). There are even combined production + transmission units available on the market, providing greater immediacy for content to reach our homes live.

A production mobile unit (and communication unit) allows coverage of highly diverse events such as Champions League football matches, national political party conventions, music or film award ceremonies, first division league matches, bullfights, various festivals, TV series, etc. An ideal solution for the new challenges of remote production (REMI) within the audiovisual and live broadcast environment.

This article is yet another confirmation of the major technological changes within the audiovisual and live sector in recent years. A technology that has modified equipment and facilities, responsibilities and professional profiles within human teams, and production routines.

The global OB vehicle market size was valued at approximately USD 2.1–2.3 billion in 2024 and is expected to grow at a compound annual growth rate (CAGR) estimated between 4.5% and 6.5%, reaching values between USD 2.4 and 3.5 billion by 2033–2035.
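Such projections follow the ordinary compound-growth formula. As a quick sanity check with mid-range figures (the inputs below are our own choice for the sketch, not values published by the reports):

```python
def compound_growth(value_bn, cagr, years):
    # Future value = present value * (1 + r) ** n
    return value_bn * (1 + cagr) ** years

# Mid-range figures: USD 2.2 bn in 2024 growing at 5.5 % until 2033
estimate = compound_growth(2.2, 0.055, 9)  # ~3.56 bn
```

Nine years at a mid-range rate lands near the top of the quoted span, which is consistent with the reports' 2033–2035 horizon.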

(Source: “Global Outside Broadcast Trucks Market Report” and “Outside Broadcast Vehicle Market Research Report”).

In short, we can affirm that audiovisual production in the broadcast and events environment is on wheels — specifically thanks to production mobile units (PMU). 

Blackmagic's Collaborative Environment: Much more than a cloud with DaVinci Resolve

Versatile enough to adapt to a multitude of environments and situations. Let's see what functionalities are offered and how to maximize the benefits of this ecosystem.

This lab test analyzes the benefits and possibilities for collaborative work offered by the Blackmagic Cloud system and its integration with DaVinci Resolve, considering different types of scenarios and needs, based on data for the European market as of January 2026. Given the international circulation of our magazine, readers and followers from other regions should compare this with the situation and conditions in their own geographical areas.

We understand that most of our readers are well aware of collaborative work; in fact, we are convinced it has been part of their day-to-day for many years. However, we also take into account that these collaborative environments are likely to be quite large, built on very specific tools and highly specialized professional profiles.

This scenario will be familiar to many broadcasters, whose tools serve similar purposes, each with its own peculiarities even though they may differ from one another. It is also true that many of these tools are so specific and designed for such large production environments that their scope is somewhat limited to larger organizations and higher budgets.

On the other hand, there is no doubt that, since the widespread deployment of cloud-based services, many of these tools have expanded their versatility of use to a greater or lesser extent, and over time new ones have appeared. From simply uploading files in real time to a production based entirely on cloud services, these are now affordable possibilities for small organizations and even the tightest of budgets.

In fact, 'Cloud' is already a familiar term for a large part of the population, and its functionalities are integrated into our work and leisure in many more ways than might seem at first glance.

But (there is always a but...) as professional use options and developers expand, certain compatibilities can be compromised without these developers having enough scope to cover all needs with new products. Especially in areas as demanding as audiovisual production.

This scenario leads us to think of the solutions offered by Blackmagic, one of the names that stands out for providing a range of hardware and software products that is wide enough to successfully tackle a large part of the needs of current productions.

Whether for large-scale productions, broadcasters of all sizes, or even individual freelancers. Not only does Blackmagic have adequate and accessible tools in very different areas, but some of them have become consolidated references in the sector at the highest professional level. Sure enough, we're referring to DaVinci Resolve.

What may not be so well known is Blackmagic's own Cloud service and its capabilities to share content and collaborate in an integrated way with Resolve. And no, we're not just talking about sharing files or projects. We are talking about working in real time on the same project, on the same content, in the same timeline, and doing so with professional specialists scattered anywhere around the globe.

Yes, it is true that there are other tools that get the job done too. But, even if it is on a small scale, how many of them allow it to be done at a very low cost, and even for free in certain circumstances? That said, let's curb our enthusiasm: the tools for our next production, the one with which we aim to be number 1 in cinema or series, will not be free. The service is free for small volumes of data, however, and even if that is limiting for most productions, it is enough to test the capacity and adaptability of the system to our actual needs.


And here we must make a first distinction, to specify what kind of Cloud we are dealing with, although many of you will have already guessed. Blackmagic's catalog of products and services has two distinct ranges: on the one hand, the "Cloud Store" and similar items, specific hardware equipment that we could liken to a high-performance local NAS; and on the other, the Blackmagic Cloud service, purely a software functionality.

Although both ranges are designed to provide solutions to the specific needs of audiovisual productions, our laboratory will focus mainly on the software service option – hence our earlier note about the different possibilities that may exist in different geographical areas.

We will begin this lab test by listing the available functionalities, and then explain what benefits each of them brings in different scenarios, from the simplest to the most demanding ones.

The Cloud service can be used for free by just registering and logging in via web access. Within this web access is where we will always find all the possibilities available depending on our subscription, whether free or paid.

Although, of course, the free option has certain limitations, the entire system is designed with the actual needs of the industry in mind. Thus, it is very easy to extend the limited functionalities by paying for the subscription of our choice. And what seems even more important to us: to switch between them as our needs change, and even to return to the completely free modality whenever we do not need the highest capacities.


The free modality does not include management of project libraries, the number of "Presentations" is limited and storage space is 2 GB. (We comment below on the meaning and scope of "Presentations" in this environment.) The rest of the functionalities are fully operational.

As for the subscription model, there are different combinations that provide enough flexibility when configuring the services we may need for our different productions.

Starting with its simplest use, we value the fact that this environment can be used autonomously and independently: even outside of Resolve it behaves as a file repository with a folder structure, accessible through a username and password, as is the case with other cloud services.

This possibility would already allow us to work on our project from different computers or different locations. The big difference from other generic cloud storage services lies in the ability to seamlessly move large volumes of data and large files. Speed is only limited by that of our own access to the network.

The basic modality already allows us to link our Blackmagic cameras to our Cloud and choose, through configuration, whether we want to upload our recordings to the cloud. And when we talk about Blackmagic cameras, we are also including the apps we already use (also for free) on iOS or Android mobile devices.

Depending on the cameras, settings can be tweaked for uploading content: only proxies, or both proxies and original files. Cameras also provide preloaded metadata, which makes cataloguing and managing content from different cameras an easier task.

Naturally, depending on the recording formats and the network connection, uploading these files will take different amounts of time. Uploading only proxies over a good connection, the files will be available in just a few seconds.
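A back-of-the-envelope estimate makes the proxy-first approach obvious. The 10 % protocol overhead below is an assumption for illustration, as are the file sizes.

```python
def upload_time_s(file_mb, uplink_mbps, overhead=1.1):
    # megabytes -> megabits, with ~10 % protocol overhead (our assumption)
    return file_mb * 8 * overhead / uplink_mbps

# A 60 MB proxy clip over a 100 Mbit/s uplink
proxy = upload_time_s(60, 100)         # ~5.3 s
# A 12 GB camera original over the same link
original = upload_time_s(12_000, 100)  # ~17.6 min (1056 s)
```

Two orders of magnitude separate the two cases, which is why uploading proxies first and recovering originals only when needed is the sensible default.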

Once we have our content uploaded to the cloud, we can rely on a feature that we find especially attractive: the possibility of previewing the uploaded content without downloading it, by simply moving the mouse pointer over each clip's icon.


As for "Presentations", this is the name given to the tool for creating viewing links to share content with external users – directors, agencies or clients, for example. Not only does it allow viewing; it also supports notes and comments linked to the video images, which automatically become bookmarks with annotations in the timeline once said content is loaded into Resolve. The free version allows the creation of up to 20 Presentations and up to 30 collaborators.

As for these invitations, it is worth noting that no files of any kind are sent; the content is always displayed online. In this way, risks concerning the security of our content are minimized, since there is no distribution of files of any kind, nor do they leave our environment.

Even if we do not have our own project libraries, it will be possible to access other users' libraries through our account if they invite us to participate. In this way, it is easy to create multidisciplinary work teams, with different people dedicated to different tasks, without increasing costs.

Not having invested a single penny yet, we already have in our hands tools that not many years ago were only accessible to large productions with the largest budgets.


If we additionally have any of the Blackmagic storage hardware devices (Cloud Store type), this hardware will synchronize the files with Blackmagic Cloud (and this can also be done with other storage services such as Dropbox and Google Drive).

Once we have verified that the platform meets our needs, it is time to make the most of all its possibilities and benefit from all the advantages of full integration with Resolve.

Simply by adding the option of project libraries to our subscription, not only does the project as such (the .drp file) reside in the cloud, but we can also configure shared access to certain projects so that different users can work on them simultaneously. A library is an environment for hosting projects, capable of being accessed by 30 users simultaneously, while the number of simultaneous users in the same project is limited to 10.

In fact, in the Resolve project manager where we usually create our libraries, we always have shortcuts for the three hosting possibilities: local, network and cloud. Considering that a single library is capable of hosting up to 1,000 Resolve projects, we may not need many more, unless we want to compartmentalize content for whatever reason.

And in terms of space, above the free 2 GB, the next option already offers us up to 500 GB of storage. We deliberately omit any reference to prices, which we invite you to check for your own geographical location, as they may differ. It is worth checking this through the "update plan" option in your profile, and comparing it with other commercial shared-storage options.

This storage can be adjusted month to month, increasing or decreasing it according to our needs. Of course, to reduce it, the requirement is that the space to be cancelled from our plan remains free. That is, if we want to go down from a 2 TB plan to a 500 GB plan, we cannot have more than those 500 GB in use when the change takes place.
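The downgrade rule described above amounts to a simple check, sketched here with hypothetical usage figures (the function name and numbers are ours, for illustration only).

```python
def can_downgrade(used_gb, target_plan_gb):
    # A plan reduction only succeeds if current usage already fits the smaller tier
    return used_gb <= target_plan_gb

ok = can_downgrade(420, 500)       # True: 2 TB -> 500 GB allowed with 420 GB in use
blocked = can_downgrade(800, 500)  # False: space must be freed below 500 GB first
```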

Any modality other than the free one already allows for at least one cloud library. Once in place, when creating our project a dialog window will offer us different alternatives, such as the location of media, whether we would like to configure it as shared, media synchronization, and allowing remote camera access.

With our project already created and configured as a shared project, we can invite other users to participate. Access is then managed through Resolve itself, granting editing access to the elements being updated only to their user, while maintaining read access for the other elements and users in the same project.

If we have storage space in the cloud and have selected to store the media there, the instant we bring content into Resolve through the Media Pool – in the same way we always do – we will see how it begins to upload to the cloud. Small icons on the clips will let us know the status of each individual file.

By then, and as a simpler option, we will already have the possibility to collaborate with ourselves. This means that it is possible to access the same project with the same contents from any computer that has Resolve installed by accessing with our cloud credentials.

In a slightly more elaborate scenario, we could invite specialists to work on our own project without sending files or projects, always keeping the content updated for all participants, and with all the collaboration options that we will discuss below.

Going up to the most sophisticated scenario, we could also have an operator uploading media files, several editors working on different timelines, colorists making the necessary adjustments to the same clips that are being edited, audio and sound editors in their field, visual effects technicians on the other side, and a supervisor making annotations and comments and coordinating the entire project.

Whether it is an operator uploading files, or if it is the cameras themselves that are synchronized, the contents will be automatically available in Resolve as soon as the files have finished uploading. That is, we could start editing a news story while the field team is still generating content. It is possible to preview the portion of the contents already uploaded, even while they are uploading.

Regarding management, Resolve provides a system of automatic locks so that, when an operator is working on a part of the contents – for example in a folder or a timeline – modifications to that part are restricted to the relevant user, although the others can continue to view the contents, with access in read-only mode. This situation is easy to identify, as others will see a lock icon indicating that the relevant folder or timeline is under the control of another user. When this user finishes and leaves that folder or timeline, the lock disappears and an icon to refresh the content is displayed so they can have it instantly updated on their computers.

This ensures that there are no conflicts and no loss in flexibility or efficiency. Once the task is completed and the content has been released, any other participant assigned to the project has immediate, free access to continue making changes to the same content already updated.

In this regard, in most scenarios it will be advisable to work with proxies, as they download quickly to the local device and we will enjoy much smoother editing. To this end, there is an automatic proxy creation tool that facilitates the work and makes content management much more agile. Only when doing the final render will it be convenient for us to work with the original files, to preserve the highest quality in our production.

And how do we get these originals? Either because they have also been uploaded to the cloud, or because they have been provided to us in some other way. But in those cases – perhaps frequent in your environment – in which the originals only exist in the cloud and are extremely bulky, how do we generate that final render with good quality?

It's simple: once we have a timeline assembled with proxies, Resolve has the ability to recover only the necessary fragments, extracting from the recorded files the subclips required for the final conform. This is the partial media recovery option, and the action is performed automatically when selected. This way we have the necessary content without moving more data than necessary.
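Conceptually, partial media recovery boils down to fetching only the source ranges a timeline actually uses. This sketch – with invented clip data and a hypothetical handle length – illustrates the idea, not Resolve's internal implementation.

```python
def subclip_ranges(timeline_clips, handles=12):
    # For each clip on the timeline, fetch only the used frame range of its
    # source file, padded with a few handle frames for trimming headroom.
    ranges = []
    for clip in timeline_clips:
        start = max(0, clip["src_in"] - handles)
        end = clip["src_out"] + handles
        ranges.append((clip["file"], start, end))
    return ranges

# Two clips cut from long recordings: only ~350 frames travel, not whole files
demand = subclip_ranges([
    {"file": "A001.mov", "src_in": 1200, "src_out": 1450},
    {"file": "A002.mov", "src_in": 300, "src_out": 380},
])
```

However long the original recordings are, the transfer cost now scales with the edit, not with the rushes.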

At this point, within Presentations we have several tools that facilitate collaboration: the participant chat with group video call enables immediate internal communication between all team members. This is coupled with the aforementioned invitation of other external users, such as producers or clients, to provide their comments.

These comments from external guests are collected as bookmarks with notes, associated with the media's frame or with the precise instant on the timeline that the guest has determined, making the review and approval process as agile and efficient as the indications given by our producers and clients. Since we have an index of these markers, there is no need to search for them throughout the project.

Since each bookmark also indicates its author, it is easy to manage and prioritize them with the appropriate criteria in each case.

The fact that these comments are not simple text but are linked to bookmarks has multiple advantages. The main one is that, being associated with an instant of time on the clip or on the timeline, they stay in the right place even after all the modifications that the production may undergo.


It is also important that we do not have to dive into the text of emails or messages and then relate them to the place they refer to. We always have a list of new bookmarks and their authors, which makes it easier to find the changes to be made and the spots they relate to, while preventing duplicate comments.

As for bookmarks, project users can create shared bookmarks, accessible to the whole team to facilitate coordination and input. And also private bookmarks -those which only they can see- in order to facilitate work without overwhelming the rest of the team with information probably irrelevant to their activity.

It is also important to note that only the owner of the library being shared in the cloud needs to subscribe to libraries in their profile. The rest of the work team members require a cloud account, although it can be a free one.

As work progresses – and with work teams of a certain size in mind, with professionals specialized in the different functions involved and a coordination/supervision team (or person) – there are interesting tools available, such as timeline comparison.

This allows viewing the two timelines – before and after the changes – in order to facilitate the review of all the modifications made. In addition, modifications may be subject to approval prior to final validation, ensuring that progress is always made on approved changes before moving forward.

Another advantage that facilitates shared management is that certain project configurations are managed at the local system level. Thus, each person can configure their cache or monitoring parameters to optimize performance on their different devices.

As for work in the color module, the locks we described earlier for folders and timelines occur at the individual clip level, so that even while the editor continues to work on the sequence from the Cut or Edit pages, one or more colorists can collaborate simultaneously on the same assembly, without interfering with or overwriting the work done by the rest of the team.

Finally, Resolve's real-time save option is also available on these projects, thus ensuring that work already completed is never lost.

Access to all the possibilities offered by the first subscription option, with a minimum of one library and 500 Gb of storage, is something to keep in mind whenever our production may require a team of more than one person for everything. In other words, most of the time.

And for much larger organizations, there is also the 'Organizations' option within the cloud, which allows maintaining centralized control over different work groups collaborating on projects that you may want to keep completely independent of others.

In that sense, and before finishing, we should add that this cloud access provides some other interesting options, designed for large organizations, such as the possibility of renting Resolve licenses.

Yes, renting: that is, hiring licenses for temporary use, thus facilitating the scaling of work teams and resources up or down depending on the variation in the number and volume of projects. In this way it is possible to have a huge license pool at specific times without requiring an investment that would remain unused during the times when it is not necessary.

As we have seen, the integration, flexibility and customization options make the cloud platform a more than interesting choice for practically all production levels. From independent freelancers to large corporations, all can benefit from its possibilities. Especially interesting and attractive is the ease of expanding or reducing the options in the subscription in a way as responsive as our changing needs.
