
Several times during the men’s final of the Madrid Open tennis tournament between Casper Ruud and Jack Draper last spring, TV viewers were treated to a remarkable camera perspective. They watched the match from just behind the baseline, effortlessly following the player’s movement step for step and glimpsing his perfect angle on the ball with every shot. 

With no discernible blur or delays, the smoothly flowing live footage had the hyper-real feel of a video game. 


“I love the footwork by the cameraman,” wrote one YouTube commenter. 

The company now uses the comment in its investor pitch deck. 

In reality, these uncanny tracking shots didn’t involve any human camera operators at all. No robotic cameras or drones, either. Instead they were generated, in real time, with a software-based camera system developed by startup Muybridge, based in Oslo.

Founded by Håkon Espeland and Anders Tomren in 2020, Muybridge has spent nearly five years developing real-time computer vision technology that uses software to create a “weightless” camera, with no moving parts, that captures the speed and motion of live sports in a way that our eyes aren’t accustomed to. In the coming year, viewers of televised sports will get to see many more of these revelatory perspectives—both in tennis and beyond.


Muybridge has shifted the paradigm—twice

“Four hundred years of camera history is ending here,” explains Espeland, standing beside a framed black-and-white portrait of motion-picture pioneer Eadweard Muybridge, the company’s namesake, at the company’s headquarters in Oslo’s hip Grünerløkka neighborhood during Oslo Innovation Week last fall.

“I see a lot of resemblance [in what we’re doing] to what he did with sequenced triggers to actually create motion,” says Espeland. To create his groundbreaking images of a galloping white horse in the 1870s, the English-American photographer set up a line of cameras that were triggered by a trip wire as the horse ran past them, creating multiple images that each captured a different phase of the horse’s stride; by overlapping the images, he made a picture that appeared to move. “It’s a similar way of thinking,” says Espeland. “How can you distribute sensors and use that data in a smart way?”

Espeland had a long history with automated systems; he started working on them as a 16-year-old apprentice on oil and gas rigs in the North Sea. After getting a master’s degree in cybernetics and robotics, he joined a Norwegian company building robotic camera systems for live TV production. While there, he had an epiphany. “With computational photography, we could get rid of 300 kilos of metal and robots,” he says. “It was like removing gravity. We’re not covered by any physical limitation.”

Instead of using big, expensive cameras that you move to “chase” whatever’s happening on the court or sports field, Muybridge puts hundreds of small, inexpensive video sensors all over the place—and uses software to create smooth tracking shots and conjure any angle on demand. 
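The core idea of trading one moving camera for many fixed ones can be sketched in a few lines. This is a hypothetical illustration, not Muybridge's actual pipeline: with sensors spaced at known positions along an ad board, a smooth "tracking shot" amounts to sliding a virtual camera position continuously between them and synthesizing each frame from the nearest real views.

```python
# Hypothetical sketch: a virtual camera gliding along a rail of fixed sensors.
# In a real system each virtual position would drive view synthesis from the
# overlapping sensor images; here we only compute the smooth path itself.

def virtual_camera_positions(sensor_xs, steps_between):
    """Interpolate a smooth path (in meters) through fixed sensor positions."""
    path = []
    for a, b in zip(sensor_xs, sensor_xs[1:]):
        for i in range(steps_between):
            t = i / steps_between
            path.append(a + t * (b - a))
    path.append(sensor_xs[-1])
    return path

# Sensors every 2 m along an ad board; 4 virtual frames between neighbors.
sensors = [0.0, 2.0, 4.0, 6.0]
path = virtual_camera_positions(sensors, 4)
print(path[:6])  # -> [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]
```

Because the "camera" is only a coordinate, it can accelerate, reverse, or jump instantly, which is what gives the footage its weightless, video game feel.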


In practice, this looks like extra-long speaker bars packed with a row of oversize smartphone camera lenses. These arrays come in two-meter lengths that can be connected to form what amounts to a single continuous camera of virtually any length. “We’re going to build future digital stadiums full-360,” says Espeland. 

And unlike traditional cameras, which can obstruct spectators’ views at live events, Muybridge’s sensor bars clamp unobtrusively to any wall or structure, capturing the action on the court, field, or rink unnoticed. “Our biggest issue at the U.S. Open was that the coaches of the athletes sat on it,” Espeland says. “They didn’t realize it was a camera along the ad boards.”

Made from commodity electronics components, the sensors themselves are relatively inexpensive. “We are lucky that the consumer [electronics] and mobile industry consume so [many] cameras,” says Espeland. “They’ve taken the costs down. There’s a reason why there are three cameras on an iPhone now.” Mobile phone makers have also advanced the capacity of computational photography, keeping the sensors largely unchanged while improving algorithms to create better pictures. “We’re piggybacking on that.”

To meet the demands of live broadcast, Muybridge brings an updated approach to the reconstruction of 3D images. “The rest of the world has been throwing more and more compute at the problem, running math on the GPU layer to try to fill in the blanks,” explains Espeland. “That’s led to something much faster than it was 20 years ago, but it can still take eight minutes to process the images for a replay. Our focus has always been [doing it] in real time, and we wanted it to be able to run on a laptop, in the cloud, or on a mobile phone.” 

That’s where all of those little cameras come in. “We have more pixels, more angles, more overlap,” says Espeland. “That allows us to have a cleaner mathematical approach to determine exact color, perspective, and all of those things. Everything is backed by pixel data—we don’t do any approximation.”
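The "cleaner mathematical approach" Espeland alludes to is, at its simplest, classic triangulation: when two (or more) calibrated cameras at known positions both see the same point, its location falls out of ray intersection directly, with no learned fill-in required. The following is an assumption-laden sketch of that principle in the ground plane, not Muybridge's published math:

```python
import math

# Hypothetical sketch of multi-view triangulation: two fixed sensors at known
# positions each report a bearing angle to the same object; intersecting the
# two rays recovers the object's position exactly from pixel-derived data.

def triangulate_2d(cam_a, angle_a, cam_b, angle_b):
    """Intersect two rays (origin point, heading in radians) in the plane."""
    ax, ay = cam_a
    bx, by = cam_b
    dax, day = math.cos(angle_a), math.sin(angle_a)
    dbx, dby = math.cos(angle_b), math.sin(angle_b)
    # Solve A + t*dA = B + s*dB for t via the 2D cross product.
    denom = dax * dby - day * dbx
    t = ((bx - ax) * dby - (by - ay) * dbx) / denom
    return (ax + t * dax, ay + t * day)

# Two sensors 4 m apart both sight a ball at (2, 3).
p = triangulate_2d((0.0, 0.0), math.atan2(3, 2), (4.0, 0.0), math.atan2(3, -2))
print(p)  # -> approximately (2.0, 3.0)
```

With hundreds of overlapping sensors instead of two, every point is seen from many angles at once, which is what lets color and perspective be pinned down by measurement rather than approximation.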


Finding the camera angle

Tennis has been an effective launchpad for the company’s technology. “When we lowered [the cameras] all the way down to the lowest ad boards, social media just exploded,” says Espeland. Muybridge systems were deployed last year at the Miami Open, the Madrid Open, and the U.S. Open. The company has an exclusive partnership with Sony, through its live sports subsidiary Hawk-Eye Innovations, to power all of the ATP Masters tournaments in 2026 (which kick off March 4 with the BNP Paribas Open in Indian Wells, California). “I guess I can say that we will be seen in nearly every tennis tournament [this] year.”

Now the company is targeting additional sports. The key is finding unique perspectives where the technology’s value proposition becomes obvious: a vantage point that makes the sport better to watch at home than in the arena.

For soccer—Muybridge recently ran a test that went live on air with Sky Sports in Germany—that could mean behind the goal and even in the goalposts. For Nascar or Formula 1, producers might actually ring the entire track with sensors (though early discussions have focused on capturing critical turns and pit stops). For baseball, viewers could look out on the field from the dugout.

For hockey—Muybridge is currently working with the NHL and Fox Sports—cameras could be set in the dasher boards, along the ad boards, or up in the concourse to create a “virtual drone” that appears to zoom around the rink from above. 

Crucially, “there’s no speed limitation” with Muybridge, Espeland says. “You can instantly move to wherever you want, and we’re creating all of the millions of pictures in between, just like our eyes do.”  


“Muybridge inside”

Sports, for Muybridge, could just be the start. The company is currently involved in a pilot program that installs its cameras on the ceiling and walls of ambulances, allowing a remote ER doctor with a VR headset to virtually “move around” a patient to evaluate them. 

Security and surveillance represent additional avenues for potential VR expansion, as does an IRL version of the metaverse. “VR headsets never really took off because we always have to visit this virtual world,” Espeland says. “We jump into a room, you’re an avatar, I’m an avatar, but we want to interact with real people.”

News broadcasting and other live studio productions are another developing use case. The CBS Morning Show ran a test of Muybridge’s technology on its New York set in December 2025.  

Moving forward, says Espeland, he has an “Intel inside” philosophy: “We have the core technology, and we look for partners who can represent the next strategic product and bring it into the market.” 
