As far back as I can remember, the weather has looked pretty much the same on TV. Sure, the weatherperson might change and the weather conditions do vary, but the format consisting of one person standing in front of a display that shows a flat, abstract representation of the forecast? That’s as unchanging a part of the television landscape as anxiety-inducing news broadcasts and repeats of Friends.
But try saying “if it ain’t broke, don’t fix it” to the folks at The Weather Channel. The channel this week introduced a new “Virtual Views” twist on television weather. It uses mixed-reality technology to transport the network’s on-camera meteorologists to various cities to show off the weather before it happens. The innovative Virtual Views IMR segments are being integrated into The Weather Channel’s daily live programming and forecasts, and feature landscapes and backdrops from about 50 cities around the U.S.
“Weather broadcasts are unique in that, because we are providing information on future events, there is no video footage of that event,” Mike Chesterfield, senior director of weather presentation at The Weather Channel, told Digital Trends. “While traditional newscasts have the ability to show video of what happened as a clear way to tell a particular story, we often do not have that advantage because the events we are providing information on have yet to occur. By creating hyperrealistic simulations based on science and hard data, and adding the expertise of the meteorologists who are immersed within these environments, we now have a future-facing video product that allows us to more clearly convey the messages that we are trying to get across.”
Chesterfield described the feature as a “real revelation” that allows weather presenters to show, rather than simply tell, audiences what the weather is going to look like over the next few days.
Warren Drones, senior technical artist at The Weather Channel, explained that the setup still involves the classic greenscreen arrangement memorably shown in the movie Groundhog Day. However, the graphics are provided by an Unreal Engine-based real-time rendering system that allows for realistic 3D graphics — and some carefully implemented camera moves.
“While the graphics are displayed on this engine, the position of the camera is sent as data to the Reality Engine, syncing the view of virtual elements to their positioning in the greenscreen space,” Drones told Digital Trends. “Maintaining and driving these complex systems during live shows can only be done when everyone involved has an understanding of how the tools work and what they can do.”
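The core idea Drones describes — streaming the physical camera’s position into the rendering engine so the virtual scene stays locked to the greenscreen space — can be sketched in a few lines. The snippet below is a simplified, hypothetical illustration only: the packet layout, field names, and `VirtualCamera` class are assumptions for demonstration, not The Weather Channel’s actual pipeline (real studios typically use dedicated tracking protocols feeding Unreal Engine directly).

```python
import struct
from dataclasses import dataclass

# Hypothetical tracking packet: six 32-bit floats describing the
# physical studio camera's pose — x, y, z position in meters and
# pan, tilt, roll in degrees. Broadcast tracking protocols carry
# a similar per-frame pose payload.
POSE_FORMAT = "<6f"

@dataclass
class VirtualCamera:
    """Stand-in for the render engine's camera object."""
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0)

    def apply_pose(self, packet: bytes) -> None:
        # Unpack one frame of tracking data and mirror the
        # physical camera, so virtual elements hold their place
        # in the greenscreen space as the real camera moves.
        x, y, z, pan, tilt, roll = struct.unpack(POSE_FORMAT, packet)
        self.position = (x, y, z)
        self.rotation = (pan, tilt, roll)

# Simulate one frame of tracking data arriving from the studio camera.
packet = struct.pack(POSE_FORMAT, 1.5, 1.75, -3.0, 12.0, -4.5, 0.0)
cam = VirtualCamera()
cam.apply_pose(packet)
print(cam.position)  # (1.5, 1.75, -3.0)
```

In a live system this update would run for every frame of video, which is why Drones stresses that operators need to understand the tools: any lag or mismatch between the tracked pose and the rendered view breaks the illusion immediately.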