What's Next? The NFL's Magic Yellow Line Shows the Way to Augmented Reality

Update: Amazon just released Lumberyard, a free AAA game engine deeply integrated with AWS & Twitch.

What’s next? Mobile is entering its comfortable middle age as a development platform. Conversational commerce is a thing, a good thing, but is it really a great thing?

What’s next may be what has been next for decades: augmented reality (AR), along with VR. AR systems will be here sooner than you might think: a matter of years, not decades. Robert Scoble, for example, thinks Meta, an early startup in the AR industry, will be bigger than the Macintosh. More on that in a later post. Magic Leap has no product and $1.3 billion in funding. Facebook has Oculus. Microsoft has HoloLens. Google may release a VR system later this year. Apple is working on VR. The role of the next iPhone is up for grabs.

AR is a Huge Opportunity for Programmers and Startups

This is a technological revolution that will be bigger than mobile. Opportunities in mobile for developers have largely played out. Experience shows that the earlier you get in on a revolution, the better the opportunity. Do you want to be writing free iOS apps forever?

It’s so early that we don’t really know what AR is, what the market will be, or what it means from a developer’s perspective. But if you watched the Super Bowl, you saw an early example of the power of AR: the benign-looking, yet technically impressive, computer-generated yellow first-down line.

Augmented Reality is Already a Sports Reality

Most people don’t realize that when they watch NFL football they are already experiencing augmented reality. We don’t see a real game anymore, and we haven’t for some time; it has become a synthetic experience. Every frame put out over the air is the product of a host of computer algorithms operating on inputs from a real-time data stream.

We already take this for granted, so much so that people think the yellow line is actually part of the field.

pocketchange2247: I was so confused my first time going to a Bears game. There were no announcers and there was no first down line or anything. I had never seen a game without those things so I just assumed they would be there.

That’s what AR will feel like to people: natural, real, and indispensable, only multiplied a millionfold, as even the first personal AR systems will be far more ambitious in how they alter the world.

AR is so effective because our brains already virtualize the world. A stream of sense impressions is woven together and presented to the mind as reality. AR insertions are just more data points to integrate. Programmers will have a huge part in how that new reality is created, as they always have.

How does the yellow line work?

Vox put together a beautifully produced video, The NFL's magic yellow line, explained, that uses great graphics and narration to show how that thin yellow line is added to NFL broadcasts. The process goes something like this (a rough code sketch follows the steps):

Sportvision, the original inventor of the yellow line technology, makes a 3D mathematical model of the field using laser surveying tools.

During the game, each camera’s pan, tilt, and zoom settings are gathered in real time for every single frame.

If the first down is at the 43-yard line, for example, computers combine the camera data with the model of the field to draw the yellow line in the proper perspective, and to redraw it for every frame.

When players obstruct the line, it must be cut away so it looks like it’s underneath them, as if painted on the field. This is done by sampling colors. Think of the field as a giant green screen. A map is made of which shades of green and brown appear on the field under the day’s lighting conditions; those are the only colors the yellow line is allowed to cover.

Colors in the players’ uniforms are also identified so they are never covered by yellow. The system only fails in the most extreme weather conditions, like a blizzard, and even then yardage markers are still inserted into the frame.

All this processing delays the broadcast by less than a second.
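To make the steps concrete, here is a minimal sketch of the two core ideas: projecting a line from a surveyed 3D field model into the frame using per-frame pan, tilt, and zoom data, and compositing yellow only over sampled field colors so players appear to stand on top of the line. Everything below (the camera model, function names, tolerances) is an illustrative assumption, not Sportvision's actual pipeline.

```python
import numpy as np

def rotation(pan_deg, tilt_deg):
    """Camera rotation built from pan (yaw about vertical) and tilt (pitch)."""
    p, t = np.radians(pan_deg), np.radians(tilt_deg)
    pan = np.array([[np.cos(p), -np.sin(p), 0.0],
                    [np.sin(p),  np.cos(p), 0.0],
                    [0.0,        0.0,       1.0]])
    tilt = np.array([[1.0, 0.0,        0.0],
                     [0.0, np.cos(t), -np.sin(t)],
                     [0.0, np.sin(t),  np.cos(t)]])
    return tilt @ pan

def project(points_world, cam_pos, pan_deg, tilt_deg, focal_px,
            image_size=(1280, 720)):
    """Project 3D field points into pixel coordinates with a simple pinhole
    model; the camera's zoom shows up as the focal length in pixels."""
    R = rotation(pan_deg, tilt_deg)
    cam = (np.asarray(points_world, float) - np.asarray(cam_pos, float)) @ R.T
    u = focal_px * cam[:, 0] / cam[:, 2] + image_size[0] / 2.0
    v = focal_px * cam[:, 1] / cam[:, 2] + image_size[1] / 2.0
    return np.stack([u, v], axis=1)

def composite_yellow(frame, line_mask, field_colors, tol=30.0):
    """Paint yellow only where the line mask overlaps a sampled 'field' color,
    so anything that isn't turf (players, refs, the ball) occludes the line."""
    out = frame.copy()
    diffs = (frame[:, :, None, :].astype(float)
             - np.asarray(field_colors, float)[None, None, :, :])
    is_field = np.linalg.norm(diffs, axis=-1).min(axis=-1) < tol
    out[line_mask & is_field] = (255, 255, 0)  # RGB yellow
    return out
```

A real system also has to handle lens distortion, per-camera calibration, and changing light over the course of a game; the sketch only shows the shape of the computation.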

And this shouldn’t be a surprise, though it was to me: this technology is also being used to insert ads into sports broadcasts. The NFL doesn’t insert ads, but other sports do. These ads look like they are part of the stadium or painted on the field; they are called PVI (virtual insertions). Only people watching the video feed see the ads; people in the stadium have no idea they are there or that viewers are seeing them.

In 2001 the yellow line cost $25,000 per game to make.

What Opportunities are There?

Even from the simple yellow line example we can start to see where programmers can add value. There’s an IoT play: as everything becomes enchanted with CPUs and communication abilities, these new devices become potential inputs into AR systems, and what to do with all that data is an open question. There’s an AI play: verbal commands will feel natural, and we still don’t have a programmable Siri-type system.

You can think of your AR glasses painting on the yellow line instead of the broadcaster. Broadcasters will ship video plus data streams that can be combined into a presentation, and you can imagine a lot of good apps around that sort of thing. We are in serious danger of walled gardens here, though, as content owners will want to capture all the value by controlling the experience end-to-end. That will be sad, but inevitable. What will business models look like?
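As a thought experiment, here is one hypothetical shape such a "video plus data" broadcast could take: the broadcaster ships raw video alongside a stream of overlay events, and the viewer's device decides which overlays to render. The field names and event kinds below are assumptions for illustration, not any existing standard.

```python
from dataclasses import dataclass, field

@dataclass
class OverlayEvent:
    """One item in a hypothetical overlay data stream sent alongside the video."""
    timestamp_ms: int        # where in the video timeline the overlay applies
    kind: str                # e.g. "first_down_line", "score_bug", "virtual_ad"
    geometry: list           # field-relative coordinates (yards), not pixels
    style: dict = field(default_factory=dict)
    source: str = "broadcaster"   # could also be a third-party app or another fan

def select_overlays(events, enabled_kinds):
    """The viewer's glasses, not the broadcaster, choose what gets painted."""
    return [e for e in events if e.kind in enabled_kinds]

# Example: a viewer who wants the yellow line but has switched off virtual ads.
events = [
    OverlayEvent(431_000, "first_down_line",
                 [[43, 0, 0], [43, 53.3, 0]], {"color": "yellow"}),
    OverlayEvent(431_000, "virtual_ad",
                 [[0, 0, 0], [10, 10, 0]], {"asset": "sponsor.png"}),
]
visible = select_overlays(events, {"first_down_line"})
```

The interesting design question is who controls that filter; if content owners keep it server-side, we get exactly the walled garden worried about above.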

There are always ads; the fact that virtual ads are already being inserted into games makes that clear enough.

Then there are porn, films, and games, of course. What those will look like will take a long time to figure out. Digital fashion will be huge.

If experience is any teacher, social interaction will be a huge driver. People like to communicate with each other above everything else. What will messaging look like? Will a social network like Facebook make any sense in an AR/VR future? An AR/VR version of Tinder is scary to contemplate. But being able to see, and potentially interact with, goods on an AR/VR version of eBay, Amazon, or Craigslist will be cool.

We’ll have to figure out how to program these shared large scale virtual worlds. That will be a huge challenge. What new hosting services will evolve? Pressure will be put on every part of the compute, network, and storage infrastructure.

Control systems for cars, washers, dryers, and that sort of thing will be quite useful, and certainly better than the mobile experience.

Imagine what the world will be like when people take off their AR glasses or pop out their AR contacts. It could be quite the letdown. We’ll need ways of dealing with that.

This is all so feeble, I know. It’s hard not to be trivial when thinking of all the potential opportunities. And there’s nothing less satisfying than projecting what we already have into a future technology that will change everything. Thankfully we have programmers to explore this undiscovered country. It’s time to start thinking about what route you might take.

On HackerNews

Oculus Developer Kit / Meta 1 Developer Kit

26 essential science fiction novels to get you ready for tomorrow

I was an associate producer on Monday Night Football and ABC College Football Saturday for four years. I was primarily responsible for the yellow first-down line. AMA

The Yellow First-Down Line: An Oral History of a Game Changer

NHL introduces puck- and player-tracking technology

Logopaint - the guys who put the virtual ads in games

The Magic of LiveLine - AC UNCUT

TIL that it takes eight computers, four operators, and a computerized 3D model of the field just to make the little yellow first-down line in TV football games.

Stan Honey combines work with play for new sailing technology

Sports: 3-D TV’s Toughest Challenge

Football Game Technology The Virtual Yellow 1st and Ten Line

The NFL yellow line explained