Enigma: We’d love to talk more about personalization and more broadly about how data science and analytics inform your engineering and design process.
Brian: We are definitely becoming much more data driven. We probably used to be much more intuitive about how we did things. That is the history of the newsroom; it relied on an instinctive sense of what readers want. Our product team for a long time came from news backgrounds, and they were very intuitive product managers. We now have a much more data-minded product organization supported by a much more robust internal data and analytics team. We look at how people are actually using the site, rather than simply guessing that younger people tend to be on their phones more. We actually know these things now in a better way. That is helping us: we are running A/B tests to try things out and really get a better sense of what's adding value, how people are using it, and then how we can evolve things to make that better.
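As a concrete illustration of the A/B-testing mechanics Brian mentions, here is a minimal, hypothetical sketch of deterministic variant assignment. The function name `ab_variant`, the experiment names, and the bucketing scheme are all invented for illustration; this is not The Times' actual experimentation stack.

```python
import hashlib

def ab_variant(user_id: str, experiment: str,
               variants: tuple = ("control", "test")) -> str:
    """Deterministically assign a user to a variant by hashing.

    Hashing (experiment, user_id) together means the same user can land
    in different buckets for different experiments, but always gets the
    same bucket for a given experiment across visits.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]
```

Because the assignment is a pure function of the user and experiment IDs, no per-user state needs to be stored to keep the experience consistent between visits.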
Enigma: The Times is well known as being an editorially driven organization. However, there are trends among Facebook and other tech platforms of letting algorithms curate newsfeeds. We’re curious how data analytics come to inform the editorial side of things at The Times?
Brian: Well, if you look at the homepage as an example, there is no world where you would get to a place where the entire page is algorithmic. That's not our mission. That said, The New York Times publishes roughly 250 pieces of new content a day; a piece could be a story, it could be a video. If we just let an algorithm take over, it's entirely possible that for some people the entire homepage could just be arts and sports.
But that's not a reflection of the world. That is part of our mission: to be a reflection of what is important.
One of the things we definitely hear talking to readers and subscribers is that there is value in having a shared experience of The New York Times. What you see, what I see, what Donald Trump sees, and what Barack Obama sees at the top of The New York Times homepage is the same thing. There is value to that. That's not something we're looking to change. Even if a story isn't the one people want to see at the top of The New York Times homepage, it's the one that's going to be there, because there is an editorial judgment behind it, and there is value to that.
Where we would look to algorithms is to enhance the value we think we're providing, for example by giving editors tools to know when a story has lost readers' interest. There are stories that we put there just because they're interesting, maybe a feature story that we've written about somebody.
Right now, it's a very rough approximation by editors. They'll put the story at the top of the homepage for a couple of hours, and then by some amount of judgment decide it's been up there long enough and put another feature story up there. That's a really crude way of saying, "We want the people who have not yet seen it to see it, but we want the people who have seen it to see something new." Those are the kinds of places where we can use data and personalization to solve that problem in a much more elegant way.
We also have tended to assume that most people come to The New York Times a couple of times every day. That's not reality. So how can we make sure that something we published four days ago, in a four-hour slot on our homepage, gets resurfaced to somebody who clearly would want to know about it?
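The "seen vs. unseen" rule Brian describes can be sketched in a few lines. This is a hypothetical illustration under invented names (`Story`, `rank_features`, `seen_ids`, `max_age_days`), not The Times' actual system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Story:
    story_id: str
    published: datetime

def rank_features(candidates, seen_ids, now, max_age_days=7):
    """Rank feature-story candidates for a homepage slot.

    Stories the reader has not yet seen outrank ones they have already
    seen, even if the unseen story is a few days old; anything older
    than max_age_days drops out entirely.
    """
    fresh = [s for s in candidates
             if now - s.published <= timedelta(days=max_age_days)]
    unseen = sorted((s for s in fresh if s.story_id not in seen_ids),
                    key=lambda s: s.published, reverse=True)
    seen = sorted((s for s in fresh if s.story_id in seen_ids),
                  key=lambda s: s.published, reverse=True)
    return unseen + seen
```

Under this rule, a feature published four days ago still outranks today's feature for a reader who never saw it, which is exactly the resurfacing behavior the editors currently approximate by hand.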
For us, the way that we're using personalization is probably very understandable to users. We're actually debating now whether we need to make it transparent to users, and whether we need to make it knowable that something is happening. I think the ways that we're looking at it right now are pretty easily explainable.
Enigma: How has The Times been thinking about virtual reality?
Brian: I think a lot of those things, like virtual reality and augmented reality, we view as tools for a different way of telling stories. They have inherent business value because they're the kinds of things that people like to sponsor; they get a lot of press, and you can win awards. There is something about the format that is compelling. From a journalistic perspective, there are certain stories for which that is the best way to tell the story, and you can tell stories in new and different ways. That's the way that we really think about it. Because VR is such a high-touch, expensive way to tell stories, we're only going to be able to look for a handful of stories in any given year that we want to tell that way. We're looking for things that we want to be ambitious about.
Enigma: What are you guys doing in augmented reality?
Brian: With augmented reality, we've done one thing so far, around the Olympics, where we essentially had several athletes that you could project into your space, interact with, walk around, and see how they're doing. We have some other things that we're working on. Augmented reality, in a way, is just a way to make graphics human-scale so you can actually interact with them. I think our graphics and interactive folks are really excited about some opportunities to tell stories with augmented reality. Virtual reality too; they're definitely looking at much more interactive virtual reality in particular as a way to tell stories. There is no world where either of these replaces or becomes the default thing that we're doing, because they are still pretty high production effort, but they're very compelling and a lot of fun.
I feel like any time a new technology comes out, we spend a couple of years testing all of its limits. I remember when Google Maps was actively developing its API, there were a lot of map interactives. Then, when charting libraries came out, people started making lots of charts. The tools and the ambitions parallel each other. Now with virtual reality and augmented reality in particular, I think there is going to be a huge technical advance over the next couple of years that will make WebGL, the web versions of these experiences, much more performant, so we can just publish them out. The sensors on phones are going to get better, so we can keep pushing the limits. Each time this happens, it gives us ideas and things that we'll experiment with. Eventually, it'll become kind of normal, and we'll probably use it only when it really, really is necessary.
Don’t miss the final installment of our interview—the next post will go over engineering culture and hiring at The New York Times, plus Brian’s own advice to engineers just starting out.
Interested in joining the Enigma team? We're hiring.
Hero photo: Nic Lehoux