Eric Goldsmith (AOL)

Podcast Transcription

Joshua Bixby: Hello. It's Joshua again, President of Strangeloop Networks. Welcome back to the next installment in the Web Performance Today podcast series. When I was at Velocity in London in October, I had a chance to sit down with Cliff Crocker. Cliff, who is a friend, has a really neat perspective on performance because he has worked on both sides of the wall. He spent time as a consultant at Keynote, then crossed the wall and started to work at Walmart, where he was the senior engineering manager, which is where I met him while selling front-end optimization into Walmart. We became friends. We were both driving towards the same goal, which was trying to show Walmart that performance mattered. It was a friendship that struck up there and has continued through his new role as VP of Product at SOASTA, which is a cool company, and I'm gonna spend a bit of time talking about them in one of the podcasts coming up. I got a chance to talk to Cliff about his experience in all three companies, the emergence of big data, among other things. I hope you enjoy. Here at Velocity Europe with my friend, Cliff Crocker, formerly of Walmart fame, currently of more famous SOASTA fame. Hey, buddy, how are you doing?
Cliff Crocker: Doing well, doing well. How are you?
Joshua Bixby: I’m great. Thanks for joining me. Tell me about Velocity Europe. How is it going for you?
Cliff Crocker: It’s my first time to Europe period and to come here for a web performance conference I think just really makes it kind of the best trip that I’ve been on for a while. It is going great.  I think that Velocity Europe seems to be raising the bar from what I have seen in terms of the talks and the presentation and slides that came out of Berlin last year and then the quality of the talks and the people here speaking today, just, it’s amazing. It’s going really well.
 Joshua Bixby: Nice. See, they’re loving us in the background too. We’re actually, like, right in the middle here, you can hear the applause from the crowd.
Cliff Crocker: Yeah.
Joshua Bixby: That wasn’t for us, but…
Cliff Crocker: I think it was.
Joshua Bixby: You think we can take credit for that? People are filing past us as we’re chatting here. We’re sort of tucked into corner. I want to rewind to Keynote days.
 Cliff Crocker: Yeah, absolutely. 
Joshua Bixby: How long were you in Keynote? 
Cliff Crocker: I was at Keynote for six years, I believe.
Joshua Bixby: And you started as?
Cliff Crocker: I started as actually doing load testing, as a load-testing consultant. I was coming in to do web performance consulting, but interestingly enough that team wasn't growing and wasn't big enough at the time, although a really good friend of mine there, Ned Rushlow [Phonetic] [0:02:22] was doing a great job, kind of, early days doing evangelism, so I jumped onto the load testing team and worked there and did load testing for several e-commerce sites for a few years, then switched over into web performance consulting.
Joshua Bixby: So, tell me about the early days of web performance consulting. This would have been, what, 2008?
Cliff Crocker: Yeah, I guess it would've been 2007 or 2008, potentially.
Joshua Bixby: How was the world different back then than it is now?
Cliff Crocker: Well, it is interesting. I think this industry moves so fast; however, I think, honestly, we still have so many of the same things that we're still talking about today that we were talking about back then. So, it was interesting. I think people kind of were not as attuned to the fact that performance had an impact on the bottom line, so it was a bigger challenge back then than it is today, but I still find myself having the same conversations that I had, you know, six years ago.
Joshua Bixby: So, fundamentally, the similar dialogue?
Cliff Crocker: Absolutely.
Joshua Bixby: Hey, speed’s important, why, maybe you didn’t have the artillery before to tell people, now there seems to be a lot of artillery out there, but…
Cliff Crocker: Yeah, and I think that the performance of browsers back then made it a lot easier to be a consultant, because there were so many best practices that you could take into account and things you could do, and now that the browsers are getting better and faster, there are obviously still a lot of optimizations that hold true, but at the time it was a lot easier to kind of go through your checklist, prior to or right around the YSlow days and things as they were coming out, and right around, you know, the book, when the book was published.
Joshua Bixby: The book. I like that. Steve would like that. What were your lessons from those web consulting days, like, what do you take into SOASTA now that you guys are doing, you know, also in the performance business, what from that time, when that new kid comes into SOASTA and works in your division and you’re saying, listen man, over a beer, let me tell you about the good old days, there were some key lessons I learned, what would you share, grandfather like?
Cliff Crocker: I think the biggest thing, because interestingly, and I will rewind for one second, before SOASTA, after leaving Keynote, I went to Walmart. So really I was able to, and I knew this was part of my journey and part of my career, is that I wanted to hop onto the other side of the fence and kind of, you know, see what it was like and what I had been telling people all these years: what were the real challenges, why was it so hard to, you know, combine the JavaScript or, you know, do any number of optimizations that should be simple and easy to do. So I wanted to get a taste for why that was hard, and I think what I learnt at the enterprise level, and especially when we're dealing with a site that is so large and sort of forced to move so slowly, is that there is a lot you learn about patience and a lot you learn about choosing your battles. And it is the same old truth: now that I'm a vendor again and we're back on that other side, I think that, you know, you can't make the assumption that you know exactly what's going on, on the other side of the fence, and just because something is easy to do in practice doesn't mean that it is easy to do in process.
Joshua Bixby: Yeah, and it doesn’t mean they’re idiots.
Cliff Crocker: Yes, exactly, right cos they’re not, I mean, they’re really smart guys, I mean…
Joshua Bixby: So, you evolved out of Keynote. How did you get the Walmart gig?
Cliff Crocker: Actually, it was a guy that I had been doing consulting with on the Walmart side that finally said, man, why don't you just come over, just come over here and do this for us. A lot of that was because they wanted to focus a lot on load testing as well and sort of build up a center of excellence. But Subir, my boss at the time, Subir Sengupta, brought me in and was just great and kind of gave me a bunch of leash and just let me run and build a team and really work on sort of bringing performance in as a culture more than anything at Walmart.
Joshua Bixby: And you guys did some amazing stuff. You were there for two years, three years?
Cliff Crocker: Yep, two years.
Joshua Bixby: I mean, you guys, as far as I can tell were one of the leading, if not the leading organization looking at this stuff.
Cliff Crocker: Well, thanks. I think definitely we tried to raise the level of awareness and we tried to, again, change the culture, and I think that we got some sponsorship, executive sponsorship really, interestingly enough, not on the engineering side, but more on the business side, that said, you know what, this is important, you guys run with it. It's okay if you go and talk about it. It's okay if you go and talk about the fact that Walmart isn't the fastest site on the Internet and the things that we're trying to do to fix that. So I think it was about community, it was about awareness. A big huge thing for me was actually the web perf meetups that are happening, and specifically the San Francisco web perf meetup that Aaron Kulick founded. I got to meet Aaron very early on when I was actually looking for performance engineers and then started really getting tied into that community. That's where I met Buddy, who you'll talk about, from LogNormal, you know, the company that SOASTA has just acquired, as well as Philip Tellis, and then, you know, Aaron Kulick, who is, you know, a very dear friend and still fighting the fight with Walmart.
Joshua Bixby: Brilliant, brilliant man.
Cliff Crocker: Yes, he is. Yes, he is.
Joshua Bixby: Where did vendors go wrong at Walmart, I mean, as you cross the bridge and were on the Walmart side, you were attacked by vendors, of course, you were a target. What did they do wrong? What were the flaws that you saw continuously from enterprise sales guys trying to sell to Walmart?
Cliff Crocker: You know, I guess there was a lot of assumption, there were a lot of fear tactics that were tried, right? I mean, the vendors coming in and saying, you know, are you just gonna let your site crash, or the other guys are, you know, better than you, they're faster than you. I think that there was just not as much empathy, and there wasn't as much trying to really understand the positioning and understand that, you know, Walmart is a large organization; they're not gonna move extremely fast on the sell side, but that shouldn't discourage vendors from actually coming in and trying to work with them. I think patience was probably the biggest thing.
Joshua Bixby: Just didn’t last…
Cliff Crocker: It’s a long sales cycle and I hate it on this side, I hated it on the Keynote side, but I understand where it comes from.
Joshua Bixby: I know what it is like to sell into Walmart. You know that.
Cliff Crocker: Absolutely. Absolutely, you do. You do. But, you know, you guys kept smiling and there definitely was a lot of patience.
Joshua Bixby: I think it’s the Canadian side of us.
Cliff Crocker: I think it is.
Joshua Bixby: Because there is a side of that. Coming back to the Walmart side, cos I am always fascinated by the business model that Walmart has, forget the technical side, which is using some of this data, working with vendors very closely to optimize the shopping experience and the price for the shopping experience. I mean, everything that I hear about how they treat vendors, whether you like it or not, is that there are close relationships, there's strong direction. Was part of that culture being brought over on the technical side? I mean, was part of the idea that, we figured out data for warehousing and shipping to stores, and we should figure out how data can be used on the website? Was there any crossover there, was there anything you pulled from that culture?
Cliff Crocker: I would say, well, we definitely pulled a lot of things from the vendor management perspective, and some of the people there on the vendor management side today made it very clear that, you know, our success depends on the shoulders of all these vendors that we work with, so that culture was definitely there. However, I think with some of the separation between Bud and Bill and, you know, the e-commerce side at Walmart and the website itself, there was a pretty big disconnect. It wasn't until the formation of Walmart Labs that I started to see some of that come on, and actually teams that were dedicated to data and dedicated to big data. That's where we got to start playing with people and playing with the cool toys and, you know, the very large Hadoop clusters, and obviously we got Boomerang up and running at Walmart, started collecting all this rich data, and now that is actually driving a lot of what's going on on that side. It is just about better understanding the customer, being closer to the customer: who is this customer that is shopping at a Walmart versus an Asda in the UK versus, you know, Sam's Club, that type of thing. And then the mobile story gets extremely interesting and, you know, I don't know if you saw Dion and Ben talking this morning, but they're always great to watch and great to talk to.
Joshua Bixby: They are. They have a great interplay between the two of them, like, it’s a good, sort of, on stage gig they have.
Cliff Crocker: We were chatting last night in the lobby and they were finishing each other's sentences every other word. Obviously those guys have been doing some cool stuff for years, but the things they're doing and the opportunities they have there in mobile, if you think about the number of people that are in a Walmart store or in some type of a property all across the US at any time with a smartphone in their pocket, you start to connect the dots and see what a huge opportunity that is, and the innovation that goes on there between those gentlemen as well as the great team that they have there, the mobile team at Walmart Labs, you know, it's fun stuff. It's pretty cool.
Joshua Bixby: I don’t know if you read that article in New York Times about Target and how they use data…
Cliff Crocker: Yeah, the whole thing about the dad finding out that his daughter is pregnant, yeah.
Joshua Bixby: How did that type of thing resonate in how you guys were using data and thinking about data, you know, and I ask this also from a perspective of SOASTA, I mean, there are things we can find out about people, what they’re looking at, how much they’re spending, when you think of that whole challenge of privacy around this, how do you think about that?
Cliff Crocker: I think, you know, there is a creepiness factor, right? There is always a creepiness factor and a big brother factor thinking about, oh my gosh, you know, just me as a consumer thinking about I don’t know that I want everyone knowing all this stuff about me and, you know, predicting, you know, what I’m doing. I just joined a CrossFit last week and I need to, you know, be buying these products or, you know, longer socks or whatever it might be, but to be honest I think that it’s actually, it’s not an intention of trying to, you know, invade privacy or, you know, predict, you know, all types of things to really maximize revenue, it’s more of a competitive advantage and something that, you know, all companies are having to do if they want to serve the customer better, right? So, I love it. I think that it’s great. I think big data is amazing. I think that what we’re doing at SOASTA or beginning to kind of embark on with Buddy and Philip from LogNormal and me coming in and having experience from Walmart and really delivering that product line, I think it’s not always gonna be about that performance, it’s gonna really be about human behavior and how can we sort of predict what that user is going to do next or what their behavior might be on a site, so we can think more about what should we actually be testing, on which we actually should be spending our time, what should we be optimizing and, you know, what kind of things and behaviors are driving people away. Buddy introduced something yesterday that he has talked about before called the LD50. 
The lethal dose, basically: where you guys have the performance poverty line at Strangeloop, our shtick may be the LD50, where you're looking at what point the user starts bouncing or exiting a site when performance gets to that level, and sort of understanding that in a broad spectrum, but in a multifaceted way where you're looking at multiple dimensions, whether it is browser, device type, geography, whatever it might be. It's just really about creating better quality and also providing more input into that whole development life cycle, where functional requirements come in from the business and marketing, all the way through to where we're supporting production.
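[Editor's note: the LD50 analysis Cliff describes can be sketched in a few lines of code. This is purely an illustrative sketch, not SOASTA's implementation; the field names, bucket size, and sample data are all made up for the example.]

```python
# Illustrative LD50 sketch: bucket real-user page views by load time and
# find the first bucket where the bounce rate crosses a threshold (50%).
from collections import defaultdict

def ld50(page_views, bucket_ms=1000, threshold=0.5):
    """Return the lowest load-time bucket (ms) whose bounce rate
    meets or exceeds `threshold`, or None if no bucket does."""
    totals = defaultdict(int)
    bounces = defaultdict(int)
    for view in page_views:
        bucket = (view["load_ms"] // bucket_ms) * bucket_ms
        totals[bucket] += 1
        if view["bounced"]:
            bounces[bucket] += 1
    for bucket in sorted(totals):
        if bounces[bucket] / totals[bucket] >= threshold:
            return bucket
    return None

views = (
    [{"load_ms": 800, "bounced": False}] * 9
    + [{"load_ms": 900, "bounced": True}] * 1
    + [{"load_ms": 3200, "bounced": True}] * 6
    + [{"load_ms": 3400, "bounced": False}] * 4
)
print(ld50(views))  # -> 3000 (the 3-4s bucket is where over half bounce)
```

A real analysis would, as Cliff notes, slice this by browser, device type, and geography rather than over all traffic at once.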
Joshua Bixby: Yeah. I saw an interesting startup today in the New York Times, New York based. Its goal is to allow people to sell their information, so, you know, I could track all of my browsing history and then sell it back to either a conglomerate of vendors or vendors who might want to buy that information.
Cliff Crocker: Wow.
Joshua Bixby: Yeah. I don’t know where it’s gonna go, but I thought, you know, this whole world of privacy and tracking and what we do and how we interact with it, how we time it, it’s interesting and I thought that, you know, it definitely caught my interest. I was like wow, that’s interesting.
Cliff Crocker: Yeah. Crowdsourcing is very powerful, and I think what we wanna do is, well, start to feed all that data back into the industry, whether it's in a way that's free, that we can all kind of consume it and understand how different industries perform, how different verticals perform, so we can kind of provide more contextual intelligence and get more contextual intelligence into our own data. It's extremely powerful, and it's really why I'm doing it, cos the questions always change and there are always more things you can do with the data.
Joshua Bixby: So, you have this great gig at Walmart, you’re collecting information, you’re analyzing it, you’re in big data heaven and SOASTA obviously had something pretty attractive. What attracted you, I mean, I don’t think SOASTA is a household name for most people yet, so I guess, give us, what was attractive about it and why, like, you’re a high-value asset, why did you go there?
Cliff Crocker: You know, I think that I had given myself a timeline and said, hey, I wanna do this for a while, but I don't wanna stay here, I wanna move, I wanna actually, you know, do something that's gonna have more impact at a global level as opposed to use within one organization. So it was parting on great terms, first of all, with Walmart, but really the reason that I thought SOASTA was so attractive was back to your point about vendors and partnerships. Before I even get into the technology and the things they're good at, they were a company with a lot of integrity. You know, when I was dealing with a sales rep it wasn't like dealing with a sales rep, it was dealing with an account manager and a guy that I could call and talk to and, you know…
Joshua Bixby: Who knew something about the product.
Cliff Crocker: Exactly, yeah.
Joshua Bixby: Not just what the discounts were this quarter.
Cliff Crocker: Exactly. Exactly. You know, not how much he was gonna let me beat him up on price or whatever; it was more about getting the work done, and just the innovative people there and how much the people really loved working there and really loved the direction of the company. But all that aside, and that's all nice and warm and fuzzy, these guys are really smart, these guys have done this. You know, the two founders, Ken Gardner, who I report to, the executive chairman, and Tom Lounibos, have been doing this for a while and have had several other successful startups. The thing that was really attractive about Ken, and the reason that we work well together, was really his history with real-time analytics and data. That was something that they started doing years ago and had perfected with one of their previous companies, and we started putting the dots together and realizing, hey, we can start to deal with this data in a real-time way and also do it with some real-time visualizations. And the visualization part of it kind of had me sold in terms of, you know, playing with the data, making it beautiful, breaking the mold, you know, not just developing yet another monitoring product, but doing something that could be different and really kind of change the way people look at real user measurement.
Joshua Bixby: And then you guys obviously bought a real user measurement company two days ago, yesterday. Tell me about that.
Cliff Crocker: Yeah. I might've had a little something to do with that. So, Buddy was actually the guy who I brought on to help me get up and running with Boomerang at Walmart, and I, again, met him through the web perf meetup, through Aaron's meetup there, and I always loved working with Buddy, but he wasn't a guy that I was ever gonna be able to hire, and he and Philip started this great thing with LogNormal. I loved the approach that they were taking with the data, a very statistical and analytical approach to measuring data, and a sensible way of doing it. So, you know, rather than kind of strike out and, you know, try and find and build a team of people that could adapt and learn about performance and really sort of build that up again, which would take some time and is hard to do, I thought what a great idea to actually go out and talk to these guys and see if our interests were in line enough for them to actually come join us and do this together. So, it was a process of several months, talking about partnership, looking at, you know, do we do this ourselves, how do we actually win in this space, and it just ended up making sense, and I, you know, couldn't be happier about the decision. I'm very excited to work with those guys, very humbled, and, you know, when I sit down and talk with Philip or Buddy about their ideas and their thoughts, it's just exciting and it's fun again, so.
Joshua Bixby: And, I mean, I've met Buddy on a number of occasions and I echo your sentiments, smart dude; we're actually gonna have him on the podcast here, so people will get to hear for themselves. Tell me about where measurement goes. I mean, as you said, and I've always thought this, the market hasn't changed that much over the last 10 years. I mean, the browsers have changed, but not much else has. We have more simulation of real end users, we're getting closer, you know, people are understanding that you have to use a real browser, and it's important to have a location that mimics a real end user and a bandwidth that mirrors a real end user. What's changing?
Cliff Crocker: Well, I think what's changing is the fact that simulations, you know, while they have served us well and they have done a good job to this point, there is absolutely no replacement for human behavior, and I think that's really what's driving this: instead of a world where we base everything on synthetic and live and die by our Gomez, Keynote, WebPagetest numbers, I'd rather live in a world where we're actually basing it on what we're measuring from the end user. So, I think leading with that, as opposed to leading with synthetic, is really what's changing and how it is going to be different, and also, you know, setting some new metrics, or coming up with some new metrics, that just make more sense. I think what we're big on is the fact that, you know, whether you're at Walmart or, you know, Target or whatever site it might be, what's important to that company is their number and what their goal is, and really looking at their own data instead of trying to base their, you know, their studies on something that Google did or Amazon did or Shopzilla did or Walmart did. Take that same model and really see, you know, what makes sense for our customers and our users: where is the LD50 or performance poverty line for our business, as opposed to, you know, the rest of the industry.
Joshua Bixby: And do you think it’s varied? I mean, do you think Walmart is different than Target?
Cliff Crocker: Well, I think that everyone and every company that I’ve ever talked to or consulted with thinks they’re different and I think that’s…
Joshua Bixby: Are they?
Cliff Crocker: I think that there are different types of customers. I think that, you know, whether you're…if you're selling something at Walmart that someone could easily pop over to Amazon and buy because the page is slower and it's not loading, it's different than if you're a specialty store and selling something that someone absolutely has to have from that store, like a cheese shop or something, right? So, I think that they are different. I think that what's not different is really the fact that we do see some level of drop-off, we do see some level of engagement where people are just getting more and more impatient, or more expecting, you know, to be delighted, as Philip would put it, you know, by providers like, you know, Walmart and everyone else. So I think that expectations are growing at the same rate, but I think that behaviors might vary enough, and not even just between industries. You know, one of the studies that we did, that you guys have posted on your blog as well and talked about, was basically how performance tolerance changes depending on the type of product you're looking at: that patience or that tolerance for someone who is buying something like an iPad is going to be, you know, much greater than someone who is buying PowerBait or, you know, something else off the site, right?
Joshua Bixby:  Yeah and that’s very interesting, almost within segments of the product line one has different tolerance, right?
Cliff Crocker:  Exactly.
Joshua Bixby:  I know that for myself.  I’m buying a car right now and I’ll spend a good 20 minutes on a page, you know, and I’ll wait for it because I want to see what capabilities and…
Cliff Crocker: A very Flash-heavy page.
Joshua Bixby: Yeah, I mean, I don’t like it, but I definitely wait because, you know, it’s a big purchase and I’m gonna take some time, so I get that, I mean, as somebody who preaches that every second counts, I must admit, in my own behavior sometimes I definitely will spend more time on one thing than I will on the other. That makes a lot of sense to me.
Cliff Crocker: And I think, you know, back to sort of your original point, that it hasn't really changed that much. I think that it has and it hasn't, but also this whole mobile thing that, you know, isn't just a fad anymore, has really changed the game as well, because it has made it harder again for developers to give the fast user experience that people are expecting, you know, the same speeds over carrier lines that they get over, you know, DSL or whatever.
Joshua Bixby: Yeah. No, I know, that's something I've been spending my time thinking about and presenting on, which is how do cell phones work and why are they slow, and, you know, that's definitely a real area of interest.
Cliff Crocker: Right.
Joshua Bixby: What have you learnt at Velocity that’s new, net new, like, you’re gonna take back to the shop and say, holy smokes! Everyone needs to download this slide, anything?
Cliff Crocker: Well, actually, I, embarrassingly, wasn’t able to attend your talk.
Joshua Bixby: So, there you go. That was like six of the slides I was thinking of; that was like a softball for you. Other than the genius that I presented, any other one or two slides that stick out?
Cliff Crocker: And it was only because Steve organized the track in a way that LogNormal was presenting at the same time as you were, otherwise I would've been there.
Joshua Bixby: That’s true. That’s true. You’re one of the only ones that has an excuse, although someone else had to do an intro so I figured they had an excuse too, I can’t remember who it was.
Cliff Crocker: I think aside from that, which I’m sure to be inspired by and motivated by…
Joshua Bixby: I love it. This is good. Keep going. I can handle this forever.
Cliff Crocker: I think, actually, one session I just came out of was with Pat Meenan, who I'm a huge fan of; he's my hero. He was doing his whole presentation that had nothing to do with WebPagetest, but had everything to do with single point of failure and SPOF-O-Matic, and I think that's been around, you know, for a while, and then it sort of died off, and now it's getting really hot again at Velocity in the US and here. It's something that I can talk to customers about from a performance perspective, in terms of being ready for the holidays or something. It is not just about load testing, but look for these things too, so I think SPOF-O-Matic is great. I'm excited about that.
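[Editor's note: the single-point-of-failure problem SPOF-O-Matic flags is a third-party script loaded synchronously, which can hang the whole page if its host stalls. The sketch below illustrates the idea only; the regex-based parsing, function name, and sample HTML are simplifications made up for this example, not the actual tool's logic.]

```python
# Rough sketch: flag external scripts loaded without async/defer from a
# host other than the page's own, i.e. potential single points of failure.
import re
from urllib.parse import urlparse

def find_spof_candidates(html, page_host):
    """Return external script URLs loaded synchronously from a host
    other than `page_host`."""
    candidates = []
    for tag in re.findall(r"<script\b[^>]*>", html, re.IGNORECASE):
        src = re.search(r'src=["\']([^"\']+)["\']', tag)
        if not src:
            continue  # inline script: no third-party fetch to block on
        if re.search(r"\basync\b|\bdefer\b", tag):
            continue  # non-blocking load, not a SPOF risk
        host = urlparse(src.group(1)).netloc
        if host and host != page_host:
            candidates.append(src.group(1))
    return candidates

html = (
    '<script src="https://www.example.com/app.js"></script>'
    '<script src="https://thirdparty.example.net/widget.js"></script>'
    '<script async src="https://ads.example.org/ad.js"></script>'
)
print(find_spof_candidates(html, "www.example.com"))
# -> ['https://thirdparty.example.net/widget.js']
```

The browser-extension approach (and WebPagetest's SPOF testing) goes further by actually blackholing the third-party host and measuring the impact on load time.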
Joshua Bixby: That’s cool. I was beta testing that early on and loved it, like, I was so excited using it, so I’m a big fan of that stuff too.
Cliff Crocker: Yeah. Absolutely. So, if I had to say there’s obviously something I learn with every talk even if it’s someone who has recycled their slides 12 times and I’m not gonna call out any names or anything, but I think that every single one of those talks…
Joshua Bixby: We all recycle a bit, but there should be some, you know, especially at a conference of this magnitude, you should put some original thought into it, I think.
Cliff Crocker: Yeah. I was actually, you know, sort of silently referring to Steve’s slides that aren’t so great, but… 
Joshua Bixby: Of all the guys, he is kind of known for the same shtick.
Cliff Crocker:  Absolutely.
Joshua Bixby: It's like when you go to McDonald's, you expect a burger to be the same every time, you know.
Cliff Crocker: But you get something new every single time. I take something new from it every time I hear it, or I hear something in a different way, or I think of a new question to ask, and what I love about Velocity is the fact that it's the hallway track…
Joshua Bixby: Yeah, we can hear it in the background here, you can hear the hallway track, we’re in the hallway track.
Cliff Crocker: Absolutely. It's the networking, it's the talking, it's the ideas and the community, and the companies that get started and the companies that have successful exits that all circle around Velocity, that I think is just amazing. So, it's an amazing community. It's been amazing to me. I mean, I have a lot to be thankful for with this whole movement, or whatever you wanna call it, because it certainly has created endless opportunities for me.
Joshua Bixby: No, it’s a good tribe. Cliff, thank you.
Cliff Crocker: Yeah.
Joshua Bixby: Take care. I want you to get out, I’m gonna stop this so you can get out.  I see some clouds over there…
Cliff Crocker: That’s right.
Joshua Bixby:  …kind of in the background.
Cliff Crocker: That’s right, yeah.
Joshua Bixby: So, get out there and enjoy London.
Cliff Crocker: And what was the name of the place again, one more time?
Joshua Bixby: Borough Market.
Cliff Crocker: Borough Market, got it, got it. 
Joshua Bixby: Right near London Bridge. 
Cliff Crocker: Thanks.
Joshua Bixby: That’s my current…I don’t know when it’s open till, but it’s definitely a lunch place.  It’s fantastic.
Cliff Crocker: Okay. Excellent. Thanks for having me.
Joshua Bixby: Okay. Take care. Well, that was great. Thanks for listening, and thanks again to Cliff for taking the time out of what was a very, very busy schedule to chat at Velocity. All the links and slides that Cliff and I spoke about are available on the blog. If you have a suggestion for a future podcast topic or guest, please drop me a line. Any and all suggestions are welcome. I've had some crazy ones and I'm trying to book some of those crazy chats, and I look forward to more. Have a wonderful day. Thanks for listening.

Joshua Bixby: I am Joshua Bixby, President of Strangeloop Networks. Welcome back to the Web Performance Today podcast. This week I am talking to Eric Goldsmith, Operations Architect at AOL. Eric is a performance evangelist in his public life, and in his workday life he is in the extremely enviable position of being able to leverage large-scale analytics technologies to collect and analyze data from AOL websites, something I am quite envious of. He has been a pioneer in mining this data, and it was a pleasure to talk to him about why we have to think like data scientists, why teaching stats in a corporate culture is an uphill battle, and how the RUM world has changed in the past seven years. I hope you enjoy.

Joshua Bixby: Here with Eric Goldsmith, architect at AOL, long time architect.  Eric how are you doing?

Eric Goldsmith:  Good.  How are you?

Joshua Bixby: I am wonderful. You are one of the unsung heroes of our community, as someone that probably has more cumulative experience in big data and performance than most of the people listening altogether. Tell me about your experience at AOL. Tell me about the last, I guess, 9 years.

Eric Goldsmith: I've been here 9 years, that's right, and during that time I did a lot of work in the 2005-to-2010 time frame with web performance, building up tools and data collection to bring visibility of that throughout the company, and of late I have been working more on the big data side of things.  You mentioned that I am an architect.  I am in the data technologies organization within AOL, and we collect a lot of data from both synthetic measurement tools and end-user instrumented tools, i.e. real user metrics.  We collect all that data and provide analysis and reporting services that give insight to the business.

Joshua Bixby: And you are one of the original WebPageTest guys.  Tell me about the birth of that.

Eric Goldsmith: Well, it was originally an internal tool that grew out of the modem days, when AOL was modem based. It was actually originally called a war dialer, because it was literally dialing modems.  Then in about the 2005-2006 timeframe that tool was transitioned to be more web focused, web facing, and it was built out to support the extra information that could be collected from browsers (the waterfalls and so forth), as well as multiple geographic locations, to get that additional insight.

Joshua Bixby:  So who were the main guys? I mean I know you are obviously one of them, Patrick is.  Who else would you sort of put the mantle around?

Eric Goldsmith: Well, there were three musketeers: Patrick Meenan, myself, and David Artz.

Joshua Bixby:  David Artz, that’s right of course.

Eric Goldsmith:  Yeah.

Joshua Bixby: And I do not know if I said this to you before, but you are the best-looking face WebPageTest has ever had.  No offense to Patrick, but when you got up there at Velocity and spoke, I think all the ladies, the two ladies, were swooning.  Patrick has no ladies in his sessions as far as I can tell; the two ladies go somewhere else.  There was an article that came through one of the Twitter feeds, something about controlled experiments and puzzling outcomes.  Was that you?

Eric Goldsmith:  Yes, I re-tweeted that.

Joshua Bixby: Re-tweeted it.  It is a Microsoft study, right? It is interesting.  It makes me think about this idea of RUM and actionable metrics. That one was fascinating because it is actually something we run into a lot, which is that these things are sometimes counterintuitive. Did that speak to some of the things you do in your job?

Eric Goldsmith: It does. To bring it back to big data just momentarily, a lot of people think, boy, I'm just going to collect all this stuff, throw it in a Hadoop cluster, and then go analyze it and start looking for insight, and that's really not the right approach in my opinion.  It is a starting point. That type of data mining, as I call it, is great for establishing correlations, but it doesn't necessarily get you to causation; it can't prove causation, so it's a starting point.  My little mantra is: mine the data for correlation, then experiment for causation.  Once you start seeing some correlations that look interesting, then you've got to really start doing experimentation. You get into putting together hypotheses, looking at what data you need to prove or disprove the hypotheses, ensuring that you've got that data, or that you're going to be able to get it, and then collecting and analyzing it with the hypothesis in mind.  The right way to do this stuff is a little more sophisticated than a lot of people are led to believe, or want to believe.
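Eric's "mine for correlation" step can be sketched in a few lines. This is an illustrative TypeScript snippet, not AOL's actual tooling, and the metric pairing in the comment is invented for the example:

```typescript
// Pearson correlation between two equal-length metric series,
// e.g. daily median load time vs. daily bounce rate.
// A value near +1 or -1 suggests a relationship worth designing
// an experiment around; it proves nothing about causation.
function pearson(x: number[], y: number[]): number {
  const n = x.length;
  const mean = (a: number[]) => a.reduce((s, v) => s + v, 0) / n;
  const mx = mean(x);
  const my = mean(y);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < n; i++) {
    num += (x[i] - mx) * (y[i] - my);
    dx += (x[i] - mx) ** 2;
    dy += (y[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}
```

A strong correlation here is only the starting point; the experiment, with a hypothesis and a control, comes next.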

Joshua Bixby: I do not know if you follow the podcast This American Life, but they had an episode recently, I think it was called So Crazy It Just Might Work, about a cancer researcher. The guy had a machine, the one that hit it with sound waves, that starts with an R; I cannot remember, I am trying to think.

Eric Goldsmith:  I cannot remember either.

Joshua Bixby: So the musician and the researcher got together, and it actually reminded me of what you are saying, because this cancer researcher kept saying, this is science, you have to control, you have to experiment.  The controls have to be right.  It has to be reproducible.  And the artist was saying, well, I saw cancer cells die. I believe it. It reminds me of how data science in our world is becoming so important, and that the approach a data scientist takes is very different than the approach a data artist takes, right? It just reminded me, and that was a great episode by the way, of that idea, which is that we have to be scientists: create hypotheses, look at the data, and do experiments. But one of the things that has always struck me about Velocity culture in particular, though I think this extends out to the culture of the large tech giants, is that you with ease have a cluster of 100 Hadoop servers.  You with ease have segmentation platforms, and you can easily take 1% of your traffic and get statistical validity in 12 seconds, or whatever the Facebook guys always talk about, but that is not actually accessible to most people.

I mean, most people have a hard time even getting the basics off the ground.  So if you were to extract yourself out of the AOL world, and I know that you have worked in other areas, but if you were to talk to an architect at a small or medium-sized dot com, someone who doesn't have all that, what would you advise them about this idea? Because it comes off your lips as if, you know, let's experiment, let's collect all that data. Well, that is easier for you.  I know it is not easy, but it is easier for you.  If you got transferred into the body of a CTO at a top-200 retailer in the mid-tier, what would you do?

Eric Goldsmith: That is a great question, and I do not really have a great answer.  One of the reasons that I like working at AOL is the ability to work at this scale; I revel in this scale.  When you don't have that scale, things are more difficult, there is no doubt about it, but I still think the data is there.  The ability to get statistical significance is there. All the fundamentals are still there; it just will take a little longer to achieve the goals because the volume is lower, but the fundamentals are still the same.

Joshua Bixby: How much do you struggle with teaching your non-techie executives to be stats experts? I mean, some guy has a great idea: hey, let's do ads this way, let's do this thing that way, speaking culturally, not necessarily about you, and they look at 20 minutes of data and they have already made up their mind, where statistically you need a day, you need a week, you need a month. Do you face that battle, or has your culture learned how to deal with stats?

Eric Goldsmith: No, they have not.  I think it is a problem that is universal, and it is a challenge to bridge that gap: to understand the technical side at the appropriate depth to do things right, and also to communicate that and convince people on the business side that you do need to do it a certain way, that you do need to run a test longer than 20 minutes in order to get significant results.  For me, I've been fairly successful just trying to articulate that in a non-technical way, with examples and so forth.  Usually the business people will get it.  There are going to be time pressures no matter what; sometimes you just cannot do what you would like to do from a purely scientific standpoint, but it is a case that has to constantly be made and re-communicated.  It is not a problem that is going to go away any time soon.
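The "longer than 20 minutes" point can be made concrete with the standard rule of thumb for A/B test sizing (roughly 80% power at 95% confidence). This is a hedged sketch; the traffic numbers in the comment are illustrative, not from AOL:

```typescript
// Rough per-arm sample size needed to detect an absolute lift
// of `delta` on a baseline conversion rate `p`, using the common
// n ~ 16 * p * (1 - p) / delta^2 rule of thumb.
function samplesPerArm(p: number, delta: number): number {
  return Math.round((16 * p * (1 - p)) / delta ** 2);
}

// Detecting a 1-point lift on a 5% baseline needs about 7,600
// users per arm; at 100 visitors a minute split two ways, that
// is hours of data, not 20 minutes.
```

The exact constant varies with the power and confidence you choose, but the order of magnitude is the argument to make to the business side.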

Joshua Bixby: No, I am feeling the same pain.  I want to circle back to one of the topics we talked about earlier, around analytics in the future, and something like Google Analytics, which now has some RUM in it. Do you see that as a path for how RUM will be utilized? Do you think it will be part of an analytics tool? What is your sense on that?

Eric Goldsmith: There is definitely a lot of promise there.  Google Analytics makes a lot of things more accessible to people than they otherwise would be.  It hides a lot of the complexity of the tooling, so people do not have to learn it or focus on it, and they can use a fairly well-designed UI to get the data. But with that simplicity there is sometimes a glossing over of some of the important details. With page load times, for example, there is not a lot of information about how they collect that data.  Sometimes it is collected from a toolbar, sometimes from JavaScript timers on the page, sometimes from navigation timing.  A lot of the segmentation that we talked about earlier, actually collecting the additional data needed to do the appropriate segmentation, may or may not be happening, and the results are presented in very aggregate ways, like averages, without any regard to distribution and so forth. So it's a good start. It makes the data a lot more accessible, and it is getting people to think about it, but I do not think it is quite mature enough. It's not where I would like it to be, and I don't know if it will get there.  I don't know if the Google Analytics audience is one that would appreciate that extra level of diligence.
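The "averages without regard to distribution" problem is easy to demonstrate. A minimal sketch; the load-time numbers are made up for illustration:

```typescript
// Nearest-rank percentile: sort, then pick the value at rank p%.
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, Math.min(sorted.length - 1, rank))];
}

// Six typical page loads plus one 9-second outlier.
const loadTimesMs = [900, 950, 1000, 1100, 1200, 1300, 9000];
const mean = loadTimesMs.reduce((s, v) => s + v, 0) / loadTimesMs.length;
// The mean is about 2207 ms, double what the typical user saw;
// the median (p50) is 1100 ms, and the p95 exposes the tail.
```

One slow outlier drags the average far from the experience of the median user, which is exactly the detail an aggregate-only report hides.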

Joshua Bixby: Yeah, I would love to know the number of feature requests they get and how much of it relates to this, because that has always been my feeling.  I just did a blog post about this a few weeks ago where I basically disregarded Google as a RUM tool, in the sense that of the people I interviewed, no one actually is using it for RUM.  People said, I have seen the screen once, or I have gone to it occasionally, but I have not seen a critical mass there, at least among the customers we work with, that are taking it seriously as a RUM tool. If no one is using it, I would guess the feature set is probably limited, but I would love to get some insight into that.  Tell me, as you look across the last 10 years of data science, what's changed?  I ask this because I am so fascinated by your perspective. I have always thought of AOL and Yahoo as the two harbingers of the future, in terms of guys who started to take data science seriously early; organizations that from a media and hype perspective have been passed by, but still remain the bastion of the roots of the entire community.  Everyone that has done something important, or the vast majority, has come out of one of those two schools of thought. So as you've looked through this history, and seen it from the perspective of one of the early adopters, has anything changed? I mean, you guys collected RUM seven years ago. What has changed?

Eric Goldsmith: Well, in my mind, what has changed is that the tooling has gotten sophisticated enough, and has grown in those 10 years, to allow the entire set of data to be stored and the entire set of data to be analyzed, not just samples.  In the past, when you had to deal with samples or small data sets, and tools that could only work with small data sets, you had to do a lot of sampling and a lot more statistics. The way you do statistics on samples sometimes differs from when you have the entire data set; your approach changes.  Being able to work on the entire data set, having it at your disposal, allows you to dig into the outliers, dig into the tail, and get more insight out of the data, in my mind.

Joshua Bixby: So that is a huge difference, and I would agree.  I mean, I just bought a one-terabyte drive, just a little drive for my computer.  It cost me 50 bucks, and it makes me feel old, but I am still amazed by that. I know one terabyte actually is not that much anymore, and it is a nice little portable one with a fancy case, but I am still amazed at the cost of data.  I can't get over it, as a business owner who has paid for data for almost 15 years now.  We used to ask our guys to delete files off disks to save space.  I haven't done that for 5 years.

Eric Goldsmith: I am like you. I was brought up in a time when storage was obscenely expensive, and the habit of deleting stuff and keeping things under control is just so ingrained in me that I still do it, regardless of how others feel.

Joshua Bixby: That's funny. See, I can't say that.  I have a MacBook Air, and I got a warning, this is embarrassing, I got a warning last week that I am out of disk space, and it's like 250 gigs, and I am thinking, what could I possibly have? And it is 40 gigs of photos, and 40 gigs of videos for watching on the airplane, and 10 gigs of email. I mean, I am that guy.  I am the guy that says, I love my MacBook Air, but I wish I had a terabyte drive on it.  So I can't say that I have that same rigor.

Eric Goldsmith:  That brings up another good point though about what’s changed and that ability to store all that data.  It is not just having all the data to do the analysis but it is having the history to go back and look, so if we want to compare something that happened today or a change in a product or whatever it is, with data from a year ago, 2 years ago, 3 years ago, we have got all that data because it is so cheap to store it.

Joshua Bixby: So the data scientist in me, and I am not a data scientist, but I am going to adopt that tribe for a while because I think it is a pretty cool tribe to be in, the data scientist in me thinks that is cool. And then another side of me thinks, how often are you going to do that?  The pages are so different, the functionality is so different.  The browsers are different.  The networks are different.  From one year to the next, does it really help you?  Do you find that these correlations help your business, other than, oh, isn't it interesting? Because I get a lot of stats as a business owner.  I get a lot of stats, isn't this interesting, and I am like, it is interesting, but why did you spend 3 hours on that? So is it actually helpful?

Eric Goldsmith: You raised a valid point.  There are so many things changing all the time, and depending on the metric, it may not be relevant to look back a year and try to compare the results. That's a good point.

Joshua Bixby: Are there examples of where it is relevant? Let's talk about a year, because in our world that's a long time.  Are there examples where comparing something today to a year ago actually led to an actionable business decision, other than, oh look, people are using different browsers and look, it was slower? Are there specific, tangible examples you can think of where that has been helpful?

Eric Goldsmith: Oh, a lot of times we look at usage of a particular site over time.

Joshua Bixby:  Okay, that makes sense, yeah.

Eric Goldsmith:  Being able to compare year over year or even further back as you spoke.

Joshua Bixby: Yeah, this Thanksgiving this happened and so…

Eric Goldsmith:  Exactly.  It is helpful for capacity planning, it is helpful for just historical usage trends and so forth.

Joshua Bixby:  Big things in the New Year.  What are you seeing?  Any bold predictions for me?  Anything I wouldn’t expect?

Eric Goldsmith:  No.  I can’t think of anything.

Joshua Bixby: Ah, you have been in this world too long.  You know all those bold predictions don't come out.  I write a blog post at the end of every year with bold predictions, and at least three quarters of them don't come true, but it garners some readership. So I am digging here, come on, give me something.

Eric Goldsmith: Well, one of the areas that I am really spending time on, and interested in, is again collecting real user metrics with enough data to allow me to properly segment it, clean it up, and make it actionable, and being able to get access to that data, be it the Navigation Timing API, the Resource Timing API, and the other things.  Having that available is going to be a game changer, and as you know, nav timing is available on a lot of browsers now, notably missing from Safari.

Joshua Bixby: I know.  Let us just dwell on that for a second. Safari. Steve has dwelled on it for a while. Okay, now we have had our second of mourning, let's keep going.

Eric Goldsmith: Right. So I'd love to see Safari join the game next year and give access to that data, and to get even more granular with resource timing, being able to look not just at a whole-page load level but at individual resource load times, and to see it get into other browsers. It's in IE 10 now on Windows 8; I would love to see it in all the other browsers.  Just being able to get access to all that additional data that is needed to make RUM really actionable, in my mind, is really what I am looking forward to.
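The per-page metrics Eric describes fall out of a Navigation Timing record directly. The field names below come from the W3C Navigation Timing specification (in a browser they live on `window.performance.timing`); the derived metric names are my own shorthand, not a standard:

```typescript
// The subset of W3C Navigation Timing fields used here;
// all values are epoch milliseconds.
interface NavTiming {
  navigationStart: number;
  responseStart: number;
  domContentLoadedEventEnd: number;
  loadEventEnd: number;
}

// Turn one navigation record into the classic RUM durations.
function rumMetrics(t: NavTiming) {
  return {
    ttfbMs: t.responseStart - t.navigationStart,             // time to first byte
    domReadyMs: t.domContentLoadedEventEnd - t.navigationStart,
    pageLoadMs: t.loadEventEnd - t.navigationStart,
  };
}
```

Beacon these three numbers per page view and you have the raw material for the segmentation and percentile analysis discussed earlier.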

Joshua Bixby: That's a good one.  Now, it is funny that you mention resource timing.  If I were an executive at a CDN, I would probably be really nervous about that one.  You and I have both used CDNs extensively, and we know that if a tree falls in the forest and no one is around to hear it, sometimes it is not heard. But you guys probably have a lot of insight into how CDNs work and don't work, without naming specific companies. You have probably seen that a lot of those resources aren't at the edge, or they are not served quickly, or there is a problem on that server. I think resource timing is really going to shed a lot of light on CDN usage.

Eric Goldsmith: One of the dirty little secrets in the CDN world is the way they generally work: the CDN server is on the edge, a request comes through it, they pass it through to the origin server and then cache it, per whatever the cache headers from the origin say to do. If you're following Souders' rules, that is at least 30 days, so it should still be on that server for 30 days and available for anybody else who needs it. Except not really, because there is only a limited amount of space on those servers, so stuff gets evicted. You just assume it is going to be there for 30 days, but not really. Having insight into how much is really available on the edge, how much is really there versus what you think is there, that is going to be interesting.
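The eviction behavior behind that "dirty little secret" can be sketched with a toy LRU cache. LRU is a common edge-eviction strategy, though real CDNs vary; this class is illustrative, not any vendor's implementation:

```typescript
// A tiny LRU cache: a 30-day TTL does not protect an object
// once the cache is full and hotter objects push it out.
class EdgeCache<V> {
  private entries = new Map<string, V>(); // Map preserves insertion order

  constructor(private readonly capacity: number) {}

  get(key: string): V | undefined {
    const value = this.entries.get(key);
    if (value !== undefined) {
      // Re-insert to mark this entry as most recently used.
      this.entries.delete(key);
      this.entries.set(key, value);
    }
    return value;
  }

  set(key: string, value: V): void {
    if (this.entries.has(key)) {
      this.entries.delete(key);
    } else if (this.entries.size >= this.capacity) {
      // Evict the least recently used entry, TTL notwithstanding.
      const lru = this.entries.keys().next().value as string;
      this.entries.delete(lru);
    }
    this.entries.set(key, value);
  }
}
```

Fill a two-slot cache with /a.css and /b.js, touch /a.css, then add /c.png: /b.js is gone even though its cache headers promised 30 days, which is exactly the gap between what you think is at the edge and what is actually there.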

Joshua Bixby: And I think expectations are certainly skewed by the fact that all these guys have really figured out how to get great test results, right? I don't like to use the word gaming, although I have used it in the past, but if your server is on the same network in the same building, sitting next to the Gomez and Keynote servers, and you pin content to that server, your test will look good for a month. But when that pinned content needs to go to the next customer that is going to sign a 2-year deal, it doesn't look as good over time. So yeah, I think resource timing is going to be great for end users to start figuring that out and really holding CDNs accountable for having their content at the edge like they promise, so I am excited about that one as well.

Eric Goldsmith: Right. But something else that I think would be interesting in that resource timing data is the cache freshness, or the cache hit rate, at the end user's browser.  This is where it gets fuzzy, because there are some privacy concerns there.  If you can tell via the timing results that something came from a user's cache, then you know they have been there before, and that's maybe crossing the privacy line.

Joshua Bixby: Yeah.  Certainly my European friends, with their cookie insecurities, would probably say that. But in our world, you guys and many others know almost everything about who that person is and what demographic they are in, and it doesn't cross the line any more than existing retargeting or ad technology does today, right?

Eric Goldsmith: Well, in my world, I am also involved in the Do Not Track initiative for AOL, and the W3C working group and so forth.  I am involved in those privacy aspects of this data as well.

Joshua Bixby: It is a pleasure to chat again, and as I say, you are one of these unsung heroes that needs to get more profile. So are you going to start presenting new topics at Velocity? Can we get you out to some of these conferences? Because you have gone quiet since, what, 2010.  You stepped off the circuit, man.

Eric Goldsmith: Well, let me explain what happened there. I got kind of pulled sideways into big data, so I have been doing different conferences, Strata and other things. But I got there because of RUM data, because when you are collecting that volume of data, you need some mechanism to analyze it, and that means Hadoop and Pig and all of the stuff that goes along with the big data world. So I got kind of pulled sideways into that.

Joshua Bixby: And that Strata stuff is amazing. I am good friends with Alistair Croll and some of the guys that spend their time being evangelical about that world, and man, that is as exciting or even more exciting than our world. Obviously these two worlds mesh, as you say, but that is a really exciting world. I think that stuff is amazing.

Eric Goldsmith: It is, and I am hoping this coming year, 2013, to kind of bridge the gap between the two and move back a little, be a little more visible in the web performance space with RUM work than I am. Time will tell.

Joshua Bixby: You should. We need you back here, because there is not enough going on, not enough big data science being spoken about, given that those things have started to become two distinct communities, like Ops and Dev traditionally have, and then these conferences come along to try to bring them together.  We can't let big data run off on its own and be divorced from the Dev community and the operational community.  Not that I can welcome anyone, but I would welcome your continued involvement.

Eric Goldsmith:  Well thank you.

Joshua Bixby:  It is wonderful to chat, have a great day.

Eric Goldsmith:  Thank you. Great talking to you.

Joshua Bixby:  Thanks, take care.  Thanks for listening and thanks again to Eric for making the time to chat.  If you want to hear from some other big thinkers in the performance space check out and if you have a suggestion for a future podcast topic or guest drop me a line at  Have a great day.


Where to find Eric:

Mentioned in this podcast:

Subscribe in iTunes: