SKI Test 2016: Behind the Scenes - Ski Mag


We caught up with our SKI test director to find out what goes into the most trusted ski test in the industry.

Starting Monday, our editors and more than a dozen rippers will tear up Deer Valley Resort for our annual ski test. Over five days, testers will ski roughly 150 skis, with the final results posted online and in SKI’s Buyers Guide.

We’ll be Instagramming, Tweeting, Facebooking, live streaming, and making our presence known. So follow our feeds, sit back, and wait for the results to flow in.

Unlike years past, we’re in a new location this year: Deer Valley. So we caught up with SKI’s test director, Joe Cutts, to get a behind-the-scenes look at one of the most trusted tests in the ski industry.

So Deer Valley, eh?

Yes, can’t wait. We’ve tested there before, but this time we’re moving our base over to the Empire area, where it’s a little steeper and more diverse than the old days at Silver Lake. We’ll get to sneak over for runs on Daly Chutes or Lady Morgan. Plus we like to get off-campus at night, and there’s nowhere more fun at night than Park City. Personally, I have to admit that I’m excited to have lunch every day at the Empire Lodge…that place is awesome. Also very psyched about the famous Seafood Buffet. Can you tell I like to eat?

Talk about the logistics: how many testers, how many days, how many skis, etc.

I have to manage test entries in a way that ensures we end up with about 150 test models—about 60/40 men’s vs. women’s, because that reflects the ratio in most brands’ collections and in the marketplace. So we’re ultimately limited by how many skis our male testers can get through in a week, and that’s about 90. We have a pretty rigid test format and insist that every tester ski every ski. That’s so you don’t have one ski succeeding because it was only skied by testers who are predisposed to like it, or because it only got skied in the morning when the snow was better, etc.

What’s the thought behind not having average skiers test?

We get asked that a lot, mostly by average skiers lobbying to get jobs as testers—I don’t blame them—but also by readers who say “real skiers” like them would be better in tune with what they want. But first-time testers tend to be overwhelmed when confronted by so many skis. They bring brand preconceptions to the process and don’t have the experience or vocabulary to say something intelligent about every ski when the day is 15 skis long. Our testers, though, are experts. They know that a ski built for rippers should behave differently than a ski for intermediates. And experience is what makes a tester good at testing. I see that all the time. Plus, I remember what it was like to be a rookie tester myself.

Some people say only advertisers get good reviews. What’s the deal?

We hate hearing that, and it’s not true. I’d challenge anyone to find a direct correlation between the two, and I’d show them plenty of examples where testers were most unkind to valued advertisers. We let the data do the talking. There’s no manipulation of it whatsoever. And we publish the data, so that even among winning skis, you can see which skis were No. 1 and which skis barely made it. Readers can’t be blamed for being suspicious. We know there are so-called pay-to-play tests out there, where manufacturers are charged for participation. They expect, and get, favorable reviews in return. But even when advertisers are angry with us, our bosses, to their credit, stick up for us and encourage us to call it as we see it. If readers don’t trust us, they won’t read SKI Mag, and that’d make the job a lot harder for our ad reps. Heck, ads have been canceled by angry brands in the past. Seems like every year there’s a brand that’s really mad at us.

Does everything that gets tested get reviewed?

No. Believe me, there are skis that elicit some pretty negative responses from testers—often very colorful and hilarious comments, in fact, and sometimes we run those with model names withheld to protect the innocent. But mostly, if we don’t have something nice to say, we don’t say anything. The brands all spend lots of money and jump through a lot of hoops to bring their skis and participate in the test, and if their skis don’t make the cut, we’re not going to add insult to injury by writing negative reviews.

There must be a thousand different models. How do you select the skis that get tested?

We work with manufacturers to get all the coolest, newest stuff in the test. We force them to have skis in all the categories, so we get a good variety of flavors. And in the Value category they have to hit a predetermined price point to, basically, show us how good a ski they can make for a skier who only has $500 to spend. Not every brand gets the same number of skis. We look at market share, and big brands get more test entries than small brands. But we also look at past performance, so that a brand that does well in the test is rewarded with more entries. It’s controversial, and I have a lot of difficult conversations with brands that want more skis in the test, but we think it ensures that we’re reviewing skis that aren’t just good but widely available. 

What’s the best and worst part of the ski test process? 

Announcing the results is the worst. Some of those phone calls (to the brands) are easier than others, but some can be tough. And the best part? Test week itself. Once all the details are squared away, it’s time to ski and hang out with a few dozen fun people at a great place like Deer Valley. Life is good.
