Editor’s note: Joe Hopper is president of Chicago-based Versta Research.

It wasn’t long after the shock of Election Day that a colleague asked, “What do you think about the validity and accuracy of surveys and polls now? I’d say they’re all hogwash.” She was not alone. “The vitriol targeting pollsters in the last few days has been intense and ugly,” wrote another colleague via the AAPOR online discussion forum. Most of us in marketing research work outside of election polling but, of course, all survey research is kin to public opinion polling and our methods are largely the same. If election polling is supposed to prove that survey methods work, what are we to make of Donald Trump’s 2016 win coming as such a surprise?

Here is my take: This election put research methods to the test and subjected them to public and professional scrutiny like never before, and there is much to learn about what works and what does not. Many people still cannot get past how wrong they believe the survey methods were. As an industry, marketing researchers can learn from these reactions. Some things went right and some things went wrong. But what? Let’s look at five lessons to draw from the 2016 election polls.

1. Surveys work. And they work extremely well. This may sound ridiculous in the wake of pollsters’ supposed failure to predict Trump winning the White House, but the polls did not fail. It was the attention-hungry people who interpreted, reported and prognosticated based on the polls who failed, and they failed miserably.

Clinton got 48 percent of the national popular vote. Trump got 46 percent. Clinton won the popular vote by a comfortable margin, and nine of the 10 top polls correctly predicted this. On average, the top 10 polls had Clinton winning the popular vote by 3 percentage points; she won by 2 percentage points. If you do not find this remarkable, you should. Despite the enormous challenges polling faces today with plummeting response rates...