A few weeks ago I attended the Analytics 2012 conference sponsored by SAS. I enjoyed much of the content and want to highlight just a couple of sessions.
Using Data Mining in Forecasting Problems
Tim Rey's keynote session provided specifics around what an analytics department looks like and what it can accomplish. I appreciated the detailed, real-world examples he presented, drawn directly from his work at Dow Chemical. His presentation covered a good mix of how his department is structured — number of people, skills, roles, etc. — and the types of modeling projects they have completed, including very detailed components of their work using SAS Forecast Server, Enterprise Guide, and Enterprise Miner. Tim focused primarily on the details of the models and forecasting work they had done, and rightly so. But I appreciated that he highlighted that the data side of analytics is "hard work and takes time." Amen.
Design of Experiments
The title of this presentation captured the essence of Andy Pulkstenis's content: "Do You Know, or Do You Think You Know?" Andy walked through examples of State Farm's testing of various features on their website. They certainly have the benefit of significant web traffic upon which to base their tests, but the principles can be applied to smaller situations as well. I thought Andy did an excellent job of demonstrating the power of focused, detailed, statistically sound DOE procedures, and then taking a step back when interpreting the results and making decisions. For example, the #1 result from a set of experiments may not be the best choice to explain or implement, but you can still dramatically improve significant business outcomes by comparing the "Top Best to the Top Worst." This approach shows the business the benefit of collaborating. And really, if the decision-makers have 3 choices presented to them, all of which are significantly better than the pre-DOE choice, how much does it really matter which one they pick?
Understanding Time Series Data
First of all, I thought this was the best presentation of any that I attended, in terms of the teaching skill and coordination between the two presenters. Meredith John and Udo Sglavo (both from SAS) used a good combination of screen time with the product and audience interaction. SAS Time Series Studio obviously took center stage, but I still felt the content was platform-independent. The two key points I took away: (1) structuring time series data is as important as the techniques used to forecast it; (2) time series data and time-stamped data are not the same thing, and the distinction matters.
Nearly every session I attended added insight and valuable perspectives. The less optimal parts of the conference? One session (Buy Netezza!), led by a rep from a conference sponsor (Buy Netezza!), ended up being one long commercial (Buy Netezza!). The session abstract clearly disclosed this "feature," but the presenter clearly had the technical expertise and several opportunities to dig deep into the hardware, and chose not to (Buy Netezza!). A couple of other sessions had good content that was overshadowed by presenters who simply read their PowerPoint slides. The same PowerPoint slides I received on the USB drive at the start of the conference. Sigh.
Here are several additional nuggets from some of the other sessions I attended:
- "Determining the reason for a missing variable is more important than how to adjust for the missing variable"
- "We spend a lot on hardware and a lot on software. We need to invest in our users."
- "Productization of analytics is key"
- "A specific analytical result may not lead to better predictions, but it may provide a better customer experience"
With so many competing conferences and events, I'm not sure that I'll attend this conference every year. (I passed on Tableau, Lucene, and a couple of other events this conference season in favor of Analytics 2012.) But the 2 days of sessions were, on the whole, productive, insightful, and motivating.