Comments on Kevin Hillstrom: MineThatData: Making Decisions In The Catalog/Online Business

MineThatData --- 2007-04-10, 10:25 (-07:00)

If your job is in the analytics field, you have no choice but to do great work, with integrity, and you have to have an analytical answer to a problem that is accurate and reliable.

As you move from analyst to manager to director to executive, you have to slowly give up some of the analytical rigor in order to make more decisions. You trust that your analytical people applied the rigor necessary to reach a good answer --- but a leader must make a lot of decisions on gut feel, experience, and instinct. The leader will take a small sample of data that may not be valid from a statistical standpoint, and will draw conclusions from that limited set of biased and imperfect information.

So, I'd require the analyst to do a thorough job. As a leader, I trade off some of the rigor in order to make more decisions that move the business forward, realizing that some of my decisions will be wrong and will set the business back a bit.

Anonymous --- 2007-04-10, 10:14 (-07:00)

Fantastic post, Kevin and Graham; very interesting.

How would you handle analytic integrity when living by the 80% rule? In this numbers-savvy industry, it sometimes feels like everyone wants to be an analyst.
Each department has employees who routinely analyze data, reach conclusions, and make recommendations -- oftentimes incorrect conclusions or casual assumptions that are quickly dismissed by your stats women and men.

In our analytics group, we've built a reputation for producing a scientifically valid and statistically correct basis for action. If we move toward the 80% rule (believe me, I like the premise), we risk compromising the integrity and trust that has been established. We move toward becoming another "I think" instead of an "I know."

How do you balance the two? What if we asked fewer questions but whole-heartedly went after the statistically sound answers we have?

Anonymous --- 2007-04-10, 00:26 (-07:00)

Kevin,

Despite my challenge to your earlier post, I agree with you 100%.

As a CRM consultant who is also an interim CRM manager, I am in the rare position of having to eat my own campaign dog food. That leads to some interesting internal struggles.

The CRM consultant side of me pulls me toward robust marketing experiments with careful designs, control groups, and 95% confidence intervals. But the interim CRM manager side wants to run lots of campaigns, to adapt them on the fly and to, well, just get on with it.

My own way around the internal struggle is to develop as much understanding as I can of the customers, the client's marketing capabilities, and the results it expects, and to use these systemic insights to drive forward a number of 'best-guess' activities. Each activity is a small real option that I can use to learn more about what works and to improve it the next time around.

As you say, it is often better to do something approximately right (and to learn from it) than to do nothing precisely wrong.
But it is important to at least understand a bit about how the world works before venturing out into it too boldly.

Graham Hill
Independent CRM Consultant
Interim CRM Manager