Our fiscal year end is almost here, and our Eloqua implementation is just over a year old. The good news: we went from a pre-Eloqua response rate of 0.5–1% to hitting 8% on a regular basis, and we have now settled around a 4% average across all of our regions!
I expected to hit 2% at best. Is it the technology, or have the people changed because of the technology? I would say it’s a little of both. Technology empowers training, and because Eloqua is a shared system, all of our marketers can learn from the results of other regions. They can then take a successful program and apply it to their own region.
Another good outcome is that many regions work together to develop a program, and they naturally gravitate toward the most successful programs. With the implementation of Eloqua we mandated specific standards to limit the variables that might affect response rates. This has proven useful, because we’ve narrowed the variables down to the list and the offer. Rarely is the creative the issue, as we develop the creative templates centrally. So when we see consistently low response rates from a region, we know to first examine the list, the offer, and its relevance to the audience.
- Limit variables.
- Track and report results to the nation (we provide a monthly report to senior management and marketing directors so they learn from each other).
- Recognize excellence – we’ve developed a six-month and year-end recognition awards program for the best campaigns. On a monthly basis we recognize marketers with strong results and review why others fell short.
- Train, train, train on best practices.
- Set goals.
- Require training and a test to earn the “Power User” designation. This helps add stature to the program.
At the end of the day, I would say that what has changed the most is the people using Eloqua. I’m fairly confident that our “power users” could move to direct mail and would apply the relevant best practices they learned from using Eloqua and being part of the whole program.