I’d intended to title this post “more twists in the Jupiter tale” but I think the actual title is far more appropriate.
Quick recap: Last month, Jupiter Research published a study about corporate blogging that led to Toby Bloomberg asking questions about their research methodology. She received a brush-off from Jupiter’s PR agency. Fard Johnmar purchased the survey report and also asked questions about the research methodology. He, too, got the brush-off. His post concluded with a recommendation to treat the research with a pinch of salt and not buy the report.
My own commentary on this story a few weeks ago attracted quite a few opinions from readers of this blog on Jupiter’s behaviour and that of their PR agency. A particularly thoughtful view on that latter point came from John Mims:
[…] I think that the agency does have to shoulder some blame. I would hope that the agency would have at least anticipated some of the questions from the media. Number one question would have to be: How did you conduct the research? As someone who has managed a fair amount of research, that is the basis of the believability of any research project. If I can’t provide that basic information *to protect my reputation* as well as that of my client, I am compelled to walk away. Our responsibility as public relations pros is to provide accurate information for our clients to various publics. If that information is not accurate and we know it, we ruin our reputation and that of our industry. In this case, this agency might have provided accurate information, but they have done so in a way that makes the information look suspect. They should have better advised their clients or offered a referral to another agency.
Ian Betteridge did a good job as devil’s advocate, standing up for Jupiter and the agency in questioning whether bloggers like Toby and Fard had really done enough in reaching out to Jupiter before posting criticisms. Fard’s response makes it clear to me that they had (and the majority view in the comments: yes, they had).
The latest opinion added to that discussion came yesterday from Clyde Smith:
[…] As a highly trained qualitative researcher (PhD, OSU, 2000) I’m deeply disturbed by this whole affair. I’d like to get into more commercially relevant research myself so seeing a major research firm undermine the credibility of such research is quite disturbing. It’s been interesting to read the comments about the agency and I’ve gotten much more interested in the relationship of research and pr. But doesn’t any company have to take responsibility for the services they outsource? I don’t see how Jupiter Research could get off the hook at this point in time. […] Deep research always produces a lot of maybes and the media, politicians, etc. don’t handle that very well and remove the nuances that they don’t understand or can’t fit into a soundbite. In the process they typically misrepresent the research. It’s a mess and it’s sad to see a research firm contributing to the situation in this manner.
I agree with Clyde’s last point. No matter how you look at this story, the way in which Jupiter Research and the PR agency have reacted to questions about the research simply raises more questions. Those questions, and the people asking them, aren’t going to go away. It places Jupiter’s credibility on the line, in my view.
It gets worse, though.
Yesterday, Toby posted an email exchange between her and David Schatsky, president of JupiterKagan (the owner of Jupiter Research), in which Mr Schatsky effectively ignores the focus of Toby’s questions – asking about the research methodology – and, instead, goes on the offensive to accuse Toby of misleading her readers.
I’m astounded at this behaviour. Mr Schatsky has a great opportunity here to engage with an influential blogger, provide the information requested and so open up a further opportunity for far more relevant commentary.
He did, however, provide a snippet about the research methodology:
[…] I’ll tell you that some of the data cited in the report you are discussing and mentioned in our press release is from a survey of 251 executives from a variety of industries who make decisions about their company’s Web site spending and who work at companies with $50 million or more in revenue.
Why stop there, Mr Schatsky? Come on, let’s have the whole story! Add to that little snippet. Let’s try and make your survey’s claims a bit more credible. Or not, as the case may be.
But perhaps this is nothing more than a mess, as Clyde said. If so, Jupiter’s doing a good job at adding to that.
251? From such a large universe? No wonder they aren’t releasing any more data. As I mentioned before, I’ve managed some research projects. Usually I work with Brian Mahoney at Percept Research to conduct the research (and to make sure everything is reliable, etc) so I gave him a call to talk to him about it. Hopefully, he’ll join in our discussion here.
Without any more information about the methodology, there is no way that we can assume that 251 respondents is near enough. Here are some top-of-the-head reasons why:
– We have no reason to believe that they interviewed 251 executives from *different* companies. For some of the companies that take in more than $50 mil in revenue, it’d be easy to find 50 interviewees in a single company. Jupiter knows better, but how are we to believe otherwise?
– Was their sampling of location proportionate to the number of large companies across the nation or are they all from the West Coast?
– “Variety of industries” is much too vague. For such a small sample from such a large group, I’d be surprised if they had enough variety to make the study meaningful.
Indeed, John – no wonder they aren’t letting on about how they conducted the research.
The research and, in particular, the press release announcing it appear to be a smoke-and-mirrors PR stunt if a) Jupiter will only tell paying clients a bit of the background and b) robustly refuses to disclose anything publicly.
Katya Andresen’s July 11 comment on Toby’s post refers.
They are undermining their own credibility. What other conclusion can you reach?
Jupiter’s strategy of teasing for new clients through a news release was a stupid idea, and it appears to be blowing up in their faces. After seeing this exchange of blog posts across the blogosphere, I would imagine many potential clients backing off fast. They’re probably asking things like: (1) If Jupiter makes bad decisions to get clients, do they make bad judgements in their research as well? (2) Since the $750 package was clearly not worth the money and did not result in useful clarification, what benefit is there to being a client? (3) Jupiter has been unresponsive and hostile to legitimate questions; do they behave the same with clients? (4) Is Jupiter’s research methodology valid?
And my added question as a PR person (that I blogged about last week) is (5) Why send a release to the media if you don’t want to talk to the media? Schatsky’s explanation is not acceptable. There are ways to make public announcements without directing them to the media. You can issue statements, alerts, announcements, etc. They won’t be secret, but at least you’re not pretending to be ready to take questions from the media when you’re not.
Jupiter Research Makes Some Changes…
Checking blog stats, one finds interesting links… a referral to the blog of David Schatsky, President of JupiterKagan. Both David and Greg Dowling, the Jupiter Research analyst behind the infamous Jupiter Research corporate weblog study, provide additional info…
Interesting post from David Schatsky at http://weblogs.jupiterresearch.com/analysts/schatsky/archives/016492.html
Thanks Ian. Saw that: Toby’s got it covered!
All very good points, Christie. Reading David Schatsky’s post that Ian links to, you can see a very interesting comment there:
It looks like some recognition that Jupiter could have handled this a lot better.
[…] What a strange comparison. Am I missing something here? I’d love to know what their survey methodology was. (Actually, better not go there with such a question.) […]
Thanks for your great coverage of this issue. (As I mentioned in my e-mail to you), I’ve published some of my final thoughts on this topic on HealthCareVox. Please click here to read them.
Thanks, Fard. You’ve written an interesting post, especially your conclusions. Further thoughts to come.
Neville – Thanks for your support on 2 important issues: how research is presented in a public forum e.g., media releases and blogger relations. My thanks also to your readers who helped demonstrate the influence of ‘community’ to change the business processes of an organization. Hopefully, the results will be a win-win for marketers and for Jupiter Research.
It’s been a very interesting experience, Toby. Thanks to you for pinpointing the matter in the first place. I’m with you entirely re the win-win.
Neville – thanks for posting this article. John Mims contacted me the other day about the validity and reliability of these results. As a market researcher with over a decade of experience, I would be cautious of a sample size that small (251) considering the relatively large universe of executives fitting the criteria of decision makers in a company with revenues over $50 million. As John mentioned, it should be reported whether each of the respondents was from a different company and whether this is a representative cross-section of the US. By no means should this be interpreted as indicative of F500 companies (companies with revenues above $4 billion). I would feel more confident with a sample of 400 executives from a random-select study. There are definitely lots of other nagging questions, such as how the survey was recruited and administered. Of course, there may be more caveat emptors, but it’s hard to tell without more disclosure.
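To put rough numbers on that caution about sample size: as a back-of-the-envelope sketch (and assuming a simple random sample, which is precisely what Jupiter hasn’t confirmed), the worst-case margin of error at 95% confidence for a proportion can be computed like this:

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case margin of error for a proportion from a simple
    random sample of size n, at 95% confidence (z = 1.96, p = 0.5)."""
    return z * math.sqrt(p * (1 - p) / n)

# Jupiter's reported sample vs. the sample size suggested above
print(f"n=251: +/- {margin_of_error(251):.1%}")  # roughly +/- 6.2 points
print(f"n=400: +/- {margin_of_error(400):.1%}")  # roughly +/- 4.9 points
```

Even that six-point figure only holds if the 251 respondents really were drawn at random from the population of qualifying executives; duplicated companies or a regional skew, as John notes, would make the real uncertainty larger still.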
Buyers of secondary research should expect a greater level of detail about the methodology before purchasing a syndicated report. I believe a research firm puts its credibility on the line when posting a PR release based on the study results. Any caveats about the methodology should be mentioned as well. Clearly, there were missteps with this PR release.
The positive outcome from this attention may be that, indeed, Jupiter begins to disclose more methodological details about its studies (see Ian’s link on Mr. Schatsky’s response). Currently, there are no methodology details whatsoever when clicking the link to purchase the report — unbelievable! (in more ways than one)
Good points, Brian. Let’s see what happens the next time Jupiter announces a new research report.