I've been thinking about a nonresponse bias analysis I am working on for a particular project. A common goal for these kinds of analyses is to show that we could lower response rates without increasing the relative bias. I wrote about the risks of that approach in a recent post.
Now I'm wondering about alternatives. There may be risks from lowering response rates, but is it wise to continue sinking resources into producing high response rates as a protection against potential biases? I recently read an article by Holle and colleagues in which they actually worked out the sample size increase they could afford under reduced effort (fewer calls, no refusal conversion, etc.). They made an explicit tradeoff between the risk of nonresponse bias (judged to be minimal) and sampling error.
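The logic of that tradeoff can be sketched with a back-of-the-envelope calculation. Everything here is an invented illustration, not Holle and colleagues' actual numbers: under a fixed budget, reduced effort lowers the cost per completed case and buys a larger sample, and the question is whether the gain in sampling precision outweighs whatever nonresponse bias the reduced effort introduces.

```python
# Hypothetical illustration of the effort/bias/precision tradeoff.
# All figures (budget, per-case costs, bias) are assumptions for the sketch.

budget = 100_000.0        # total field budget (assumed)
cost_full = 50.0          # cost per complete with extended calls and refusal conversion
cost_reduced = 30.0       # cost per complete with fewer calls, no refusal conversion
sd = 10.0                 # population standard deviation of the survey variable
bias_reduced = 0.1        # assumed extra nonresponse bias under reduced effort

def mse(n, bias):
    """Mean squared error of the sample mean: sampling variance plus squared bias."""
    return sd**2 / n + bias**2

n_full = budget / cost_full          # completes affordable under full effort
n_reduced = budget / cost_reduced    # completes affordable under reduced effort

mse_full = mse(n_full, 0.0)
mse_reduced = mse(n_reduced, bias_reduced)

print(f"full effort:    n = {n_full:.0f}, MSE = {mse_full:.4f}")
print(f"reduced effort: n = {n_reduced:.0f}, MSE = {mse_reduced:.4f}")
```

With these assumed numbers the reduced-effort design comes out ahead on total error, which mirrors the article's conclusion that the bias risk was minimal; a larger assumed bias would flip the comparison, which is exactly why the risk judgment matters.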
I'm still not completely satisfied with this approach. I'd like to see a design that considers the risks and allocates resources in proportion to them. The risk of nonresponse bias from lowering the number of calls seems low (but still non-zero), so perhaps we ought to devote fewer resources to extended calling -- but some, as a protection. This example sounds like two-phase sampling, but I don't think two-phase sampling is the solution for every problem of this type.
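To make the two-phase idea concrete, here is a minimal simulation of Hansen-Hurwitz-style double sampling for nonresponse, with invented numbers: limited effort for everyone in phase one, then intensive follow-up on a random one-in-five subsample of the phase-one nonrespondents, whose responses stand in for all nonrespondents in the combined estimate.

```python
import random

random.seed(1)

# Hypothetical population where the survey variable y is related to the
# propensity to respond under limited effort, so a respondents-only mean
# is biased. All parameters are assumptions for illustration.
N = 10_000
y = [random.gauss(50, 10) for _ in range(N)]
responds_easily = [random.random() < 0.4 + 0.004 * (v - 50) for v in y]

# Phase 1: limited-effort attempt on everyone (a census, to keep it simple).
phase1 = [i for i in range(N) if responds_easily[i]]
nonresp = [i for i in range(N) if not responds_easily[i]]

# Phase 2: extended effort on a 1-in-k random subsample of nonrespondents,
# all of whom are assumed to respond under the intensive protocol.
k = 5
phase2 = random.sample(nonresp, len(nonresp) // k)

ybar1 = sum(y[i] for i in phase1) / len(phase1)
ybar2 = sum(y[i] for i in phase2) / len(phase2)

# Combined estimate: weight the phase-2 mean by the full nonrespondent count.
est = (len(phase1) * ybar1 + len(nonresp) * ybar2) / (len(phase1) + len(nonresp))
naive = ybar1  # respondents-only mean, ignoring nonrespondents

true_mean = sum(y) / N
print(f"true mean {true_mean:.2f}, respondents-only {naive:.2f}, two-phase {est:.2f}")
```

The appeal for resource allocation is visible in the arithmetic: only a fifth of the nonrespondents ever receive the expensive extended effort, yet the design retains protection against the bias that the respondents-only estimate carries.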