• Fighting Statistical Sampling in RAC Medicare Audits, Part I of II
  • March 24, 2009
  • Law Firm: Holland & Hart LLP - Denver Office
  • When Medicare contractors use “statistical sampling” techniques to extrapolate overpayment amounts from a universe of claims based on a limited sample – human process errors can provide the foundation for an effective defense. [This is the first of two articles related to RAC audits that were originally published in March of 2003, but seem relevant today.]

    It can be a shock to open a government overpayment determination notice and read the announcement of a recoupment action against your company to reclaim, say, $3,000,000.00 in alleged overpayments based on an agency’s “statistical sampling” methodology.  Welcome to the bewildering world of statistical sampling and extrapolation of data in the public funding of health care services.  The government has taken a “footprint” review of a few of your files, deemed your documentation wanting, and extrapolated a “dinosaur” determination that the universe of your submitted claims is similarly wanting and that your company owes big bucks, and maybe even the “farm,” to the government.

    The agency formerly known as the Health Care Financing Administration[1] (“HCFA”) adopted a rule permitting the use of statistical sampling techniques as part of the armamentarium of the government in the exercise of its program review and integrity function.  The Social Security Act requires the government to review, identify and/or deny inappropriate, medically unnecessary, excessive or routine services.[2]  Sampling may be used where the claim volume of a provider under review is “voluminous,” the claims reflect a “pattern of overbilling,” and a case-by-case review is “not administratively feasible.”[3]

    Given the large number of claims processed by home health agencies and other health care providers relying on some level of federal reimbursement, the use of statistical sampling in lieu of a review of all of a provider’s claim files is a cost saving boon to federal and state governments administering Medicare and Medicaid programs.  It also has an extraordinary “in terrorem” effect on providers because of the process of extrapolation of small review samples into huge financial obligations cutting across all claims submitted during the audit period. 
    Home health agencies have been particularly vulnerable to statistical sampling problems because of the complexity and difficulty of maintaining adequate file documentation in a labor intensive enterprise performed in patient homes.  The government treats all services not adequately documented as not having been provided, and therefore the basis of an overpayment claim and recoupment action.

    A typical audit usually starts with a “random” selection and review of a small number of files to determine the adequacy of the documentation for claims previously presented for payment.  The files, once identified by a computer “randomizing” program, are reviewed by nurses or other trained personnel representing the government to see if there exists a “pattern” of overbilling.  If the government perceives a pattern, it selects and reviews a larger randomized sample of claims (e.g., 100).  The documentation error rate in the larger sample is then determined.  If the hapless provider has another 10,000 claims during the audit period, a computer program is used to extrapolate the error rate in the 100 files over the entire universe of claims, and pretty soon we are talking about big money.  The computer program, at the end of the process, usually spits out a high and low range of probable overpayment, and the government usually selects the lower number (call it $2,900,000.00 instead of $3,000,000.00) just to show how conservative and careful the government is being.  Your company has just been mortally wounded, if not actually killed, by a computer in the hands of government statisticians.
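    The arithmetic behind such an extrapolation can be sketched in a few lines.  The figures below mirror the article’s illustration (100 sampled claims drawn from a universe of roughly 10,000); the particular confidence-interval method shown is an assumption for illustration only, since contractors use their own specialized statistical software and conventions:

    ```python
    import statistics

    # Hypothetical audit figures, loosely following the article's example:
    # a random sample of 100 claims drawn from a universe of ~10,000.
    universe_size = 10_000
    # Suppose 30 of 100 sampled claims are deemed "undocumented," each
    # treated as a $300 overpayment (hypothetical numbers).
    sample_overpayments = [300.0] * 30 + [0.0] * 70

    n = len(sample_overpayments)
    mean = statistics.mean(sample_overpayments)   # avg overpayment per sampled claim
    sd = statistics.stdev(sample_overpayments)    # sample standard deviation

    # Point estimate: project the sample mean across the whole universe.
    point_estimate = mean * universe_size

    # Two-sided 90% confidence interval on the total, using a normal
    # approximation (a simplification of what audit software computes).
    z = statistics.NormalDist().inv_cdf(0.95)     # about 1.645
    margin = z * (sd / n ** 0.5) * universe_size

    lower = point_estimate - margin   # agencies often demand this lower
    upper = point_estimate + margin   # bound, as the article notes

    print(f"point estimate: ${point_estimate:,.0f}")
    print(f"90% CI: ${lower:,.0f} to ${upper:,.0f}")
    ```

    Even this toy version shows why the final demand arrives as a range, and why choosing the lower bound costs the government little: with a sample this small, the interval around the point estimate spans hundreds of thousands of dollars.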

    The seeming irrefutability of computer-generated numbers can be at least as terrifying as the size of the numbers generated through sampling techniques.  To the uninitiated (which includes most of us), the nature and practice of statistical analysis is arcane and impenetrable.  The memory of regression analysis problems from a required econometrics course in college still sends shivers down my spine.  How does one defend against the clinical determinism of computer-driven mathematics?

    The answer is – human process flaws.  Ironically, the very agency cost concerns that led to the adoption of statistical sampling techniques also provide the seeds of defenses against them.  The reality is that there are significant costs in utilizing statistical sampling correctly, and that review agencies – whether through lack of funding or ignorance of the technical requirements of the sampling process – rarely get it right, leaving room for significant challenges to the validity and accuracy of the final numbers.


    [1] Now the “friendlier” sounding “Centers for Medicare & Medicaid Services” (“CMS”).

    [2] Section 1842(a)(2)(6) of the Social Security Act.  See also 42 C.F.R. § 421.200.

    [3] HCFA Rule 86-1.