Anyone who has purchased prescription drugs recently knows that they are expensive and that the cost is going up. Part of the reason is that pharmaceutical companies must put potential products through a stringent, time-consuming, and expensive series of tests before a drug can be licensed and sold. The regulatory process requires the Food and Drug Administration (FDA) to approve a company's protocol before the various tests can begin, and then to approve the company's statistical analysis once the tests are complete. Obviously, if the FDA and the pharmaceutical companies cannot agree on what is acceptable, the consumer is the ultimate loser.

During the past 10-15 years, computing technology has advanced rapidly. With it, statistical methods that were inconceivable in the early 1980s have become routine. Used correctly, these new methods can provide better information at greatly reduced cost. Used indiscriminately, they can provide very confusing information at greatly increased cost.

For the past several years I have been involved in developing and evaluating statistical methods that are heavily used in pharmaceutical research. My main focus has been to help industry and the FDA understand which methods are effective and which are not. In its own small way, facilitating this kind of understanding reduces the cost of regulation. I will present some of the major issues and share some experiences.