The real work of an evaluator is to help practitioners and leaders do a better job of using their limited resources to provide the best services possible for their clients. This job becomes challenging when the results of a program evaluation indicate that programs are not serving clients as well as everyone assumes. Old methods of data collection may not have been adequate, methods of analyzing the data may not bring issues to light, assumptions of benefit may be false, and a myriad of improper uses of data may go unnoticed by the organization. We discover the ugly side of program evaluations when programs elect to use data simply because it is easy to collect, because it produced glowing reports in the past, or because they can “make data say what they want it to say.” Unfortunately, “data” is so readily available that our fast-paced working world often causes people to grab whatever is easiest to access, so long as it makes the point they want to make. The proper data to answer the research question may not be available because no one has ever made an effort to collect it. Data may be available, but no one has taken the time to analyze it properly. Someone may have analyzed the data, but the results of the analysis only make the program look like it wasted scarce resources without meeting clients’ needs. There are far too many reasons to grab data, even if it has been manipulated, and present it as valid findings. This book helps anyone involved in programs to stop pretending they have valid data and move toward evaluations that will protect scarce resources and provide the services their clients deserve. It is time to stop putting lipstick on the pig.

Topics

• Dealing With Low or No Standards for Data Collection.
• Dealing With Programs That See No Need to Move Beyond Satisfaction Surveys and Client Counts.
• Dealing With Outcomes That Are Only Putting Lipstick on the Pig.
• Dealing With a Demand for Positive Data Where No Such Data Exist.
• Dealing With “Samples” That Do Not Represent the Population Served.
• Dealing With a Highly Popular, but Failing Program.
• Dealing With Issues That Exacerbate Data Analysis Issues.
• Dealing With Misconceptions About What Constitutes “Scientific Data.”
• Dealing With the Impact of Consultants and Sales Representatives.