Detecting Real Activities Manipulation: Beyond Performance Matching
Thomas A. Gilliam
Abacus, 2021, vol. 57, issue 4, 619-653
Abstract:
The use of real activities manipulation (RAM) to mislead stakeholders has garnered the focus of earnings management research. A prominent feature of RAM research is its use of estimation models in conjunction with performance matching. The veracity of this research depends on performance matching to mitigate estimation bias. This paper provides comprehensive tests of the RAM models and their performance‐matched counterparts and puts forth a set of model modifications to directly address weaknesses in the models’ specifications. In comparative tests, the performance‐matched models leave residual bias, while the modified models remove it. Similarly, in tests of power with simulated RAM, the performance‐matched models fail to detect high levels of simulated manipulation, for example, 5% of assets, while the modified models demonstrate power. These results, combined with the examination of actual and counterfactual RAM settings, call into question the use of the performance‐matched RAM models. On the other hand, the additional tests provide further evidence of the modified models’ accuracy and ability to detect RAM.
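In this literature, performance matching is commonly implemented by pairing each firm with the same-industry firm whose return on assets (ROA) is closest, then subtracting the matched firm's abnormal measure from the sample firm's. The sketch below is a minimal illustration of that matching step on toy data, not the paper's own procedure; the column names (`roa`, `resid`) and the toy panel are assumptions for illustration.

```python
import numpy as np
import pandas as pd

# Toy firm-year panel: `resid` stands in for an abnormal-RAM proxy
# from a first-stage estimation model; values are synthetic.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "industry": [1, 1, 1, 1, 2, 2, 2, 2],
    "roa": rng.normal(0.05, 0.03, 8),
    "resid": rng.normal(0.0, 0.02, 8),
})

def performance_match(group: pd.DataFrame) -> pd.Series:
    """For each firm, subtract the residual of the same-industry
    firm with the closest ROA (excluding the firm itself)."""
    adjusted = {}
    for i, row in group.iterrows():
        others = group.drop(index=i)
        match_idx = (others["roa"] - row["roa"]).abs().idxmin()
        adjusted[i] = row["resid"] - others.loc[match_idx, "resid"]
    return pd.Series(adjusted)

# Performance-matched abnormal measure, matched within industry.
df["pm_resid"] = df.groupby("industry", group_keys=False).apply(performance_match)
```

The matched-firm subtraction is what is meant to remove performance-related bias from the residual; the paper's tests concern whether this step actually does so.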
Date: 2021
Downloads: https://doi.org/10.1111/abac.12221
Persistent link: https://EconPapers.repec.org/RePEc:bla:abacus:v:57:y:2021:i:4:p:619-653
Abacus is currently edited by G.W. Dean and S. Jones
More articles in Abacus from Accounting Foundation, University of Sydney
Bibliographic data for series maintained by Wiley Content Delivery.