The analytics wunderkinds at the National Science Foundation Office of Inspector General have come up with a new tool that can scan a thousand documents an hour, according to IG Allison Lerner. It is another innovation in the vein of the agency’s use of data analytics, which is designed to better hunt for misspending by NSF awardees (and for NSF’s own compliance with various requirements).
OIG first embraced data analytics in 2012, a then-controversial audit practice that involved assigning a risk score (not shared with institutions) and then combining awarding agency data with “awardee transaction data” drawn from a general ledger, subsidiary ledger, subaward data and externally reported information, with “questionable transactions [referred] for review.” OIG’s 2013 work plan touted that the methodology “enables a review of 100% of applicable data, and reveals anomalies, such as unusual expenditure rates, for further investigation” and “is useful in identifying risks at all stages of awards” (“‘Data Analytics’ Takes Center Stage in OIG Work Plan, Amid Contentious Audit Report,” RRC 10, no. 1).
Some questioned whether the method might lead to “false conclusions,” and institutions reported that the practice resulted in hundreds of data requests, an observation that holds true today (“Mind Those Reports! Purdue Shares Strategies On Heels of Successful Audit,” RRC 16, no. 9).