I spent last Thursday and half of Friday at the first Journal of Information Systems Conference (JISC), co-sponsored by the AICPA and Caseware IDEA. It was the first conference to bring academic researchers from the American Accounting Association's (AAA) Accounting Information Systems (AIS) section together with professionals from the AICPA's Information Management and Technology Assurance (IMTA) section.
I was first approached by Roger Debreceny, Ph.D., senior editor of the journal, over a year ago. At that time I was chair of the IMTA Executive Committee, and Roger had already approached the AICPA's Assurance Services Executive Committee (ASEC) to provide facilities and meals for the conference. But what he really needed was access to professionals who could provide IT audit-related information about the clients they were working with. That, IMTA could definitely provide.
IMTA ended up assisting by releasing two surveys to our membership to help two different teams of researchers analyze different sets of information. IMTA also provided “professional commentators” for the final 17 research papers that were produced, and five of us agreed to attend the conference and provide in-person comments and participate in the discussions surrounding the papers and two panels.
All five of the papers we discussed were interesting:
- Adoptions of Automated Internal Control Testing Strategies and External Auditor Perceptions of the Strength of the Internal Audit Function* (Limor and Farkas)
- Developing a Measure for Information Systems Control Alignment* (Cram and Gallupe)
- The Association between Information Technology Investments and Audit Risk (Han, Rezaee, Xue, and Zhang)
- Is There an Association Between Internal Control Findings and Measures of Information Security Effectiveness?* (Steinbart, Dilla, Gal, and Raschke)
- Repairing Organizational Legitimacy Following IT Material Weakness: Executive Turnover, IT Expertise, and IT System Overhaul (Haislip, Masli, Richardson, and Sanchez)
The three with the asterisk (*) were particularly interesting to me, though, because I saw definite practical application of their findings.
Audit Automation and Continuous Controls Auditing/Monitoring
This topic has been floating around the profession for a while but has not gained much adoption. The research performed examined the correlation of Competence, Work Performance, and Objectivity with external auditor reliance on an internal audit (IA) function. Interestingly, the researchers found that a lower frequency (weekly) of IA testing resulted in a higher level of reliance by the external auditor. Additionally, automation (or lack thereof) didn't appear to affect the level of reliance.
This seems counterintuitive. If testing is performed weekly instead of in real time, the external auditor relies upon it more. And the external auditor didn't care whether testing was performed by a tool or manually. I would have thought that real-time automated testing would produce the highest level of reliance. It almost makes me wonder whether, in this circumstance, the external auditors feared that they could be replaced by the computer and thus trusted the automated process less, almost as a way to spite and invalidate that personal fear.
Whatever the case may be, the research provides guidance to those of us who consult with auditors (whether as a consultant or as part of an audit committee) and design the internal audit function and its programs. We now know that as we look to design any type of audit automation or continuous controls auditing or monitoring program, we should use a periodic frequency rather than real time. The research also provided additional details on the correlation of Competence, Work Performance, and Objectivity, which we can use to self-assess the potential reliance that an external auditor may place on the IA function. If we find that the external auditor is not relying upon IA, we now have some guidance on where to look to potentially improve the level of reliance.
Information Systems Control Alignment
The model presented in this research helped to fill two gaps for me in the design of internal control systems. The research looked at four distinct dimensions of controls that tie pretty closely to the COSO model (see Figure 1 below), except for this concept of the Social-emotional Consequences.
The Social-emotional Consequences dimension dealt with the human elements of implementing and executing the controls. For example, if the controls were felt to be too restrictive, would the person performing the control "rebel" and somehow undermine it? This was one of the gaps that got filled for me. When doing controls design, I think I often handled this subconsciously: I knew that if I put certain controls in place for a particular client, or a group of people at a client, they would choose not to adhere to them. So I have always tried to adapt the best-practice control to the client's corporate culture and risk tolerance. Now I have a way to describe what I was doing and explain why it is important.
The research also defined a scale for each of the areas that effectively ranged from “old school” to “modern” (my words not theirs):
- Control Environment (Traditional – Progressive)
- Control Mechanisms (Formal – Informal)
- Control Execution (Stable – Evolutionary)
- Social-emotional Outcomes (Individual – Collective)
If you think of each as a sliding scale, the researchers' theory was that, to be effective, controls must be aligned vertically within a particular IS function (e.g., application development, information security, change management). Their research showed the theory to hold.
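One way to picture the vertical-alignment idea is as a quick check over scale positions. This is purely my own illustrative sketch: the dimension names come from the paper, but the 1-5 scoring, the spread threshold, and the `is_aligned` function are assumptions of mine, not the researchers' instrument.

```python
# Each IS function rates its four control dimensions on a 1-5 scale,
# where 1 is the "old school" end and 5 the "modern" end of each continuum.
DIMENSIONS = [
    "control_environment",   # Traditional (1) - Progressive (5)
    "control_mechanisms",    # Formal (1) - Informal (5)
    "control_execution",     # Stable (1) - Evolutionary (5)
    "social_emotional",      # Individual (1) - Collective (5)
]

def is_aligned(scores: dict, max_spread: int = 1) -> bool:
    """Treat the dimensions as vertically aligned when their scale
    positions fall within max_spread points of one another."""
    values = [scores[d] for d in DIMENSIONS]
    return max(values) - min(values) <= max_spread

# An agile application-development group scoring "modern" across the
# board is aligned; pairing a progressive environment with rigidly
# formal mechanisms is not.
app_dev = {"control_environment": 4, "control_mechanisms": 4,
           "control_execution": 5, "social_emotional": 4}
mixed = {"control_environment": 5, "control_mechanisms": 1,
         "control_execution": 4, "social_emotional": 3}
print(is_aligned(app_dev))  # True
print(is_aligned(mixed))    # False
```

The threshold is arbitrary here; the point is simply that effectiveness, per the theory, depends on where the four sliders sit relative to one another, not on where any single one sits.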
This alignment approach filled the gap for me on how to design controls for the evolving agile and iterative methodologies that are making their way from software engineering into other operational areas. I will have to read more about the "modern" control mechanisms and figure out how to implement and monitor them so that I can help some of my more innovative clients better manage their risk without over-constraining their people's creativity.
Information Security Effectiveness
This last paper was the one I was asked to comment on. The researchers designed a survey that measured the maturity of an entity's information security program and correlated it with the likelihood of a severe information security incident, such as loss of reputation or loss of availability.
While I found what I perceived to be flaws in their data collection, I really liked their approach and felt that the survey was something that could be implemented to support consulting and audit activities. In fact, as I was reading the paper, my mind kept turning to how I could incorporate their survey, and its association with the COBIT v4.1 Maturity Model, into the services my firm provides to its clients.
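The kind of association the paper tests can be sketched in a few lines. To be clear, the data below is made up for illustration, not from the study, and the maturity scores are just 0-5 ratings in the spirit of the COBIT v4.1 maturity model: the question is whether higher program maturity goes with fewer severe incidents.

```python
from math import sqrt
from statistics import mean, pstdev

# Hypothetical survey responses:
# (average maturity score 0-5, severe incident in period? 1 = yes)
responses = [(1, 1), (2, 1), (2, 1), (3, 0),
             (4, 0), (5, 0), (4, 1), (3, 0)]

def point_biserial(data):
    """Point-biserial correlation between a continuous score and a
    binary outcome; a negative value means higher maturity scores
    accompany fewer incidents."""
    scores = [s for s, _ in data]
    hit = [s for s, i in data if i == 1]    # entities with an incident
    miss = [s for s, i in data if i == 0]   # entities without one
    n, n1, n0 = len(scores), len(hit), len(miss)
    return (mean(hit) - mean(miss)) / pstdev(scores) * sqrt(n1 * n0 / n**2)

r = point_biserial(responses)
print(round(r, 3))  # -0.612 for this toy data: more maturity, fewer incidents
```

In practice a consultant would feed real survey scores in and, with enough responses, also test significance, but even this toy version shows how the survey could anchor a before/after benchmark in an advisory engagement.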
Probably the biggest current limitation, though, is that much of the survey's content requires someone at least moderately versed in information security lingo to complete it. So it would probably need to be a tool used by CITPs and CISAs. That's not necessarily bad, but it would limit broader use and adoption of the survey.
More work and the next conference
The papers discussed at the conference showed me that academic research does have the potential to help move the CPA profession forward, but more work needs to be done to take these academic findings to the point where they can be "commercialized" and applied in practice. I wonder who will step up to fill this gap?
The AICPA would probably be the best fit, but a lack of subject matter expertise and resources on the IMTA staff makes it unlikely that anything would happen anytime soon. The AICPA's current academic relations area is primarily focused on student recruitment and pathways to becoming CPAs, so that group is unlikely to be able to help much either. ISACA may be too focused on technical controls and lack enough exposure to the business outcomes produced by information security issues to provide meaningful data. Hmm…
Well, at least the conference was a success, and we've already started discussions with another specialty area in the AICPA to look at "Big Data". I'm really curious to see whether they are able to develop anything that could really be applied in the real world.