3073 Supervision and Review Considerations When Using Technology Solutions
Jun-2020

Overview

This topic explains:

  • Supervision and review when using technology solutions
  • Additional supervision and review considerations when using a visualization
  • Supervision and review considerations when using an approved tool

CPA Canada Assurance Standards

Office

Annual Audit

Performance Audit, Special Examination, and Other Assurance Engagements

CSQC 1.A35. A review consists of consideration of whether:

  • The work has been performed in accordance with professional standards and applicable legal and regulatory requirements;
  • Appropriate consultations have taken place and the resulting conclusions have been documented and implemented;
  • There is a need to revise the nature, timing and extent of work performed;
  • The work performed supports the conclusions reached and is appropriately documented;
  • The evidence obtained is sufficient and appropriate to support the report; and
  • The objectives of the engagement procedures have been achieved.

CAS 300.A16 The nature, timing and extent of the direction and supervision of engagement team members and review of their work vary depending on many factors, including:

  • The size and complexity of the entity.
  • The area of the audit.
  • The assessed risks of material misstatement (for example, an increase in the assessed risk of material misstatement for a given area of the audit ordinarily requires a corresponding increase in the extent and timeliness of direction and supervision of engagement team members, and a more detailed review of their work).
  • The capabilities and competence of the individual team members performing the audit work.

CAS 220 contains further guidance on the direction, supervision and review of audit work.

CAS 220.17 On or before the date of the auditor's report, the engagement partner shall, through a review of the audit documentation and discussion with the engagement team, be satisfied that sufficient appropriate audit evidence has been obtained to support the conclusions reached and for the auditor's report to be issued. (Ref: Para. A19-A21)

CAS 220.A18 A review consists of consideration whether, for example:

  • Appropriate consultations have taken place and the resulting conclusions have been documented and implemented;
  • The work performed supports the conclusions reached and is appropriately documented;
  • The evidence obtained is sufficient and appropriate to support the auditor’s report; and
  • The objectives of the engagement procedures have been achieved.

Considerations Relevant Where a Member of the Engagement Team with Expertise in a Specialized Area of Accounting or Auditing Is Used (Ref: Para. 15-17)

CAS 220.A21 Where a member of the engagement team with expertise in a specialized area of accounting or auditing is used, direction, supervision and review of that engagement team member’s work may include matters such as:

  • Agreeing with that member the nature, scope and objectives of that member’s work; and the respective roles of, and the nature, timing and extent of communication between that member and other members of the engagement team.
  • Evaluating the adequacy of that member’s work including the relevance and reasonableness of that member’s findings or conclusions and their consistency with other audit evidence.

Responsibilities of the Engagement Partner

CSAE 3001.37 The engagement partner shall take responsibility for the overall quality on the engagement. This includes responsibility for:

(c) Reviews being performed in accordance with the firm’s review policies and procedures, and reviewing the engagement documentation on or before the date of the assurance report; (Ref: Para. A73)

CSAE 3001.A73 Under CSQC 1, the firm’s review responsibility policies and procedures are determined on the basis that the work of less experienced team members is reviewed by more experienced team members.

CSAE 3001.A201 Documentation may include a record of, for example:

  • The identifying characteristics of the specific items or matters tested;
  • Who performed the engagement work and the date such work was completed;
  • Who reviewed the engagement work performed and the date and extent of such review; and
  • Discussions of significant matters with the appropriate party(ies) and others, including the nature of the significant matters discussed and when and with whom the discussions took place.

OAG Guidance

When using technology solutions in the audit, there are two areas of focus to be considered:

  • Documentation Standards—The sufficiency of audit documentation included within the workpapers; and

  • Supervision and Review—The documentation needed to assist the reviewer in meeting their supervision and review responsibilities in the normal course of the audit.

Supervision and review when using technology solutions

The reviewer is responsible for reviewing the output from a technology solution in the context of the audit procedures being performed, as well as reviewing any inputs, including checking that the correct source data and parameters (e.g., time period) were used. This also includes reviewing procedures performed over the completeness and accuracy of the source data. The reviewer is also responsible for understanding the objective of the procedures and how that objective is supported by the technology. How a reviewer accomplishes these responsibilities may vary based on how the technology is used in the audit, the complexity of the technology and the underlying evidence available.
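
To illustrate the kinds of input checks a reviewer might re-perform, the following is a minimal sketch in Python (pandas). The file name, column names, period and comparison figures are all hypothetical; in practice the record count and control total would come from information obtained independently of the extract.

```python
import pandas as pd

# Hypothetical general ledger extract used as input to a technology
# solution; the file and column names are illustrative only.
gl = pd.read_csv("gl_detail_fy2020.csv", parse_dates=["posting_date"])

# Check that the correct period was extracted (assumed fiscal year).
in_period = gl["posting_date"].between("2019-04-01", "2020-03-31")
assert in_period.all(), "Extract contains postings outside the audit period"

# Reconcile a record count and a control total to figures obtained
# independently of the extract (illustrative amounts).
expected_rows, expected_total = 184_302, 951_204_887.42
assert len(gl) == expected_rows, "Record count does not agree"
assert abs(gl["amount"].sum() - expected_total) < 0.01, "Control total does not agree"
```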

For example, when Excel is used to perform a substantive analytic of interest expense, the reviewer may review the formulae in the Excel workbook maintained in the workpapers, or may recalculate a few of the key formulae independently, whether in Excel, in another technology or with a calculator. When an IDEA project or a Robotic Process Automation (RPA) solution is developed to perform the same substantive analytic, the reviewer may open the script or project in IDEA, or the automation in the RPA software, and walk through the workflow with the preparer. If the preparer has included text boxes explaining each function, the reviewer may be able to review that documentation to understand the functions performed. The reviewer may also re-perform some of the functions independently, whether in IDEA, in Excel or with a calculator.
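
As a sketch of what such an independent recalculation might look like outside Excel or IDEA, the Python example below rebuilds the expectation for a simple interest expense analytic. The balances, stated rate and recorded amount are made-up illustrations, and the expectation model (average balance times stated rate) is only one possible design.

```python
# A minimal sketch of independently re-performing the key formula in a
# substantive analytic of interest expense; all figures are illustrative.
monthly_debt_balances = [
    12_500_000, 12_500_000, 11_800_000, 11_800_000, 11_800_000, 13_100_000,
    13_100_000, 13_100_000, 12_400_000, 12_400_000, 12_400_000, 12_400_000,
]
stated_annual_rate = 0.045           # per the loan agreement (assumed)
recorded_interest_expense = 563_100  # per the trial balance (assumed)

average_balance = sum(monthly_debt_balances) / len(monthly_debt_balances)
expected_interest = average_balance * stated_annual_rate
difference = recorded_interest_expense - expected_interest

print(f"Expectation: {expected_interest:,.0f}; difference: {difference:,.0f}")
# The reviewer would compare the difference to the threshold set when
# designing the analytic and follow up if it is exceeded.
```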

The same concepts apply when reviewing a visualization. For example, consider a bar chart in IDEA showing the dollar value of journal entries posted by each person, with the dollar value on the Y axis and the individuals on the X axis. The reviewer may open the visualization file and click on the bar for each person to display the number of entries that person posted and the exact dollar amount, allowing the reviewer to agree a selection of items back to the source data.
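
A reviewer could also re-perform the figures behind such a chart directly from the source data. The sketch below does this in Python (pandas); the file and column names (je_detail.csv, posted_by, amount) are assumptions for illustration.

```python
import pandas as pd

# Hypothetical journal entry detail; names are illustrative only.
je = pd.read_csv("je_detail.csv")

# Recompute the value underlying each bar: total dollar value and number
# of entries posted by each person, to agree back to the visualization.
by_person = je.groupby("posted_by")["amount"].agg(total="sum", entries="count")
print(by_person.sort_values("total", ascending=False))
```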

The reviewer is not evaluating whether the technology performed its underlying functionality correctly (i.e., we are not testing whether the coding or logic of an "off-the-shelf" technology, such as IDEA, performs core functions such as summation or filtering appropriately); rather, the consideration is whether the engagement team member applied the functionality appropriately to achieve the intended objective, and applied it to the appropriate population. For example, if a team creates a pivot table in Excel to summarize the different revenue streams subject to testing, the reviewer would likely check whether: (1) the correct filters were applied in the pivot, (2) the appropriate function was used (i.e., the appropriate amounts were summed or averaged) and (3) the entire population was included in the pivot. The same considerations would be relevant when a similar function is performed in IDEA or another technology.
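
The following sketch shows, in Python (pandas), how those three checks could be re-performed independently of the Excel pivot. The file and column names are hypothetical.

```python
import pandas as pd

# Hypothetical revenue detail; file and column names are illustrative.
rev = pd.read_csv("revenue_detail.csv")

# Checks (1) and (2): re-create the pivot with no filters applied,
# summing (not averaging) by revenue stream.
pivot = rev.pivot_table(index="revenue_stream", values="amount", aggfunc="sum")

# Check (3): confirm the entire population is included, i.e., the pivot
# total agrees to the total of the unfiltered detail.
assert abs(pivot["amount"].sum() - rev["amount"].sum()) < 0.01, \
    "Pivot excludes part of the population (check filters)"
print(pivot)
```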

When reviewing a more complex technology solution, the reviewer may rely on a review of the supporting documents created during the development process, such as a design document and a test plan with results, to gain evidence over the proper operation and application of the technology. For example, assume an engagement team performs access control testing 15 different times within the context of one engagement. The engagement team decides to develop a bot using RPA technology to test whether access for terminated employees was removed on a timely basis. The Data Analytics and Research Methods team defines the objectives of the test, programs the bot and performs testing on the bot to validate that it works as intended. The Data Analytics and Research Methods team would then run the bot for each population (in this instance, 15 times). In this scenario, the strategy, design, testing and results-of-testing documentation for the bot would be retained centrally in the audit file and referenced in the workpaper where the bot was run. In performing the review of the testing performed, the engagement team reviewer would understand the objectives and functions of the RPA solution and the testing performed to validate that the bot was functioning as intended.
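
The core logic of such a bot might resemble the following Python sketch. The data sources, column names and three-day grace period are all assumptions for illustration, not prescribed criteria.

```python
import pandas as pd

# Hypothetical source files; in practice these would be HR termination
# records and the system's access removal log.
hr = pd.read_csv("terminations.csv", parse_dates=["termination_date"])
access = pd.read_csv("access_log.csv", parse_dates=["access_removed_date"])

merged = hr.merge(access, on="employee_id", how="left")

# Flag terminated employees whose access was never removed, or was
# removed more than an allowed grace period (assumed 3 days) after
# the termination date.
grace = pd.Timedelta(days=3)
merged["exception"] = (
    merged["access_removed_date"].isna()
    | (merged["access_removed_date"] - merged["termination_date"] > grace)
)
exceptions = merged[merged["exception"]]
print(f"{len(exceptions)} exception(s) for follow-up")
```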

When supervising and reviewing the use of a technology solution by a specialist in accounting or auditing, also refer to OAG Audit 3101.

Additional supervision and review considerations when using a visualization

Design visualizations in a way that effectively communicates the intended information and message. For example, a user's interpretation of a graphic can be impacted by the scaling of the vertical or horizontal axis. The relative lengths of bars in a bar chart may be used to indicate the significance of differences in variables. However, the perception of significance can be impacted by the scale of the axes and other aspects of the graphic. If a graphic is improperly designed, it might result in us failing to identify an important matter that requires additional focus. It might also lead us to identify a matter for follow-up when no further work is warranted. As an example, the two graphics below give a significantly different impression of the variation in sales between regions, even though they were both created using the same data.

[Figure: Column graphs presenting the same regional sales data with two different vertical axis scales]
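
The effect can be reproduced with made-up figures, as in the Python (matplotlib) sketch below: the same values plotted against a zero-based axis and against a truncated axis give very different impressions. The regional figures are illustrative, not the data behind the graphics above.

```python
import matplotlib.pyplot as plt

# Illustrative regional sales data.
regions = ["East", "West", "North", "South"]
sales = [98, 103, 100, 105]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

# Panel 1: vertical axis starts at zero; variation looks small.
ax1.bar(regions, sales)
ax1.set_ylim(0, 120)
ax1.set_title("Axis from zero")

# Panel 2: truncated vertical axis; the same data suggests large differences.
ax2.bar(regions, sales)
ax2.set_ylim(95, 106)
ax2.set_title("Truncated axis")

plt.tight_layout()
plt.show()
```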

The reviewer of a visualization considers whether the graphic has been designed in a way that makes it difficult to understand, presents an incomplete or inaccurate picture of the audit risk being addressed, or inaccurately portrays a topic by using the wrong criteria (e.g., inappropriate data used to define the x and/or y axis).

The filters that are applied when creating a visualization in a software tool such as IDEA can also change how a user is likely to interpret the data depicted in the visualization. Therefore, a reviewer also needs to understand and assess the appropriateness of how filters are being used in the visualization.
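
For example, the Python (pandas) sketch below shows how a value filter can change which individual appears most prominent in a journal-entry visualization; the file, column names and threshold are illustrative assumptions.

```python
import pandas as pd

# Hypothetical journal entry detail, as in the earlier sketch.
je = pd.read_csv("je_detail.csv")

# Unfiltered view: total dollar value posted by each person.
all_entries = je.groupby("posted_by")["amount"].sum()

# The same view with a filter excluding entries below a threshold
# (threshold is illustrative): the ranking of posters can change, so the
# reviewer assesses whether the filter is appropriate for the objective.
filtered = je[je["amount"] >= 50_000].groupby("posted_by")["amount"].sum()

print(all_entries.sort_values(ascending=False).head())
print(filtered.sort_values(ascending=False).head())
```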

Supervision and review considerations when using an approved tool

In cases when the engagement team is using a technology solution approved for use by the Office, the reviewer is still responsible for understanding the scope and objectives of the use of the tool on the engagement, assessing the suitability of the tool for the audit procedure in which it is being used, reviewing the output in the context of the audit procedure being performed, and reviewing the appropriateness and reliability of any inputs into the tool. However, in accordance with OAG Audit 3072 concerning responsibilities when an engagement team uses an approved software tool, the engagement team is not responsible for testing or documenting the reliability of the tool itself (including its coding and logic), nor the completeness and accuracy of the tool's output. The engagement team is, however, responsible for performing audit procedures on the output, including taking any follow-up actions that may be necessary as a result of performing the procedure.

For example, when Data Analytics and Research Methods specialists utilize an approved tool to evaluate configurable settings, automated controls, user access and the associated segregation of duties conflicts for a specific ERP application for an assurance engagement, it is the engagement team's responsibility to understand and document the scope and purpose of the work performed, such as which ERP environments and company codes are subject to the tool, what audit procedures the use of the tool is supporting and whether the tool is suitable for such procedures, as well as to perform audit procedures on the outputs, such as evaluating whether any users had been assigned a superuser attribute. However, the reliability of the tool's functionality and the completeness and accuracy of its outputs will have been validated as part of the Office's approval process and would not be the responsibility of the engagement team.
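
By way of illustration only, a procedure performed on the tool's output might resemble the following Python sketch; the export format, column names and the attribute values shown are assumptions rather than features of any particular approved tool.

```python
import pandas as pd

# Hypothetical export of user access results produced by an approved
# tool; file and column names are assumptions for illustration.
users = pd.read_csv("tool_output_user_access.csv")

# Identify users assigned a superuser-type attribute for follow-up with
# the entity (the attribute values searched for are illustrative).
superusers = users[users["profile"].str.contains("SAP_ALL|SUPERUSER", na=False)]
print(superusers[["user_id", "profile", "last_login"]])
```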