
Watching the Watchdogs: An Examination of the PCAOB Quality Control

Inspection Reports on Triennially Inspected Audit Firms and the AICPA Peer

Review Reports

Srinivasan Ragothaman

Professor, Division of Accounting and Finance

Beacom School of Business, The University of South Dakota

Vermillion, SD 57069, USA Tel: (605) 677-6430

E-mail: [email protected]

[Accepted for presentation at the 2012 Deloitte Foundation /

University of Kansas Auditing Symposium]

Version2: April 2012

Comments are welcome. Comments from the workshop participants at the University of South

Dakota are appreciated. The usual disclaimer applies. Please do not quote without prior

permission from the author.


Watching the Watchdogs: An Examination of the PCAOB Quality Control

Inspection Reports on Triennially Inspected Audit Firms and the AICPA Peer

Review Reports

Abstract

While several studies have examined the PCAOB inspection reports (part I) and/or

AICPA peer review reports (see Lennox and Pittman 2010; Casterella et al. 2009; Anantharaman 2012; Hilary and Lennox 2005), studies examining deficiencies in Part II – the quality control

section of the PCAOB report are rare. This study examines the weaknesses identified in the

quality control section (that have been made public by the PCAOB after a lack of progress within

12 months by the firms) and compares them to the quality control weaknesses identified in

AICPA peer review reports – an apples-to-apples comparison. This paper reports on an analysis of

106 PCAOB Quality Control Inspection (PCAOB QC) reports for triennially inspected firms and

2,355 AICPA peer review reports for firms with fewer than 100 SEC audit clients. Both

univariate and multivariate tests indicate that PCAOB QC reports disclose a significantly higher

number of engagement performance (EP) deficiencies than peer review reports. Statistical tests

also indicate that the PCAOB QC reports disclose a significantly higher number of independence

weaknesses than peer review reports. Results are mixed for monitoring deficiencies. These

results indicate that the PCAOB inspectors are more thorough and tougher than peer reviewers

and may be providing ex ante incentives for auditors to improve audit quality (see DeFond

2010). Multiple regression results indicate that the engagement performance deficiencies are

positively and significantly associated with the number of clients per partner: the higher the partner workload, the higher the number of EP weaknesses identified. The number of

professional staff is negatively related to the number of deficiencies. This study incrementally

contributes to the discussion on the PCAOB QC inspection reports and has implications for

auditors, regulators, and academics.

Key words: PCAOB inspection; quality control deficiencies; audit quality; peer review.


Watching the Watchdogs: An Examination of the PCAOB Quality Control

Inspection Reports on Triennially Inspected Audit Firms and the AICPA Peer

Review Reports

“The most fruitful lesson is the conquest of one's own error. . . .

Whoever is ashamed of error will struggle against recognizing and

admitting it, which means that he struggles against his greatest

inward gain.” - Goethe, Maxims and Reflections

1. Introduction:

Francis (2011) groups audit quality metrics into input and output measures. One of the

four categories of output measures suggested by Francis (2011) is the regulatory inspection

report. Bedard et al. (2010) indicate that one of the measurable outputs of audit quality is Public

Company Accounting Oversight Board (PCAOB) inspection activities and reports. They also consider peer review results to be a measure of audit quality. Abbott et al. (2008) argue

that PCAOB inspection reports have become a “much more recognizable and accepted publicly

available indicator of audit quality.” In this paper, I examine, using statistical analyses, the

differences in these two output measures of audit quality – the PCAOB quality control inspection

reports and the AICPA peer review reports.

While several studies have examined the PCAOB inspection reports (part I) and/or

AICPA peer review reports (see Casterella et al. 2009; Anantharaman 2012; Lennox and Pittman 2010; Hilary and Lennox 2005; DeFond 2010; Glover et al. 2009; and others), studies examining Part II – the quality control section of the PCAOB report – are rare. This study

examines the weaknesses identified in the quality control section (that have been made public by

the PCAOB after a lack of progress within 12 months by the firms) and compares them to the

quality control weaknesses identified in AICPA peer review reports.


Lennox and Pittman (2010) report that the signaling role of peer review reports arises from the quality control weakness information that the PCAOB inspectors do not generally disclose for the first 12 months and never disclose for those firms that fix the problems within 12

months. However, with the passage of time and after many inspections, we now have data from

PCAOB for 107 firms that did not remediate their quality control problems. Out of 1,518 firms

inspected by the PCAOB (as of January 25, 2012), only 107 firms have had their quality control

(part II) criticisms made public1. Out of these 107 firms, only one firm is a large CPA firm with

more than 100 SEC audit clients and is excluded from the statistical analyses. The other 106

firms, which are triennially inspected by the PCAOB and for which we have information from the quality control section (Part II) of the PCAOB inspection report, constitute our sample.

This paper makes the following incremental contributions to the literature. I extend the

prior research in the area of PCAOB inspection reports in four ways. First, while several studies

have examined the PCAOB inspection reports (part I) and/or AICPA peer review reports, this

study examines the weaknesses identified in the PCAOB QC reports and compares them to the

quality control weaknesses identified in AICPA peer review reports – an apples-to-apples

comparison. Second, the PCAOB QC reports are compared both to 1) all peer review firms and

2) to peer review firms that received modified or adverse opinions using univariate and

multivariate tests. Third, I use an OLS regression to examine the relationship between engagement performance deficiencies and CPA firm characteristics. Fourth, the sample size of 106 PCAOB QC reports and 2,355 peer review reports is larger than the samples in some earlier studies (Casterella et al. 2009; Anantharaman 2012; Hilary and Lennox 2005).

[Footnote 1] "The quality control remediation process is central to the Board's efforts to cause firms to improve the quality of their audits and thereby better protect investors. The Board therefore takes very seriously the importance of firms making sufficient progress on quality control issues identified in an inspection report in the 12 months following the report. ... The Board can and does make the relevant criticisms public when a firm has failed to do so." (PCAOB statement on Publication of Inspection Report Quality Control Criticisms – October 17, 2011)

A big focus of this paper is on engagement performance deficiencies identified by the

PCAOB QC inspection reports. Engagement (audit) performance deficiencies are key defects in

the audit process. These are deficiencies that indicate that the engagement personnel have not

complied with applicable auditing standards, regulatory requirements, or the firm's own quality standards. Croteau (2011) argues that both the PCAOB and the audit firms have to concentrate

on identifying the root causes of audit performance deficiencies2. According to Croteau (2011),

the PCAOB has started training its inspectors on the process of root cause analysis and the

PCAOB’s inspection processes have been revamped to include lessons learned from root cause

analyses. He also calls on the audit firms to incorporate root cause analysis into their own internal

quality control systems and address root causes of deficiencies.

Section two describes prior research in this area. Section three enumerates the PCAOB

inspection process and provides some descriptive statistics. Section four describes the data

sources. Univariate and multivariate (logit) results are discussed in section five. Section six

reports on the OLS regression results. Section seven provides a brief overall discussion of the

results and a summary.

[Footnote 2] Croteau (2011) states: “I believe that the identification and analysis of root causes are key

inputs into what I’ll call, for lack of a better description, the ‘audit performance feedback loop.’

A feedback loop is a common and powerful tool in control and oversight systems that can be

used to monitor and continually improve audit performance based upon evaluations of actual and

desired audit performance outputs.”


2. Prior Research:

Prior research on AICPA peer review has produced mixed results. Some studies have

documented positive impacts of peer review while others have questioned the benefits of this

self-regulation (see Casterella et al. 2009; Hilary and Lennox 2005; Anantharaman 2012;

Lennox and Pittman 2010; DeFond 2010; Fogarty 1996; POB 2002; Abbott et al. 2008;

Hermanson et al. 2007; Palmrose 2006; Gunny et al. 2009; and Daugherty and Tervo 2010).

Casterella et al. (2009) examine the effectiveness of the AICPA peer review program using a

sample of 158 peer review reports and a unique data set from an insurance company that had to

deal with 213 audit-related claims. They conclude that AICPA peer review findings are a useful

predictor of audit failures in their sample and are effective in providing signals about firm-level

audit quality. Hilary and Lennox (2005) examine a sample of 1,001 peer review reports and conclude that peer review opinions are associated with perceived audit quality. They find that firms receiving clean opinions gained clients and those receiving adverse or modified opinions lost clients. Hilary and Lennox (2005) also report that firms that review other firms and larger

firms tend to receive more positive peer reviews.

Anantharaman (2012) examines a sample of 407 firms that received both a peer review

opinion and a PCAOB inspection report and finds that peer reviewers with industry experience similar to that of the peer reviewed firm tended to agree more with the PCAOB inspectors.

Interestingly, peer reviewers in the same geographic area as the peer reviewed firm tended to be

more unfavorable than the PCAOB inspectors. She also found that peer reviewers with industry

expertise or from the same geographic area as the reviewed firms were able to provide opinions

that are informative about future audit failures. Gunny and Zhang (2009) examine a sample of


295 PCAOB inspection reports. They use four proxy measures for audit

quality: “abnormal current accruals, propensity to restate earnings, propensity to just meet an

analyst forecast, and propensity to issue a going concern.” They find that lower audit quality is

positively associated with firms receiving a seriously deficient inspection report from the

PCAOB.

Lennox and Pittman (2010) examine 1,982 peer review reports and 545 PCAOB

inspection reports to understand audit quality signals. Their findings suggest that the audit clients

of PCAOB inspected firms do not find the inspection reports to be useful as a signal for audit

quality. They recommend that the PCAOB inspectors include an evaluative summary and a

quality rating of the firm they inspected. Colbert and O’Keefe (1995) show that audit quality improves for firms that regularly and continuously participate in peer reviews. The fact that firms with issuer audit clients continue to subject themselves to peer reviews even after PCAOB inspection became mandatory for them indicates that there are benefits associated with peer reviews. Small firms like quality ratings! It is certainly possible that both peer reviews and PCAOB inspections have their strengths and weaknesses (see Anantharaman 2012).

Offermanns and Peek (2011) examine market reactions to 224 first-round and 134

second-round PCAOB inspection reports issued between January 2005 and March 2010 and

conclude that these reports are a useful indicator of audit quality. They show that shareholders

care about the signals about audit quality contained in the inspection reports by documenting significant stock price reactions to these deficiency reports. They demonstrate that the magnitude of these stock price reactions is about 29 percent of the market response to earnings announcements. Their findings apply most strongly to small audit firms that audit fewer than 100

issuers. Landis et al. (2011) examine 770 PCAOB inspection reports issued between 2005 and


2008. They argue that small audit firms have more opportunities to improve audit quality and

suggest that the PCAOB inspection reports can be used to motivate triennial (smaller) firms to

remediate the defects. They also report that the number of Part I deficiencies identified is decreasing over time. This declining trend is consistent with either improved audit quality over

time or a change in PCAOB inspection philosophy. Hermanson and Houston (2008) conclude

that firms that receive QC criticisms are smaller, have fewer audit resources and are

understaffed.

Weaknesses associated with AICPA peer reviews have been described in prior literature.

For example, Fogarty (1996) argues that peer reviewers are unlikely to detect key audit

deficiencies and describes peer reviews as “shrouded in secrecy.” Public Oversight Board (POB

2002) has identified several problems with the peer review process3 and has indicated that peer reviewers are not independent. POB (2002, p. 14) states that “The AICPA and several of the Big 5 firms, in the view of some, saw POB’s role as one of a “shield” for the profession rather than as an

independent overseer.” Russell and Armitage (2006) report that peer review firms used a

loophole (self-selected engagements for peer review) to obtain a better peer review.

Independence concerns about peer reviews are not unique to accounting. Travis and Collins

(1991) argue that peer reviews at the National Science Foundation grant programs suffer from

an “old boy network” problem.

[Footnote 3] POB (2002, p. 15) states: “Another problem is that monitoring of firm’s accounting and auditing practices by the peer review process has come to be viewed as ineffective, either as a diagnostic or remedial tool. More important, the process has lost credibility because it is perceived as being “clubby” and not sufficiently rigorous. . . . Other problems include the fact that the current governance structure does not have the weight of a congressional mandate behind it.”

3. The PCAOB Inspection:

When several accounting scandals broke out in the late 1990s and early 2000s, Congress was forced to take action. Some of the largest scandals, involving Enron and WorldCom, resulted in very large corporate bankruptcies. Due to its role as the auditor of these companies, Arthur

Andersen was at the center of several of these scandals and the firm collapsed when criminal

charges were brought against this Big 5 firm. Congress passed the Sarbanes-Oxley Act

(SOX) of 2002 with record speed. Section 101 of SOX established the Public Company

Accounting Oversight Board (PCAOB) to oversee the public accounting profession.4 Section

104 of SOX assigns PCAOB the responsibility to inspect registered public accounting firms and

to issue a report on its findings5. The PCAOB has adopted the supervisory model and uses a

risk-based audit approach to seek out areas where audit problems are most likely to occur. After

all, an ounce of prevention is better than a pound of cure!

The PCAOB inspection report contains four parts. Part I of this report is captioned

“Inspection Procedures and Certain Observations.” Part I gives some information about the

registered firm that is being inspected – the name, number of audit offices, number of partners

(sole proprietor, shareholder), number of professional staff, number of issuer clients, dates of

inspection, etc. Part I also describes the type of inspection (audit engagement review and/or

Quality Control System review), number of issuers (names not identified) examined, and

significant deficiencies discovered during the audit engagement review. All of the above

information is made public by the PCAOB by posting it on the Board’s website. However, results of

the Quality Control System review are not disclosed to the public initially.

[Footnote 4] In 1984, the U.S. Supreme Court suggested that the independent public audit is a “public watchdog function” (see United States v. Arthur Young & Company, 465 U.S. 805 (1984)). “An auditor is a watchdog, but not a bloodhound.” - Lord Justice Lopes in the Kingston Cotton Mills case (1896).

[Footnote 5] SOX Section 104 requires that firms with more than 100 issuer audit clients be inspected annually and that other firms with issuer audit clients numbering 100 or fewer be inspected once in three years. More frequent inspections of smaller firms are permitted.


Part II and Part III are the non-public portions of the inspection report. Part II of the PCAOB inspection report is titled “Issues Related to Quality Controls”; it contains non-public information and is initially omitted from the public portion of the report. The inspected firm

is given a year to remediate the quality control deficiencies identified by the inspection team. If

these quality control criticisms are addressed to the satisfaction of the PCAOB, the criticisms are

not made public. If the inspected firm fails to address these deficiencies within 12 months, the

PCAOB issues an amended inspection report and publicly discloses these quality control

deficiencies. The PCAOB’s quality control standard, QC Section 20,6 provides that “the quality control policies and procedures applicable to a firm's accounting and auditing practice should encompass the following elements:

a. Independence, Integrity, and Objectivity

b. Personnel Management

c. Acceptance and Continuance of Clients and Engagements

d. Engagement Performance

e. Monitoring”

[Insert Table 1 about here]

Engagement Performance Quality Control (QC) deficiencies disclosed in Part II of the

PCAOB Inspection reports of the 106 sample firms are summarized in Table 1. Engagement

performance deficiencies account for by far the largest number of weaknesses identified in Part II

[Footnote 6] According to PCAOB QC 20.18: “Policies and procedures for Engagement Performance

encompass all phases of the design and execution of the engagement. To the extent appropriate

and as required by applicable professional standards, these policies and procedures should cover

planning, performing, supervising, reviewing, documenting, and communicating the results of

each engagement. Where applicable, these policies and procedures should also address the

concurring partner review requirements applicable to SEC engagements as set forth in

membership requirements of the SEC Practice Section of the AICPA. [As amended, applicable

to a CPA firm's system of quality control for its accounting, auditing, and attestation practice as

of January 1, 2000, by Statement on Quality Control Standards No. 4.]”


reports. There were 245 engagement performance deficiencies for these 106 firms, a mean of 2.31 defects per firm. Sixty-one of these EP defects relate to issues with technical competence, due care, or professional skepticism. That is a severe indictment of basic audit quality: 61 of the 106 firms had this defect, were given 12 months to remediate, and did not correct the problems. Some of the other major deficiencies (counts in parentheses) identified

include: auditor communication with audit committee (45), concurring partner review (45),

appropriate procedures (23), fraud procedures (16), and testing appropriate to audit (15).

[Insert Table 2 about here]

Descriptive statistics for 2,355 peer review opinions are provided in Panel A of Table 2.

These peer review reports were issued between 1998 and 2008. This is a much larger dataset

than the one used in Hilary and Lennox (2005), which had only 1,001 peer review opinions.

Table 2 (Panel A) indicates that a vast majority of the opinions – 2,280 out of 2,355 (96.8%) – are unmodified. Only 64 (2.7%) opinions are modified and an even smaller number – 11 (0.5%) – are adverse opinions. The fact that only 75 of the 2,355 opinions are adverse or modified may indicate either that most firms had good quality control systems or that deficiencies were identified but the reports were not modified. It could also mean that the peer review teams did not find serious

weaknesses because they were “clubby” reviews (POB 2002). Out of the 2,280 unmodified

reports, only 1,237 had zero weaknesses, i.e., are clean reports. The other 1,047 firms had one or

more weaknesses even though they received unmodified opinions. The average number of

weaknesses in unmodified reports is 0.832 while it is larger for the other two types of opinions.

Firms with modified opinions had a mean of 3.156 weaknesses while firms that received adverse

opinions had a staggering mean of 6.91 quality control weaknesses. These peer review opinions

keep the audit client information confidential. Panel B of Table 2 gives a description of types of


weaknesses identified in the 106 PCAOB QC inspection reports. In these 106 inspection reports,

24 firms were cited for independence, integrity & objectivity deficiencies, 3 firms were cited for

personnel management deficiencies and 21 firms were cited for monitoring deficiencies. Not a

single firm was cited for a client acceptance deficiency. A vast majority of the deficiencies (245, or

82.62 percent) cited in the PCAOB QC inspection reports belonged to the “engagement

performance” deficiency category. As discussed in Hilary and Lennox (2005), a large majority

of deficiencies identified in peer review reports also belonged to the “engagement performance”

category.
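As a quick arithmetic sketch, the group counts and mean weakness counts reported for Table 2, Panel A imply an overall average number of weaknesses per peer review report via a weighted mean:

```python
# Group sizes and mean weakness counts as reported for Table 2, Panel A
groups = {
    "unmodified": (2280, 0.832),
    "modified":   (64, 3.156),
    "adverse":    (11, 6.91),
}
n_total = sum(n for n, _ in groups.values())                     # 2,355 reports
overall_mean = sum(n * m for n, m in groups.values()) / n_total  # ~0.92 per report
```

The implied overall mean of roughly 0.92 weaknesses per report reflects how heavily the unmodified group dominates the sample.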

4. Data and Sample

The PCAOB makes its inspection reports available on its website as soon as they are ready to be posted. As of January 25, 2012, the PCAOB had posted inspection reports for 1,518 firms. Of

these 1,518 firms inspected by the PCAOB, only 107 firms (as of January 20, 2012) have had

their quality control (part II) criticisms made public. Out of these 107 firms, only one firm is a

large CPA firm with more than 100 SEC audit clients and is excluded from the statistical

analyses. The other 106 firms, which are triennially inspected by the PCAOB and for which we have information from the quality control section (Part II) of the PCAOB inspection report, constitute our sample. The PCAOB data were hand collected.

The database used in this study consists of several variables. These variables are:

ENGPERF = Engagement Performance deficiencies

INDEP = Independence deficiencies

MONITOR = Monitoring deficiencies

SECCLNT = Number of SEC clients

OFFICE = Number of audit offices

PARTNER = Number of partners, proprietors and shareholders

PROSTAFF = Number of professional staff including partners

The AICPA peer review dataset was obtained from Professor Clive Lennox. It contains the first

seven variables mentioned in the above list for 2,379 peer review reports from 1997 through 2008. Data for 24 large firms that had 100 or more SEC audit clients were dropped, and this study uses data for the other 2,355 firms that had fewer than 100 SEC audit clients. This peer review dataset also contains one additional variable: the peer review opinion.


5.1 Univariate Tests:

Descriptive data for key variables appear in tables 3 and 4. For peer review firms and

PCAOB inspected firms separately, these two tables report the mean and standard deviation for

variables used in this study. As reported in Table 3, PCAOB inspected firms have more SEC clients

per firm than peer review firms. However, for peer review firms in our sample the average number

of audit offices, the average number of partners, and the average number of professional staff are

all higher. Interestingly, for PCAOB inspected firms, the average numbers of quality control

deficiencies related to independence, engagement performance, and monitoring are all higher.

[Insert Table 3 about here]

Table 3 presents the results of t-tests for mean differences between firms with peer review

reports (N=2,355 or 2,178 depending on data availability) and firms with PCAOB quality control

reports made public (N=106) for different variables of interest. Here we compare all peer review firms for which data are available with the PCAOB inspected firms. T-test results confirm

(at the 1 percent level) that PCAOB inspected firms have higher numbers of engagement performance deficiencies and independence-related deficiencies than peer review firms. T-test results also confirm (at the 5 percent level) that PCAOB inspected firms have a higher number of monitoring-related deficiencies than peer review firms. T-test results strongly indicate (at the 1 percent level) that peer review firms have fewer SEC clients per partner than PCAOB inspected firms. The univariate results provide statistically significant evidence supporting the hypotheses that peer review firms have more partners, professional staff, and audit offices than the PCAOB inspected firms.
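One standard way to run such mean-difference tests is Welch's two-sample t-test, which does not assume equal variances across the two groups. The sketch below is a minimal illustration; the deficiency counts are invented for the example and are not the study's data:

```python
import numpy as np

def welch_t(a, b):
    """Welch's two-sample t statistic and its approximate degrees of freedom
    (unequal variances allowed, as is natural for groups of different sizes)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    t = (a.mean() - b.mean()) / np.sqrt(va + vb)
    df = (va + vb) ** 2 / (va**2 / (len(a) - 1) + vb**2 / (len(b) - 1))
    return t, df

# Hypothetical engagement performance deficiency counts (illustrative only):
pcaob_qc = [2, 3, 4]   # pretend PCAOB QC firms
peer     = [0, 1, 2]   # pretend peer review firms
t, df = welch_t(pcaob_qc, peer)
```

A positive t indicates a higher mean deficiency count for the first group; the statistic is compared against the t distribution with df degrees of freedom (equivalently, `scipy.stats.ttest_ind(..., equal_var=False)` computes the same test with a p-value).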

[Insert Table 4 about here]


Table 4 presents the results of t-tests for mean differences between firms with peer review

reports (N=75) and firms with PCAOB quality control reports made public (N=106) for different

variables of interest. Here we compare the PCAOB inspected firms with only those peer review firms that received a modified or adverse opinion on their peer reviews. The PCAOB makes

public quality control criticisms only for those firms that do not remediate their quality

control processes within 12 months of the PCAOB inspection report. One can argue that these

PCAOB QC firms are seriously deficient in their quality control processes and they should be

compared to peer review firms that have severe quality control problems as evidenced by their

receipt of an adverse or modified opinion on their peer reviews.

T-test results confirm (at the 5 percent level) that PCAOB inspected firms have higher numbers of engagement performance deficiencies than these peer review firms. T-test results also confirm (at the 1 percent level) that PCAOB inspected firms have fewer monitoring-related deficiencies than these peer review firms. T-test results strongly indicate (at the 1 percent level) that peer review firms have fewer SEC clients per partner than PCAOB inspected firms. The univariate results do not support the hypotheses that peer review firms have more partners, professional staff, and audit offices than the PCAOB inspected firms; these t-test differences are not statistically significant.


5.2 Multivariate Tests I:

Using the independent variables in a multivariate context allows us to examine their

relative explanatory power and can lead to better predictions. The primary objective of a

multivariate technique here is to classify firms correctly into mutually exclusive groups. An obvious choice is logistic regression (the LOGIT model). The dependent variable is a dichotomous (0,1) variable representing the two groups: AICPA peer review reports (Peer) and PCAOB quality control reports that have been made public (PCAOB QC). We use the same independent variables as in the univariate tests. LOGIT does not require that these explanatory variables have a multivariate normal distribution.

In this study, the following logistic regression (LOGIT) model is proposed:

Pr(Y=1|X) = F(β0 + β1X1 + β2X2 + ... + βKXK)

The dependent variable Y is a dichotomous (0, 1) variable representing the two groups, PCAOB QC

reports (Y=1) and peer review reports (Y=0). The independent variables X1, X2, ..., XK include the

number of Engagement Performance deficiencies, number of Independence deficiencies and the

number of Monitoring deficiencies described earlier. Specifically these explanatory variables are:

ENGPERF = # of Engagement Performance deficiencies

INDEP = # of Independence deficiencies

MONITOR = # of Monitoring deficiencies

It is assumed that no exact linear dependencies exist among the X's, and that the relationship between Y and the X's is logistic, i.e., Pr(Y=1|X) = exp(Σk βkXk) / [1 + exp(Σk βkXk)].

The null hypotheses are: H0: βk = 0, where k = 1, 2, 3.
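To make the logistic response function concrete, here is a pure-Python sketch; the coefficient values are invented for illustration and are not estimates from this study.

```python
# Logistic response: Pr(Y = 1 | X) = exp(z) / (1 + exp(z)), z = b0 + sum(bk * Xk).
from math import exp

def logit_prob(x, beta):
    """Probability that a report is a PCAOB QC report (Y = 1) given counts x."""
    z = beta[0] + sum(b * xi for b, xi in zip(beta[1:], x))
    return exp(z) / (1 + exp(z))

# x = (ENGPERF, INDEP, MONITOR); hypothetical coefficients, not fitted values
beta = (-3.0, 1.0, 1.2, -0.4)
print(round(logit_prob((2, 1, 1), beta), 3))  # moderate deficiency counts
print(round(logit_prob((5, 3, 1), beta), 3))  # heavier deficiency counts
```

Maximum likelihood estimation then chooses the βk that best fit the observed 0/1 labels; positive coefficients raise the fitted probability that a report is a PCAOB QC report.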


Logit Results: Logit results appear in table 5, columns I and II. In Logit model I (table 5, col. I)

the sample size is 2,461 – 2,355 peer review firms (all peer review firms in the Lennox database)

and 106 PCAOB QC firms that are triennially inspected. Null hypothesis 1 (H1) suggests that there

is no statistically significant difference in the number of independence deficiencies between peer

and PCAOB QC firms. The coefficient for the INDEP variable is 1.287 and is statistically

significant at the .0001 level. This indicates that there is a significant difference in the number of

deficiencies reported by PCAOB QC reports and peer review reports. PCAOB QC reports are

significantly more likely to report a higher number of independence deficiencies than peer review

reports.

[Insert Table 5 about here]

Null hypothesis 2 (H2) suggests that there is no statistically significant difference in the

number of engagement performance deficiencies between peer and PCAOB QC firms. The

coefficient for the ENGPERF variable is 1.060 (table 5, col. I) and is statistically significant at the

.0001 level. This indicates that there is a significant difference in the number of engagement

performance deficiencies reported by PCAOB QC reports and peer review reports. PCAOB QC

reports are significantly more likely to identify a higher number of engagement performance

deficiencies than peer review reports. AICPA peer reviews have been criticized as clubby and incestuous (Anantharaman 2012). Gunny and Zhang (2009) report that PCAOB inspection reports are able to distinguish earnings quality while peer review reports failed to do so. Moreover, PCAOB inspectors can be more objective, since they work for a quasi-governmental entity.7

These inspectors do not have to worry about another CPA firm peer reviewing their work and

hence can be independent.

7 Gradison and Boster (2010) state: “We believe that this (remediation) provision has contributed directly to observable, though hard to quantify, improvements in audit quality.”

Null hypothesis 3 (H3) suggests that there is no statistically significant difference in the number of monitoring deficiencies between peer and PCAOB QC firms. The

coefficient for the MONITOR variable is -0.403 and is not statistically significant. This indicates that

there is no significant difference in the number of monitoring deficiencies reported by PCAOB QC

reports and peer review reports.

In Logit model II (table 5, col. II) the sample size is 181 – 75 peer review firms (only

peer review firms that received a modified or an adverse opinion) and 106 PCAOB QC firms that

are triennially inspected. Null hypothesis 1 (H1) suggests that there is no statistically significant

difference in the number of independence related deficiencies between peer and PCAOB QC

firms. The coefficient for the INDEP variable is 0.299 and is statistically insignificant. This indicates

that there is no significant difference in the number of independence related deficiencies reported

by PCAOB QC reports and peer review reports. Null hypothesis 2 (H2) suggests that there is no

statistically significant difference in the number of engagement performance deficiencies between

peer and PCAOB QC firms. The coefficient for the ENGPERF variable is 0.436 and is statistically

significant at the .0001 level. This indicates that there is a significant difference in the number of

engagement performance deficiencies reported by PCAOB QC reports and peer review reports.

PCAOB QC reports are more likely to contain a significantly higher number of engagement performance deficiencies than peer review reports. The PCAOB conducts inspections using the

supervisory approach. Its objective is to “conduct a risk-focused inspection program for

registered public accounting firms that evaluates and identifies areas of potential improvement in

the practices, processes and quality controls of these firms, ...” (PCAOB Strategic Plan, 2007-

2012). The PCAOB offers a reasonable amount of time (12 months) for audit firms to remediate

the QC criticisms. If the Board is satisfied that the QC criticisms have been satisfactorily

addressed, these criticisms remain non-public. The very fact that QC criticism will remain


confidential at least for 12 months can influence firms’ attitudes and may incentivize them to

cooperate. As of January 25, 2012, the PCAOB has issued 1,518 inspection reports. Of these

1,518 firms inspected by the PCAOB, only 107 firms (as of January 20, 2012) have had their

quality control (part II) criticisms made public. In other words, 1,411 (92.95 percent) firms have made satisfactory progress in addressing PCAOB’s concerns with regard to QC deficiencies or have had no QC criticisms to begin with.8

Null hypothesis 3 (H3) suggests that there is no statistically significant difference in the

number of monitoring deficiencies between peer and PCAOB QC firms. The coefficient for

the MONITOR variable is -1.844 and is statistically significant at the 0.0001 level. This indicates that

there is a significant difference in the number of monitoring deficiencies reported by PCAOB QC

reports and peer review reports. Interestingly, peer reports are more likely to identify a higher

number of monitoring deficiencies than PCAOB reports. Both Logit models strongly support the

proposition that PCAOB QC reports are more likely to contain a significantly higher number of

engagement performance deficiencies while the results are mixed for independence and monitoring

deficiencies. PCAOB QC reports are more likely to contain a significantly higher number of

independence deficiencies in model I while peer reports are more likely to contain a higher number

of monitoring deficiencies in model II.

8 Olson (2008) reports that firms have made good faith efforts to remediate QC defects. For example, firms have taken concrete actions to align partner compensation with audit quality. Some firms have created national or regional-level committees or offices to address audit quality issues related to fair value determination, client acceptance & retention, training and others (Olson 2008). Firms have also made changes to audit methodologies and procedures to address PCAOB concerns.

6. Regression results: An ordinary least-squares regression model was developed to investigate the relationship between engagement performance deficiencies and CPA firm characteristics: SEC clients per partner, number of audit offices, number of professional staff including partners, and the report type.

The multiple regression models used in this study are:

Model 1: ENGPERF = β0 + β1 Ln(PROSTAFF) + β2 OFFICES + β3 CLNT/PARTNER

+ β4 ReportType + ε

Model 2: Ln(1+ ENGPERF) = β0 + β1 Ln(PROSTAFF) + β2 Ln (OFFICES)

+ β3 Ln (1+CLNT/PARTNER) + β4 ReportType + ε
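Model 2 applies ln(1 + x) rather than ln(x) to the count variables so that zero counts stay defined while large counts are compressed. A quick illustration of the transform on hypothetical counts:

```python
# ln(1 + x) is defined at x = 0 (where ln(x) is not) and dampens large counts.
from math import log

engperf_counts = [0, 1, 3, 10]  # hypothetical deficiency counts
transformed = [round(log(1 + c), 3) for c in engperf_counts]
print(transformed)
```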

The null hypotheses used to test the multiple regression models are outlined as follows:

H1: No. of professional staff (PROSTAFF) has no significant effect on the number of

engagement performance deficiencies (ENGPERF)

H2: No. of audit offices (OFFICES) has no significant effect on the number of engagement

performance deficiencies (ENGPERF)

H3: No. of SEC clients per partner (CLNT/PARTNER) has no significant effect on the number of

engagement performance deficiencies (ENGPERF)

H4: Report Type (PCAOB =1; peer =0) has no significant effect on the number of engagement

performance deficiencies (ENGPERF)
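A single-regressor sketch shows how a slope such as β3 would be estimated; the paper's models generalize this to four regressors, and the data below are hypothetical.

```python
# Closed-form OLS for one regressor: slope = Sxy / Sxx, intercept = ybar - slope * xbar.
def ols_fit(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return my - slope * mx, slope  # (intercept, slope)

# Hypothetical firms: SEC clients per partner vs. EP deficiency counts
clients_per_partner = [0.5, 1.0, 2.0, 3.0, 4.5]
engperf = [1, 1, 2, 3, 4]
b0, b1 = ols_fit(clients_per_partner, engperf)
print(round(b0, 3), round(b1, 3))  # positive slope: more clients per partner, more defects
```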

[Insert Table 6 about here]

The Pearson correlation results are reported in Table 6. Correlations for variables in

regression models I and II are reported in Panels A and B of Table 6. The correlation analysis

results (see Panel A) indicate that the dependent variable (engagement performance deficiencies)

is significantly and positively related to the explanatory variable - “number of SEC clients per

partner.” This is in line with expectations, since working on many SEC clients at the same time

increases partner workload and audit deficiencies could flow from overwork. Engagement


performance deficiency is negatively related to the logarithm of number of professional staff and is

statistically significant. The independent variables – number of offices and number of professional staff – are positively and significantly correlated. Panel B results mirror the results of Panel A.
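The Table 6 correlations are plain Pearson coefficients. A pure-Python sketch follows; the two series are hypothetical stand-ins for "number of offices" and "number of professional staff":

```python
# Pearson correlation: covariance divided by the product of standard deviations.
from math import sqrt

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

offices = [1, 1, 2, 3, 5]   # hypothetical audit offices per firm
staff = [4, 6, 9, 14, 30]   # hypothetical professional staff per firm
print(round(pearson_r(offices, staff), 3))  # strongly positive, as in Panel A
```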

Because multicollinearity may be present in the data, diagnostic measures of collinearity are

obtained. Collinearity diagnostics are based on procedures recommended by Belsley et al.

(1980) who suggest that condition indexes in excess of 30 indicate moderate to strong

dependencies. The largest condition index observed in regression model I is 5.144 and the largest

condition index in model II is 6.169. Hence, the degree of collinearity present among independent

variables appears to be too small to degrade estimation results. Variance Inflation Factors (VIF)

were also calculated to identify multicollinearity problems. Despite a high correlation between

“number of audit offices” and “number of professional staff” (0.568), all of the VIF values are

equal to or less than 1.783 (in model I) and 1.416 (in model II). According to Myers (1990), VIF

values less than 10 are not a concern.
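With exactly two predictors, the VIF for either one reduces to 1/(1 − r²), where r is their Pearson correlation. The paper's models carry additional regressors, so its reported VIFs (up to 1.783) come from full auxiliary regressions, but the two-variable case shows the mechanics, here applied to the 0.568 correlation reported above:

```python
# VIF for one of two predictors: 1 / (1 - r^2), where r is the correlation
# between the two predictors (two-predictor special case only).
def vif_two_predictors(r):
    return 1 / (1 - r ** 2)

print(round(vif_two_predictors(0.568), 3))  # well below the rule-of-thumb cutoff of 10
```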

Combining these explanatory variables in a multivariate model enables us to study their

relative explanatory power. Also a multivariate model could lead to better predictions since we use

the information in these cross-correlations. An ordinary least-squares regression model was

developed to investigate the relationship between engagement performance deficiencies

(ENGPERF) and number of SEC clients per partner, number of professional staff and number of

audit offices. Regression methodology permits the testing of these null hypotheses simultaneously. ENGPERF was the dependent variable and the explanatory variables mentioned earlier were used as independent variables. The regression coefficients, t-statistics, and significance levels for the two models are reported in Table 7, columns I and II. The two


multiple regression models have adjusted R-squared values of 19.3 and 15.8 percent. The model F-

statistic for both regressions is highly significant.

[Insert Table 7 about here]

The model I regression results (in column I of table 7) indicate that “clients per partner”

variable is statistically significant (t=3.79) at the 1 percent level. The regression coefficient is

positive. This result is quite intuitive in that the larger the number of audits a partner supervises, the greater is the probability that engagement deficiencies will be higher. Because SEC audits are remarkably complex and many accounting issues involve a considerable amount of judgment, audit

partners are stretched thin. Thus, the null hypothesis that the number of SEC clients per partner

has no significant impact on engagement performance deficiencies (identified by PCAOB) is

rejected. In regression I, the other two explanatory variables are not statistically significant.

The model II regression results (in column II of table 7) indicate that the “clients per

partner” variable is again statistically significant (t=6.74) at the 1 percent level. The regression

coefficient is positive. This result is even stronger than the model I result and the null hypothesis

that the number of SEC clients per partner has no significant impact on engagement performance

deficiencies (identified by PCAOB) is easily rejected.

The “Ln of Number of professional staff” variable has a large t-statistic (t = -5.61) at a

0.01 level of significance and the coefficient is negative. If a higher number of professional staff

work on an audit engagement, it is more likely that a lower number of engagement deficiencies

will occur. Thus, the null hypothesis that the number of professional staff has no significant

impact on engagement performance deficiencies is rejected. The “Number of audit offices”

variable also has a large t-statistic (t = 3.60) at a 0.01 level of significance. Thus, the null

hypothesis that the number of audit offices has no significant impact on engagement


performance deficiencies is also rejected. The “ReportType” variable has a large t-statistic (14.93)

at a 0.01 level of significance. PCAOB report type is positively and significantly associated with

the number of engagement performance deficiencies.

[Insert Table 8 about here]

Following Lennox and Pittman (2010), a logarithmic regression model is estimated in

order to address heteroskedasticity concerns, if any. The model estimated is: Ln(1+ ENGPERF)

= β0 + β1 Ln(PROSTAFF) + β2 Ln (OFFICES) + β3 Ln (1+CLNT/PARTNER) + β4

ReportType + ε and the results are reported in Table 8. Table 8 results are identical to Table 7

results and the same variables are significant and hence not discussed further.

[Insert Table 9 about here]

In a new Logit model (see Table 9), I investigate if there is a difference in the number of

engagement performance, independence and monitoring deficiencies in AICPA peer review

reports issued before August 26, 2004 (early reports) and after August 25, 2004 (later reports).

August 26, 2004 is the date on which the PCAOB issued its earliest inspection reports on its

examination of the big 4 firms’ audit practices. These four initial inspection reports attracted a

good deal of attention in the financial press (Farrell and Shadab 2005; Carlino 2005; Victor and

Lavitin 2004). The sample size is 2,355 – 1,395 early reports (peer review reports issued before 8/26/2004) and 860 later reports (peer review reports issued after 8/25/2004). Null hypothesis 1

(H1) suggests that there is no statistically significant difference in the number of independence

related deficiencies between early reports and later reports. The coefficient for the INDEP variable is

0.420 (table 9) and is statistically significant at the 5 percent level. This indicates that early

reports had a significantly higher number of independence related deficiencies than later reports.

Null hypothesis 2 (H2) suggests that there is no statistically significant difference in the number of


engagement performance deficiencies between early reports and later reports. The coefficient for

the ENGPERF variable is 0.298 and is statistically significant at the .0001 level. Early peer review reports contain a significantly higher number of engagement performance deficiencies than later

reports. These results suggest that after the PCAOB started issuing inspection reports, small firms

have improved their audit quality, or the small firms have learned how peer reviews work,

what the reviewers look for, and learned how to get better review reports over time. It could also

mean peer reviews themselves have become less rigorous. Anantharaman (2012) reports that there

are trends in PCAOB inspection reports and a similar trend (later reviews are more favorable)

could be occurring in peer reviews. In a similar vein, Lennox and Pittman (2010) also observe that

the information content of the peer reviews fell with the advent of PCAOB inspections.

7. Discussion, Summary and Limitations:

The PCAOB offers carrots (non-disclosure of QC criticisms) to auditors in order to

encourage them to improve audit quality. QC criticisms are kept confidential for 12 months and

if deficiencies are remediated to the satisfaction of the Board, those criticisms are never made

public. Niemeier (see Johnson 2007) states that the initial non-disclosure of QC deficiencies

“reflects a legislative judgment that reports should be used as a tool to drive quality

improvements inside firms as opposed to a ratings system for public use.” This confidentiality

arrangement is quite a powerful tool and it can incentivize firms to improve QC practices. Lombarts and Klazinga (2003) suggest that medical doctors in the Netherlands were initially rather reluctant to

share the results of “visitatie” (peer review through site visits). They argue that the eventual

success of medical peer review among specialists in the Netherlands is mainly due to the

confidential status of the results and “the fact that the visitaties are aimed at ‘getting better’

instead of eliminating the negative outliers.” Something similar could be going on with PCAOB


QC criticisms. Since the PCAOB inspectors know that their criticisms will be kept confidential

and that the firms have 12 months to remediate, they may be more thorough, find more QC deficiencies, and offer more criticisms. Since the QC criticisms are kept confidential, firms may cooperate with the Board and remediate as needed.9

PCAOB inspections use a risk-based approach to select audits for inspections. This

means audits in higher risk areas (such as fair value issues, revenue recognition, estimates,

judgment related issues, professional skepticism, etc.) are selected for inspection. Since the

inspectors are looking at areas where audit problems are likely, they are more prone to report a

higher number of defects. The PCAOB approach to inspection is supervisory in nature. The

Board does not follow a purely regulatory approach which would mean just testing for

compliance with regulations (Olson 2008). A purely enforcement-based approach would have

investigation rather than inspection as its goal. Olson (2008) argues that “the supervisory model

is designed to identify areas for improvement, with a goal of raising the bar of practices in the

profession in the interest of enhancing the quality of auditing overall. . . . Under the PCAOB’s

model, firms are expected to initiate and take ownership of efforts to improve audit quality.”

Hence, inspectors are prone to identify more areas for improvement (QC deficiencies) rather than fewer.

The PCAOB inspectors are thought to be more independent than peer reviewers. When inspectors are more independent, they can be more rigorous in their inspections and they have no reason to look over their shoulders.10

9 Newman and Oliverio (2010) report that in a survey of auditors who were triennially inspected by the PCAOB, respondents agreed that the PCAOB inspection is necessary and effective. Those firms who had deficiencies in their first PCAOB inspection report were able to improve their audit processes and obtained no-deficiency reports in their second inspection.

10 Gradison and Boster (2010) argue that PCAOB inspections help auditors to be professionally skeptical and have made it somewhat easier for them to resist client pressures.

Palmrose (2006) suggests that the PCAOB traded experience for perceived independence when it hired its own batch of inspectors. However,

experience levels for inspectors have increased from 2004 to 2007. In 2004, the PCAOB

inspectors had an average of 12 years of audit experience. In 2007, PCAOB inspectors for large

firms had a mean of 15 years of relevant experience while the inspection team leaders had an

average of more than 23 years of experience (PCAOB 2007). With more experience and a high

degree of independence, inspectors can be expected to find more defects. PCAOB inspectors

pick which audit files to inspect. In contrast, firms could pick their own peer reviewer under the

AICPA peer review program. Fogarty (1996) suggests that this allowed some firms to pick

“friendly” reviewers. Russell and Armitage (2006) report that about a fifth of the survey

respondents in their study indicated that the firm’s peer reviewer had permitted the reviewed firm

to self-select the audit files for peer review. Hence, when PCAOB inspectors choose audit files

for inspection, one would expect them to find more QC deficiencies than peer reviewers.

The PCAOB has enforcement powers as well.11 It appears that the PCAOB inspections

have indeed gotten tougher. Findings in this study suggest that PCAOB QC reports contain a

significantly higher number of EP deficiencies than peer review reports. In 2004, former SEC

Chief Accountant Lynn Turner suggested that “the PCAOB has to use these inspections to drive

changes in the [auditing] rules and, quite frankly, get tough on enforcement.” [Securities Law

Daily, August 27, 2004]. SOX (2002) gave the PCAOB the responsibility to “conduct

investigations and disciplinary proceedings concerning, and imposing appropriate sanctions

where justified upon, registered public accounting firms and associated persons of such firms”

(U.S. House of Representatives, Committee on Financial Services 2002).

11 In 2012, the PCAOB sanctioned a big 4 firm with a penalty of $2 million for failing to properly evaluate a pharmaceutical client’s accounting for sales returns and for not exercising professional skepticism. It had earlier sanctioned another big 4 firm for $1 million for revenue recognition related issues. A detailed discussion on PCAOB enforcement is available in Messier Jr. et al. (2010) and Gilbertson and Herron (2009).

The PCAOB can interview audit committee members and can access any client documentation it needs. Since

the PCAOB can impose costly sanctions such as fines, barring auditors from public company

audits in the future and referral to the Justice Department in case of potential criminal violations,

auditors have incentives to improve audit quality in order to avoid such negative consequences

(see DeFond 2010). After summarizing 25 years of auditing deregulation and re-regulation,

Kinney (2005) warns: “… it seems improbable that relatively small private audit firm partnerships hired by huge corporations as ‘public watchdogs’ can survive under the present regulatory and legal regime that has multiple sources of short-term financial ruin even if auditors ‘bark’ when appropriate.” A sobering thought indeed!

Summary: While several studies have examined the PCAOB inspection reports (part I) and/or

AICPA peer review reports, studies examining deficiencies in part II – the quality control section

of the PCAOB report are rare. This study examines the weaknesses identified in the quality

control section (that have been made public by the PCAOB after a lack of progress within 12

months by the firms) and compares them to the quality control weaknesses identified in AICPA

peer review reports – an apples-to-apples comparison. Both univariate and multivariate tests

indicate that PCAOB QC reports disclose a significantly higher number of engagement

performance (EP) deficiencies than peer review reports. Statistical tests also indicate that the

PCAOB QC reports disclose a significantly higher number of independence deficiencies than

peer review reports. Results are mixed for monitoring deficiencies. OLS regression results

indicate that the engagement performance deficiencies are positively and significantly associated

with the number of clients per partner. The higher the partner workload, the higher the number

of EP weaknesses identified. The number of professional staff is negatively associated with the


number of deficiencies. This study incrementally contributes to the discussion on the PCAOB

QC inspection reports and has implications for auditors and regulators.

Readers should be mindful of the limitations of this study. The sample size of 106

PCAOB QC reports is small and some caution is warranted in generalizing the results about

quality control processes at these firms. This peer review/PCAOB inspection research suffers from relying on simple counts of weaknesses; some deficiencies may be more severe than others. If the PCAOB or the firms grant access to more data, future research could benefit from it. This research has examined data related to small audit firms only. I have not yet included time dummies as control variables in my regressions; this will be done in May 2012 as I continue work on this paper. When more QC criticisms about large firms become publicly

available, additional investigations are possible. Meanwhile, the PCAOB inspection process can

be used as an early warning signal and it can become a decent method of lifelong learning, if

managed well and if the inspected firms and the inspectors both have the right perspective.

Acknowledgement: I am indebted to Professor Clive Lennox for giving me access to his

AICPA peer review dataset. The PCAOB quality control inspection report dataset was hand-collected from the PCAOB website. I am thankful to Xiaoyan Wu for her able assistance with

data collection from the PCAOB website.


References

Abbott, L. J., K. Gunny, and T. Zhang. 2008. When the PCAOB Talks, Who Listens? Evidence

from Client Firm Reaction to Adverse, Seriously Deficient PCAOB Inspection Reports. Working

paper presented at the 2008 AAA Annual Meeting.

Anantharaman, D. 2012. Comparing self-regulation and statutory regulation: Evidence from the

accounting profession. Accounting, Organizations and Society doi:10.1016/j.aos.2011.12.003

Bedard, J. C., K. Johnston and E. Smith. 2010. Audit Quality Indicators: A Status Update on

Possible Public Disclosures and Insights from Audit Practice. Current Issues in Auditing 4: C12-

C19.

Belsley, D. A., E. Kuh, and R. E. Welsch. 1980. Regression Diagnostics: Identifying Influential Data and Sources of Collinearity. New York: Wiley.

Carlino, B. 2005. Groundhog day for the Big 4? Accounting Today (July 27): 6.

Casterella, J. R., Jensen, K. L., & Knechel, R. W. 2009. Is self-regulated peer review effective at

signaling audit quality? The Accounting Review, 84(3): 713–735.

Colbert, G. J., and T. B. O’Keefe. 1995. Compliance with GAAS reporting standards: Evidence

from a positive enforcement program. Auditing: A Journal of Practice and Theory 14 (Fall):

1-16.

Croteau, B.T. 2011. The Role of the Audit Performance Feedback Loop in Audit Policy

Decision-Making. Remarks before the 2011 AICPA National Conference on Current SEC and

PCAOB Developments, December 5, 2011.

Daugherty, B., and Tervo, W. 2010. PCAOB inspections of smaller CPA firms: The

perspectives of inspected firms. Accounting Horizons, 24(2), 189–219.

DeFond, M.L. 2010. How should the auditors be audited? Comparing the PCAOB Inspections

with the AICPA Peer Reviews. Journal of Accounting & Economics 49: 104-108.

Farrell, J. and H. Shadab. 2005. The focus of future PCAOB inspections. The CPA Journal

(September): 9.

Fogarty, T. J. 1996. The imagery and reality of peer review in the US: Insights from institutional

theory. Accounting, Organizations and Society, 21(2/3), 243–267.

Francis, J. 2011. A Framework for Understanding and Researching Audit Quality. Auditing: A

Journal of Practice and Theory (May), Vol. 30, No. 2, pp. 125-152.

Gilbertson, D., and Herron, T. 2009. PCAOB Inspections: A Review of the First Three Years.

Current Issues in Auditing, Vol. 3, Issue 2, A15-A34.


Glover, S. M., D. F. Prawitt, and M. H. Taylor. 2009. Audit standard setting and inspection for US public companies: A critical assessment and recommendations for fundamental change. Accounting Horizons 23 (2): 221-237.

Gradison, B., and R. Boster. 2010. The PCAOB's First Seven Years: A Retrospection. Current Issues in Auditing 4 (1): A9-A20.

Gunny, K., and T. Zhang. 2009. PCAOB inspection reports and audit quality. Unpublished manuscript.

Gunny, K., G. Krishnan, and T. Zhang. 2009. Is audit quality associated with auditor tenure, industry expertise and fees? Evidence from PCAOB opinions. Unpublished manuscript.

Hermanson, D. R., R. W. Houston, and J. C. Rice. 2007. PCAOB Inspections of Smaller CPA Firms: Initial Evidence from Inspection Reports. Accounting Horizons 21 (2): 137-152.

Hermanson, D. R., and R. W. Houston. 2008. Quality control defects revealed in smaller firms' PCAOB inspection reports. The CPA Journal 78 (12): 36-40.

Hilary, G., and C. Lennox. 2005. The Credibility of Self-regulation: Evidence from the Accounting Profession's Peer Review Program. Journal of Accounting and Economics 40: 211-229.

Johnson, S. 2007. A behind-the-scenes look at the accounting oversight board's auditor inspections reveals why they're so hush-hush. CFO.com (January 26).

Kinney, W. R., Jr. 2005. Twenty-five years of audit de-regulation and reregulation: What does it mean for 2005 and beyond? Auditing: A Journal of Practice and Theory (Supplement): 89-109.

Landis, M., S. Jerris, and M. Brasswell. 2011. An Account Analysis of PCAOB Inspection Reports For Triennially-Inspected Audit Firms. Journal of Business & Economics Research 9 (3): 11-18.

Lennox, C., and J. Pittman. 2010. Auditing the auditors: Evidence on the recent reforms to the external monitoring of audit firms. Journal of Accounting and Economics 49: 84-103.

Lombarts, M. J. M. H., and N. S. Klazinga. 2003. Inside self-regulation: peer review (visitatie) by Dutch medical specialists. Clinical Governance: An International Journal 8 (4): 318-330.

Messier, W. F., Jr., T. M. Kozloski, and N. Kochetova-Kozloski. 2010. An analysis of SEC and PCAOB enforcement actions against engagement quality reviewers. Auditing: A Journal of Practice & Theory (November): 233-252.

Myers, R. 1990. Classical and modern regressions with applications. 2nd ed. Boston, MA: Duxbury.


Newman, B., and M. Oliverio. 2010. PCAOB Triennial Inspections of Small Firms. The CPA Journal. New York State Society of CPAs.

Offermanns, M., and E. Peek. 2011. Investor Reactions to PCAOB Inspection Reports. Working paper, Maastricht University.

Olson, M. W. 2008. The PCAOB Supervisory Approach and Current Market Challenges. Remarks before the 2008 AICPA National Conference on Current SEC and PCAOB Developments (December 8).

Palmrose, Z.-V. 2006. Maintaining the value and viability of auditors as gatekeepers under SOX: An auditing master proposal. Working paper, University of Southern California.

Public Oversight Board (POB). 2002. The road to reform: A white paper from the Public Oversight Board on legislation to create a new private sector regulatory structure for the accounting profession. Stamford, CT: POB (March 19).

Public Company Accounting Oversight Board (PCAOB). 2007. The Strategic Plan 2007-2012. Washington, D.C.: PCAOB.

Public Company Accounting Oversight Board (PCAOB). 2007. Annual Report. Washington, D.C.: PCAOB.

Russell, J., and J. Armitage. 2006. Peer review effectiveness: An analysis of potential loopholes within the USA Peer Review Program. Managerial Auditing Journal 21 (1): 46-62.

Travis, G. D. L., and H. M. Collins. 1991. New light on the old boys: Cognitive and institutional particularism in the peer review system. Science, Technology and Human Values 16 (3): 322-341.

Turner, L. E. 2000. Shifting paradigms in self-regulation. Speech at the 27th Anniversary of the Securities Regulation Institute (January 27).

U.S. House of Representatives, Committee on Financial Services. 2002. Sarbanes-Oxley Act of 2002. Public Law No. 107-204. Washington, D.C.: Government Printing Office.

Victor, G., and M. Lavitin. 2004. Current SEC and PCAOB developments. The CPA Journal (September): 26-30.


Table 1: Quality Control Weaknesses related to Engagement Performance
identified in PCAOB Inspection Reports (Part II)

Technical Competence, Due Care, Professional Skepticism         61
Appropriate Procedures                                          23
Communications with predecessor auditors/client acceptance       2
Concurring Partner Review                                       45
Fraud Procedures                                                16
Engagement Completion Document                                  13
Related Party Transactions                                       1
Financial Statement Disclosures                                  3
Testing appropriate to audit                                    15
Consultation                                                     1
Partner workload                                                 8
Competency of Engagement Team                                    1
Audit Risk & Materiality                                         1
Planning, performance, supervision, review                       1
Auditor Communication with Audit Committee                      45
Retention of Records                                             1
Documentation                                                    6
Review of interim information                                    1
Subsequent discovery of facts                                    1
Total no. of EP deficiencies in 106 Inspection Reports         245


Table 2: Descriptive Statistics for AICPA Peer Review Opinions (N=2,355)

Panel A: Number of firms by opinion type and number of weaknesses

No. of deficiencies      Unmodified    Modified    Adverse    No. of weaknesses
0                              1237           0          0                    0
1                               533          10          0                  543
2                               294          12          0                  612
3                               133          16          0                  447
4                                51          15          1                  268
5                                23           8          2                  165
6                                 7           1          2                   60
7                                 1           2          1                   28
8                                 1           0          2                   24
9                                 0           0          3                   27
Total                          2280          64         11                 2174
Mean no. of weaknesses        0.832       3.156       6.91

Note: 2,174 weaknesses include all 6 categories of weaknesses: Independence, Integrity, and Objectivity; Personnel Management; Acceptance and Continuance of Clients and Engagements; Engagement Performance; Monitoring; and Compliance with membership requirements of SECPS.
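The Panel A figures are internally consistent: the last column equals the firm counts weighted by the number of deficiencies, and each opinion-type mean is that weighted sum divided by the column's firm total. A quick arithmetic check using the counts reported above:

```python
# Firm counts for 0-9 weaknesses, by opinion type (Table 2, Panel A).
unmodified = [1237, 533, 294, 133, 51, 23, 7, 1, 1, 0]
modified   = [0, 10, 12, 16, 15, 8, 1, 2, 0, 0]
adverse    = [0, 0, 0, 0, 1, 2, 2, 1, 2, 3]

def mean_weaknesses(counts):
    """Weighted mean: total weaknesses divided by number of firms."""
    return sum(k * c for k, c in enumerate(counts)) / sum(counts)

total = sum(k * (u + m + a) for k, (u, m, a)
            in enumerate(zip(unmodified, modified, adverse)))
print(total)                                   # 2174, matching the table
print(round(mean_weaknesses(unmodified), 3))   # 0.832
print(round(mean_weaknesses(modified), 3))     # 3.156
print(round(mean_weaknesses(adverse), 2))      # 6.91
```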

Panel B: Type of weaknesses disclosed in PCAOB QC Reports

Weakness Type                                Number    % of weaknesses
Independence, integrity and objectivity          24               8.19
Personnel Management                              3               1.02
Client Acceptance                                 0               0.00
Engagement Performance                          245              83.62
Monitoring                                       21               7.17


Table 3: DESCRIPTIVE STATISTICS AND UNIVARIATE TEST RESULTS
(AICPA peer review: N=2,355; PCAOB QC Inspection: N=106)

                              STANDARD
VARIABLES           MEAN      DEVIATION    T-SCORES
ENGPERF  -Peer      0.62        0.912
         -PCAOB     2.31        1.410      12.255**
INDEP    -Peer      0.05        0.221
         -PCAOB     0.23        0.420       4.394**
MONITOR  -Peer      0.10        0.309
         -PCAOB     0.20        0.400       2.430*
SECCLNT  -Peer      4.76       10.157
         -PCAOB     9.80       11.819       4.311**
OFFICE   -Peer      2.11        3.309
         -PCAOB     1.45        1.640      -3.760**
PARTNER  -Peer      9.78       19.007
         -PCAOB     3.53        5.557      -9.252**
PROSTAFF -Peer     45.77      104.688
         -PCAOB    21.65       82.184      -2.337*
------------------------------
* - two-tailed significance at < 0.05 level.
** - two-tailed significance at < 0.01 level.

PCAOB firm = 1 and Peer review firm = 0
ENGPERF = # of Engagement Performance deficiencies
INDEP = # of Independence deficiencies
MONITOR = # of Monitoring deficiencies
SECCLNT = Number of SEC clients
OFFICE = Number of audit offices
PARTNER = Number of partners, proprietors and shareholders
PROSTAFF = Number of professional staff including partners
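The reported t-scores are consistent with Welch's unequal-variance two-sample t-test applied to the group summary statistics (the paper does not name the test variant, so this is an inference). A minimal sketch recomputing two of the Table 3 statistics from the published means, standard deviations, and sample sizes:

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t-statistic for two groups with unequal variances,
    computed from summary statistics only."""
    se = math.sqrt(sd1 ** 2 / n1 + sd2 ** 2 / n2)
    return (mean2 - mean1) / se

# ENGPERF: peer-review firms (N=2,355) vs. PCAOB-inspected firms (N=106)
t_engperf = welch_t(0.62, 0.912, 2355, 2.31, 1.410, 106)
# SECCLNT: number of SEC clients in the two groups
t_secclnt = welch_t(4.76, 10.157, 2355, 9.80, 11.819, 106)
print(round(t_engperf, 2), round(t_secclnt, 2))  # ≈ 12.23 and 4.32
```

The small differences from the reported 12.255 and 4.311 are consistent with rounding in the published summary figures.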


Table 4: DESCRIPTIVE STATISTICS AND UNIVARIATE TEST RESULTS
(AICPA peer review: N=75; PCAOB QC Inspection: N=106)

                              STANDARD
VARIABLES           MEAN      DEVIATION    T-SCORES
ENGPERF  -Peer      1.83        1.339
         -PCAOB     2.31        1.410       2.326*
INDEP    -Peer      0.21        0.473
         -PCAOB     0.23        0.420       0.196
MONITOR  -Peer      0.56        0.551
         -PCAOB     0.20        0.400      -4.852**
SECCLNT  -Peer      4.89       11.078
         -PCAOB     9.80       11.819       2.754**
OFFICE   -Peer      1.39        0.892
         -PCAOB     1.45        1.640       0.268
PARTNER  -Peer      3.71        4.502
         -PCAOB     3.53        5.557      -0.226
PROSTAFF -Peer     12.14       16.229
         -PCAOB    21.65       82.184       0.928
------------------------------
* - two-tailed significance at < 0.05 level.
** - two-tailed significance at < 0.01 level.

PCAOB firm = 1 and Peer review firm = 0
ENGPERF = # of Engagement Performance deficiencies
INDEP = # of Independence deficiencies
MONITOR = # of Monitoring deficiencies
SECCLNT = Number of SEC clients
OFFICE = Number of audit offices
PARTNER = Number of partners, proprietors and shareholders
PROSTAFF = Number of professional staff including partners


TABLE 5: LOGIT ANALYSIS RESULTS a

                            I              II
                         N=2,461         N=181
CONST                    -4.584          0.050
                       (542.62)**       (0.024)
INDEP                     1.287          0.299
                        (24.18)**       (0.637)
ENGPERF                   1.060          0.436
                       (152.10)**       (9.89)**
MONITOR                  -0.403         -1.844
                         (1.79)        (24.52)**
-2 Log Likelihood        661.39         210.64
Nagelkerke R-squared      0.277          0.236

a The dependent variable is dichotomous: PCAOB firm = 1 and Peer review firm = 0.
Wald Chi-Square statistic in parentheses.
ENGPERF = # of Engagement Performance deficiencies
INDEP = # of Independence deficiencies
MONITOR = # of Monitoring deficiencies
*Statistically significant at 5% level
**Statistically significant at 1% level
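Table 5's estimates come from a logit model with report type as the outcome. The sketch below shows only the estimation mechanics, on synthetic data with plain gradient ascent; it is not the paper's sample or software. The recovered slope's positive sign mirrors the table's positive ENGPERF effect.

```python
import math
import random

def fit_logit(X, y, lr=0.5, iters=1500):
    """Maximum-likelihood logistic regression via batch gradient ascent.
    w[0] is the intercept; w[1:] are slopes. A bare-bones stand-in for
    a statistics package, adequate for a small illustration."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(iters):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = yi - 1.0 / (1.0 + math.exp(-z))  # observed minus fitted prob.
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj + lr * g / len(y) for wj, g in zip(w, grad)]
    return w

random.seed(1)
# Hypothetical data: firms with more engagement-performance deficiencies
# are more likely to be in the "PCAOB firm = 1" group (true slope 1.0).
X = [[random.randint(0, 5)] for _ in range(300)]
y = [1 if random.random() < 1 / (1 + math.exp(2 - x[0])) else 0 for x in X]
w = fit_logit(X, y)  # w[1] should recover a clearly positive slope
```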


Table 6: Pearson Correlation Coefficients

Panel A: Regression model I variables (N=106)

ENGPERF LnPROSTAFF OFFICES CLNT/PART

ENGPERF 1.000

LnPROSTAFF -0.322 1.000

OFFICES -0.148 0.568 1.000

CLNT/PART 0.437 -0.407 -0.121 1.000

Explanation of the variables:

ENGPERF = No. of Engagement Performance deficiencies

LnPROSTAFF = Ln (No. of professional staff including partners)

OFFICES = Number of audit offices

CLNT/PART = Number of SEC clients per Partner

Panel B: Regression model II variables (N = 2,282)

ENGPERF LnPROSTAFF OFFICES CLNT/PART ReportType

ENGPERF 1.000

LnPROSTAFF -0.180 1.000

OFFICES -0.006 0.482 1.000

CLNT/PART 0.236 -0.258 -0.075 1.000

ReportType 0.352 -0.184 -0.042 0.243 1.000

Explanation of the variables:

ENGPERF = No. of Engagement Performance deficiencies

LnPROSTAFF = Ln (No. of professional staff including partners)

OFFICES = Number of audit offices

CLNT/PART = Number of SEC clients per Partner

ReportType = PCAOB =1, Peer = 0


TABLE 7: MULTIPLE REGRESSION RESULTS a

                            I              II
CONSTANT                  2.275          0.817
                         (8.40)**      (15.75)**
LnPROSTAFF               -0.180         -0.098
                        (-1.434)       (-5.61)**
OFFICES                  -0.007          0.025
                        (-0.074)        (3.60)**
CLNT/PARTNER              0.069          0.037
                         (3.79)**       (6.74)**
ReportType                               1.433
                                       (14.93)**
N                           106          2,282
Adj. R-squared            0.193          0.158
Model F                    9.35         108.36

a The dependent variable is No. of Engagement Performance deficiencies
*Statistically significant at 5% level; **Statistically significant at 1% level
ENGPERF = # of Engagement Performance deficiencies
LnPROSTAFF = Ln (# of professional staff including partners)
OFFICES = Number of audit offices
CLNT/PARTNER = Number of SEC clients per Partner
ReportType = PCAOB = 1, Peer = 0


TABLE 8: MULTIPLE REGRESSION RESULTS a

                            I              II
CONSTANT                  0.988          0.341
                         (7.92)**      (11.83)**
LnPROSTAFF               -0.051         -0.049
                        (-1.265)       (-5.18)**
LnOFFICES                 0.018          0.121
                         (0.124)        (4.67)**
LnCLNT/PARTNER            0.150          0.108
                         (3.52)**       (6.61)**
ReportType                               0.617
                                       (12.79)**
N                           106          2,282
Adj. R-squared            0.192          0.136
Model F                    8.09          90.72

a The dependent variable is Ln (No. of Engagement Performance deficiencies)
*Statistically significant at 5% level; **Statistically significant at 1% level
LnENGPERF = Ln (1 + # of Engagement Performance deficiencies)
LnPROSTAFF = Ln (# of professional staff including partners)
LnOFFICES = Ln (Number of audit offices)
LnCLNT/PARTNER = Ln (1 + # of SEC clients per Partner)
ReportType = PCAOB = 1, Peer = 0
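Because ReportType is a 0/1 dummy, its OLS coefficient in a stripped-down version of Table 8's model is just the difference in mean Ln(1 + ENGPERF) between PCAOB-inspected and peer-reviewed firms. A minimal sketch on hypothetical counts (the actual Table 8 model also includes the LnPROSTAFF, LnOFFICES, and LnCLNT/PARTNER controls):

```python
import math

def ols_simple(x, y):
    """Closed-form OLS of y on a single regressor x, with intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sxy / sxx        # slope
    a = my - b * mx      # intercept
    return a, b

# Hypothetical mini-sample (NOT the paper's data): four peer-review
# firms (ReportType = 0) and four PCAOB-inspected firms (ReportType = 1).
report_type = [0, 0, 0, 0, 1, 1, 1, 1]
engperf     = [0, 1, 0, 2, 2, 3, 1, 4]            # deficiency counts
ln_engperf  = [math.log(1 + e) for e in engperf]  # Ln(1 + ENGPERF), as in Table 8
a, b = ols_simple(report_type, ln_engperf)
# With a dummy regressor, b equals mean(y | 1) - mean(y | 0).
print(round(b, 4))  # 0.7489 here
```

In this toy sample b is exactly the gap between the two group means of Ln(1 + ENGPERF); Table 8's ReportType coefficient (0.617) carries the analogous interpretation after conditioning on the size controls.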


TABLE 9: LOGIT ANALYSIS RESULTS a

                            I              II
                         N=2,355        N=2,355
CONST                     0.169          0.081
                        (10.82)**       (2.353)
INDEP                     0.420          0.327
                         (3.85)*        (2.114)
ENGPERF                   0.298          0.205
                        (32.93)**      (13.79)**
MONITOR                   0.133         -0.137
                         (0.82)         (0.738)
PERSMGMT                                 1.109
                                       (16.73)**
CLNTACCPT                                0.130
                                        (0.062)
COMPLIANCE                               2.957
                                       (50.55)**
-2 Log Likelihood       3135.54        2966.94
Nagelkerke R-squared      0.027          0.118

a The dependent variable is dichotomous: 1 = Peer review before 8/26/04 and 0 = Peer review after 8/25/04. 8/26/04 is the date on which the PCAOB issued its earliest inspection reports. (Wald Chi-Square statistic in parentheses)
ENGPERF = # of Engagement Performance deficiencies
INDEP = # of Independence deficiencies
MONITOR = # of Monitoring deficiencies
PERSMGMT = # of Personnel Management deficiencies
CLNTACCPT = # of Client Acceptance deficiencies
COMPLIANCE = # of Compliance (SECPS) deficiencies
*Statistically significant at 5% level
**Statistically significant at 1% level


Appendix A: Examples of Quality Control Criticisms (on Engagement Performance) made public by PCAOB

I. [PCAOB Release No. 104-2010-043A; March 31, 2010]

a. Technical Competence, Due Care, and Professional Skepticism
The Firm's system of quality control appears not to do enough to ensure technical competence and the exercise of due care or professional skepticism.

b. Fraud Procedures
The Firm's system of quality control appears not to provide sufficient assurance that the Firm will perform all of the required procedures in accordance with the provisions of AU 316, Consideration of Fraud in a Financial Statement Audit. Specifically, the Firm did not perform audit procedures to test journal entries and other adjustments for evidence of possible material misstatements due to fraud. [Issuers A and B]

c. Engagement Completion Document
The Firm's system of quality control appears not to provide sufficient assurance that the Firm will prepare an engagement completion document in accordance with AS No. 3, which is necessary to demonstrate that the work performed by engagement personnel addresses the significant findings and issues of the engagement. [Issuers A and B]

d. Auditor Communications
The Firm's system of quality control appears not to provide sufficient assurance that the required auditor communications to the audit committee, or equivalent, occur and are appropriately documented. [Issuers A and B]

e. Appropriate Procedures
The Firm's system of quality control appears not to provide sufficient assurance that the Firm will conduct all testing appropriate to a particular audit. The information reported by the inspection team suggests an apparent pattern of failures to perform the appropriate procedures related to the testing of revenue [Issuers A and B] and equity transactions. [Issuer B]

II. [PCAOB Release No. 104-2009-059A; April 27, 2009]

a. Technical Competence, Due Care, and Professional Skepticism
The Firm's system of quality control appears not to do enough to ensure technical competence and the exercise of due care or professional skepticism.

b. Fraud Procedures
The Firm's system of quality control appears not to provide sufficient assurance that the Firm will perform all required procedures in accordance with the provisions of AU 316, Consideration of Fraud in a Financial Statement Audit. Specifically, the Firm did not perform audit procedures to test journal entries and other adjustments for evidence of possible material misstatements due to fraud [Issuers A and B], and other than sign-offs on an audit program and notations on a fraud risk information form, there was no evidence in the audit documentation, and no persuasive other evidence, that the Firm inquired of management and the audit committee, or others with equivalent authority and responsibility, as to their views about the risk of material misstatement due to fraud [Issuers A and B], or conducted a brainstorming session with members of the engagement team. [Issuer A]

c. Auditor Communications
The Firm's system of quality control appears not to provide sufficient assurance that the Firm will provide to the audit committee, or equivalent, the independence confirmations required by Independence Standards Board Standard No. 1, Independence Discussions with Audit Committees. [Issuers A and B]

III. [PCAOB Release No. 104-2011-022A; December 16, 2010]

a. Testing Appropriate to the Audit
The Firm's system of quality control appears not to provide sufficient assurance that the Firm will conduct all testing appropriate to a particular audit, specifically with respect to the following issues:

(i) Valuation and Useful Lives of Intangible Assets
As discussed above, in the one audit reviewed, the inspection team identified a significant deficiency related to the Firm's audit procedures with respect to the valuation and useful lives of intangible assets. This information provides cause for concern regarding the Firm's quality control policies and procedures related to its audit procedures for testing intangible assets. [Issuer A]

(ii) Revenue
As discussed above, in the one audit reviewed, the inspection team identified a significant deficiency related to the Firm's audit procedures with respect to revenue. This information provides cause for concern regarding the Firm's quality control policies and procedures related to its auditing of revenue. [Issuer A]

(iii) Inventory
As discussed above, in the one audit reviewed, the inspection team identified a significant deficiency related to inventory valuation and cutoff. This information provides cause for concern regarding the Firm's quality control policies and procedures related to its auditing of inventory. [Issuer A]

b. Audit Documentation
The Firm's system of quality control appears not to provide sufficient assurance that the Firm will comply with the audit documentation and retention rules set forth in AS No. 3, in that the Firm's policies and procedures do not address the requirement that a complete and final set of audit documentation be assembled for retention not more than 45 days after the report release date. For the engagement reviewed, the inspection team noted that work papers related to auditing of revenue were not included in the audit documentation. [Issuer A]

c. Concurring Partner Review
The Firm's system of quality control appears not to provide sufficient assurance that the Firm will comply with applicable standards regarding the performance of a review by a concurring partner. PCAOB Rule 3400T(b) provides that the concurring partner should not have responsibility for the audit of any significant areas. PCAOB standards also provide that a prior engagement partner should not serve as the concurring partner reviewer for at least two annual audits following his or her last year as the audit engagement partner. The Firm's concurring partner acted as the manager on the audit engagement, performed audit procedures regarding revenue, inventory, and intangible assets, and was the previous year's audit engagement partner.

d. Technical Competence
The audit performance deficiencies discussed in Part II.A suggest that the Firm's system of quality control may not provide sufficient assurance that audit partners and staff have the level of knowledge and the degree of technical training and proficiency required in the circumstances and that they refer to authoritative literature or other sources and consult with knowledgeable individuals (within or outside the Firm) when appropriate.


Appendix B: Two examples of Quality Control Criticisms made public by an AICPA peer review

Firm No: 5358149

Firm No: 10091444