

Internal audit potential for not-for-profit organizations

03.04.20

Editor’s note: Please read this if you are a not-for-profit board member, CFO, or any other decision maker within a not-for-profit.

In a time when not-for-profit (NFP) organizations struggle with limited resources and a small back office, it is important not to overlook internal audit procedures. Over the years, internal audit departments have often been among the first cut when budgets are tight. Yet limited resources make these procedures all the more important in safeguarding the organization’s assets. Taking the time to perform strategic internal audit procedures can identify fraud, promote ethical behavior, help monitor compliance, and uncover inefficiencies. All of these lead to a more sustainable, ethical, and efficient organization.

Internal audit approaches

The internal audit function can take many different forms, depending on the size of the organization. Between a dedicated internal audit department and doing nothing at all, there are options. For example:

  • A hybrid approach, where specific procedures are performed by an internal team, with other procedures outsourced. 
  • An ad hoc approach, where the board or management directs the work of a staff member.

The hybrid approach allows the organization to hire specialists for more technical tasks, such as an in-depth financial analysis or IT risk assessment. It also recognizes that internal staff may be best suited to handle certain internal audit functions within their scope of work or breadth of knowledge. This may add cost, but it lets you perform functions otherwise outside your capacity without adding significant burden to staff.

The ad hoc approach allows you to begin the work of internal audit, even on a small scale, without the startup time required to outsource the work. This approach uses internal staff for all functions directed by the board or management. That makes the ad hoc approach more budget-friendly, since external consultants don’t need to be hired, though you will have to be wary of overburdening your staff.

With proper objectivity and oversight, you can perform these functions internally. To bring the process to your organization, first find a champion for the project (CFO, controller, compliance officer, etc.) who can free up staff time and resources to perform these tasks and see the work through to the end. Other steps to take include:

  1. Get the audit/finance committee on board to help communicate the value of the internal audit and review results of the work
  2. Identify specific times of year when these processes are less intrusive and won’t tax staff 
  3. Get involved in the risk management process to help identify where internal audit can best address the most significant risks at the organization
  4. Leverage others who have had success with these processes to improve your own process and implementation
  5. Create a timeline and maintain accountability for reporting and follow-up on corrective actions

Once you have taken these steps, the next step in your internal audit process is a thoughtful and thorough risk assessment. This is key, as the risk assessment will guide and focus the organization’s internal audit work and determine which functions to prioritize. Even a targeted risk assessment can help: an organization of any size can walk through a few transaction cycles (gift receipts or payroll, for example) and identify a step or two in the process that can be strengthened to prevent fraud, waste, and abuse.

Here are a few examples of internal audit projects we have helped clients with:

  • Payroll analysis—in-depth process mapping of the payroll cycle to identify areas for improvement
  • Health and education facilities performance audit—analysis of various program policies and procedures to optimize for compliance
  • Agreed-upon procedures engagement—review of contract and invoice/timesheet information to ensure proper contractor selection and compliant billing and invoicing procedures

Internal audits for companies of all sizes

Regardless of size, your organization can benefit from internal audit functions. Embracing internal audit will increase organizational resilience and the ability to adapt to change, whether your organization performs internal audit functions in-house, outsources them, or uses a combination of the two. For more information about how your organization can benefit from an internal audit, or if you have questions, contact us.


Read this if you are a police executive, city/county administrator, or elected government official, responsible for a law enforcement agency. 

“We need more cops!”  

Do your patrol officers complain about being short-staffed or too busy, or that they are constantly running from call to call? Does your agency struggle with backed-up calls for service (CFS) or lengthy response times? Do patrol staff regularly find themselves responding to another patrol area to handle a CFS because the assigned officer is busy on another call? Are patrol officers denied leave time or training opportunities because of staffing issues? Does the agency routinely use overtime to cover predictable shift vacancies for vacations, holidays, or training? 

If one or more of these concerns sound familiar, you may need additional patrol resources, as staffing levels are often a key factor in personnel deployment challenges. Flaws in patrol schedule design may also be responsible: design issues commonly reduce efficiency and undermine optimal performance, and they may contribute to some of these challenges regardless of authorized staffing levels.
 
With community expectations at an all-time high, and resource allocations remaining relatively flat, many agencies have growing concerns about managing increasing service volumes while controlling quality and building/maintaining public trust and confidence. Amid these concerns, agencies struggle with designing work schedules that efficiently and optimally deploy available patrol resources, as patrol staff become increasingly frustrated at what they consider a lack of staff.

The path to resolving inefficiencies in your patrol work schedule and optimizing the effective deployment of patrol personnel requires thoughtful consideration of several overarching goals:

  • Reducing or eliminating predictable overtime
  • Eliminating peaks and valleys in staffing due to scheduled leave
  • Ensuring appropriate staffing levels in all patrol zones or beats
  • Providing sufficient staff to manage multiple and priority CFS in patrol zones or beats
  • Satisfying both operational and staff needs, including helping to ensure a proper work/life balance and equitable workloads for patrol staff

Scheduling alternatives

One common design issue that presents an ongoing challenge for agencies is the continued use of traditional, balanced work schedules, which spread officer work hours equally over the year. Balanced schedules rely on over-scheduling and overtime to manage personnel allocation and leave needs and, by design, are very rigid. Balanced work schedules have been used for a very long time, not because they’re most efficient, but because they’re common, familiar, and easily understood―and because patrol staff are comfortable with them (and typically reluctant to change). However, short schedules offer a proven alternative to balanced patrol work schedules, and when presented with the benefits of an alternative work schedule design (e.g., increased access to back-up, ease of receiving time off or training, consistency in staffing, less mandatory overtime), many patrol staff are eager to change.

Short schedules

Short schedules use a more contemporary, flexible design that focuses on adaptively allocating personnel where and when they are needed. They are significantly more efficient than balanced schedules and, when functioning properly, can dramatically improve personnel deployments, bring continuity to daily staffing, and reduce overtime, among other operational benefits. Given the current climate, most agencies are unlikely to receive substantial increases in personnel allocations. If that is true of your agency, it may be time to explore the benefits of alternative patrol work schedules.
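Whether a given schedule actually puts officers where the workload is can be checked with simple coverage arithmetic. The following minimal Python sketch is purely illustrative—the shift definitions, hourly demand profile, and coverage threshold are all invented, and this is not BerryDunn’s evaluation tool—but it shows the basic idea: tally on-duty officers by hour and flag hours where coverage looks thin relative to demand.

  # Minimal sketch: compare hourly on-duty coverage from a shift plan
  # against a hypothetical hourly demand profile. All numbers are
  # invented for illustration; a real analysis would use agency CFS
  # and roster data.

  # Each shift: (start_hour, length_hours, officers_assigned)
  SHIFTS = [(7, 10, 6), (17, 10, 8), (22, 10, 5)]

  # Hypothetical relative demand (calls for service) by hour of day
  DEMAND = [2, 2, 2, 1, 1, 1, 2, 3, 4, 4, 4, 5,
            5, 5, 6, 6, 7, 8, 8, 7, 6, 5, 4, 3]

  def coverage_by_hour(shifts):
      on_duty = [0] * 24
      for start, length, officers in shifts:
          for h in range(start, start + length):
              on_duty[h % 24] += officers
      return on_duty

  cover = coverage_by_hour(SHIFTS)
  for hour in range(24):
      ratio = cover[hour] / DEMAND[hour]
      flag = "LOW" if ratio < 1.5 else ""  # arbitrary illustrative threshold
      print(f"{hour:02d}:00  on duty={cover[hour]:2d}  demand={DEMAND[hour]}  {flag}")

A report like this makes staffing peaks and valleys visible at a glance, which is the first step toward deciding whether the problem is headcount or schedule design.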

A tool you can use

Finding scheduling strategies that work in this climate requires an intentional approach, customized to your agency’s characteristics (e.g., staffing levels, geographic factors, crime rates, zone/beat design, contract/labor rules). To help guide you through this process, BerryDunn has developed a free tool for evaluating patrol schedules. Click here to measure your patrol schedule against key design components and considerations.

If you are curious about alternative patrol work schedules, our dedicated justice and public safety consultants are available to discuss your organization’s needs.

Article
Efficient police patrol work schedules―By design

Read this if you are an Institutional Research (IR) Director, a Registrar, or are in the C-Suite.

In my last blog, I defined the what and the why of data governance, and outlined the value of data governance in higher education environments. I also asserted data isn’t the problem―the real culprit is our handling of the data (or rather, our deferral of data responsibility to others).

While I remain convinced that data isn’t the problem, recent experiences in the field have confirmed that data governance is problematic. So much so, in fact, that I believe data governance defies a “solid,” point-in-time solution. Discouraged? Don’t be. Just recalibrate your expectations and pursue an adaptive strategy.

This starts with developing data governance guiding principles, with three initial points to consider: 

  1. Key stakeholders should develop your institution’s guiding principles. The team should include representatives from areas such as the office of the Registrar, Human Resources, Institutional Research, and other significant producers and consumers of institutional data. 
  2. The focus of your guiding principles must be on the strategic outcomes your institution is trying to achieve, and the information needed for data-driven decision-making.
  3. Specific guiding principles will vary from institution to institution; effective data governance requires both structure and flexibility.

Here are some baseline principles your institution may want to adopt and modify to suit your particular needs.

  • Data governance entails iterative processes, attention to measures and metrics, and ongoing effort. The institution’s governance framework should be transparent, practical, and agile. This ensures that governance is seen as beneficial to data management and not an impediment.
  • Governance is an enabler. The institution’s work should help accomplish objectives and solve problems aligned with strategic priorities.
  • Work with the big picture in mind. Start from the vantage point that data is an institutional asset. Without an institutional-asset mentality, it’s difficult to break down the silos that keep data from being valuable to the organization.
  • The institution should identify data trustees and stewards who will lead the data governance efforts at your institution.
    • Data trustees should have the highest level of responsibility for the custodianship of data.
    • Data stewards should act on behalf of data trustees, and be accountable for managing and maintaining data.
  • Data quality needs to be baked into the governance process. The institution should build data quality into every step of capture and entry; this increases user confidence in the integrity of the data. The institution should also develop working agreements for sharing and accessing data across organizational lines, and should strive for processes and documentation that are consistent, manageable, and effective. This helps projects run smoothly, with consistent results every time.
  • The institution should pay attention to building security into the data usage cycle. An institution’s security measures and practices need to be inherent in the day-to-day management of data, and balanced with the working agreements mentioned above. This keeps data secure and protected for the entire organization.
  • Agreed-upon rules and guidelines should be developed to support the data governance structure and decision-making. The institution should define and use pragmatic approaches and practical plans that reward sustainability and collaboration, building a successful roadmap for the future.

Next steps

Are you curious about additional guiding principles? Contact me. In the meantime, keep your eyes peeled for a future blog that digs deeper into the roles of data trustees and stewards.
 

Article
Governance: It's good for your data

LIBOR is leaving—is your financial institution ready to make the most of it?

In July 2017, the UK’s Financial Conduct Authority announced the phasing out of the London Interbank Offered Rate, commonly known as LIBOR, by the end of 2021.[1] With less than two years to go, US federal regulators are urging financial institutions to start assessing their LIBOR exposure and planning their transition. Here we offer some general impacts of the phase-out and some specific actions your institution can take to prepare.

How will the phase-out impact financial institutions?

The Federal Reserve estimates roughly $200 trillion in LIBOR-indexed notional value transactions in the cash and derivatives market.[2] LIBOR is used to help price a variety of financial services products, including $3.4 trillion in business loans and $1.3 trillion in consumer loans, as well as derivatives, swaps, and other credit instruments. Even excluding loans and financial instruments set to mature before 2021—estimated by the FDIC at 82% of the above $200 trillion—the remaining exposure of roughly $36 trillion is still significant.[3]

A financial institution’s ability to lend money depends largely on the stability of its capital position. For institutions with a significant amount of LIBOR-indexed assets and liabilities, the phase-out means less certainty in expected future cash flows and a less stable capital position, which could prompt institutions to deny loans they might otherwise have approved. A change in expected cash flows could also have several indirect consequences: criticized assets, assessed for impairment based on their expected future cash flows, could require a specific reserve due to a lower present value of those cash flows.

The importance of fallback language in loan agreements

Fallback language in loan agreements plays a pivotal role in financial institutions’ ability to manage their LIBOR-related financial results. Most loan agreements include language that provides guidance for determining an alternate reference rate to “fall back” on in the event the loan’s original reference rate is discontinued. However, if this language is non-existent, contains fallbacks that are no longer adequate, or lacks certain key provisions, it can create unexpected issues when it comes time for financial institutions to reprice their LIBOR loans. Here are some examples:

  • Non-existent or inadequate fallbacks
    According to the Alternative Reference Rates Committee, a group of private-market participants convened by the Federal Reserve to help ensure a successful LIBOR transition, "Most contracts referencing LIBOR do not appear to have envisioned a permanent or indefinite cessation of LIBOR and have fallbacks that would not be economically appropriate."[4]

    For instance, industry regulators have warned that without updated fallback language, the discontinuation of LIBOR could prompt some variable-rate loans to become fixed-rate,[2] causing unanticipated changes in interest rate risk for financial institutions. In a declining rate environment, this may prove beneficial as loans at variable rates become fixed. But in a rising rate environment, the resulting shrink in net interest margins would have a direct and adverse impact on the bottom line.

  • No spread adjustment
    Once LIBOR is discontinued, LIBOR-indexed loans will need to be repriced at a new reference rate, which could be well above or below LIBOR. If loan agreements don’t provide for an adjustment of the spread between LIBOR and the new rate, that could prompt unexpected changes in the financial position of both borrowers and lenders.[3] Take, for instance, a loan made at the Secured Overnight Financing Rate (SOFR), generally considered the likely replacement for USD LIBOR. Since SOFR tends to be lower than three-month LIBOR, a loan agreement using it that does not allow for a spread adjustment would generate lower loan payments for the borrower, which means less interest income for the lender.

    Not allowing for a spread adjustment on reference rates lower than LIBOR could also cause a change in expected prepayments—say, for instance, if borrowers with fixed-rate loans decide to refinance at adjustable rates—which would impact post-CECL allowance calculations like the weighted-average remaining maturity (WARM) method, which uses estimated prepayments as an input.
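To see the arithmetic behind a spread adjustment, consider the minimal Python sketch below. Every rate is hypothetical, and the simple historical-basis adjustment shown is only one illustrative approach, not a prescribed repricing methodology.

  # Illustrative arithmetic only; the rates and the basis adjustment
  # are hypothetical, not market data or a recommended method.

  balance = 1_000_000          # loan principal
  libor_3m = 0.0230            # assumed three-month LIBOR at repricing
  sofr = 0.0210                # assumed SOFR (typically below 3M LIBOR)
  loan_spread = 0.0250         # contractual spread over the reference rate

  # Naive fallback: swap the index with no spread adjustment
  rate_no_adjustment = sofr + loan_spread

  # Fallback with a spread adjustment reflecting the LIBOR-SOFR basis,
  # keeping the all-in rate economically similar to the old one
  basis_adjustment = libor_3m - sofr
  rate_adjusted = sofr + basis_adjustment + loan_spread

  print(f"Old all-in rate:        {libor_3m + loan_spread:.4%}")
  print(f"No spread adjustment:   {rate_no_adjustment:.4%}")
  print(f"With spread adjustment: {rate_adjusted:.4%}")
  print(f"Interest income lost per year without adjustment: "
        f"${balance * (rate_adjusted - rate_no_adjustment):,.2f}")

On these invented numbers, skipping the adjustment quietly costs the lender 20 basis points—$2,000 a year on a single $1 million loan—exactly the kind of unexpected change in financial position described above.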

What can your financial institution do to prepare?

The Federal Reserve and the SEC have urged financial institutions to immediately evaluate their LIBOR exposure and expedite their transition. Though the FDIC has expressed no intent to examine financial institutions for the status of LIBOR planning or critique loans based on use of LIBOR,[3] Federal Reserve supervisory teams have been including LIBOR transitions in their regular monitoring of large financial institutions.[5] The SEC has also encouraged companies to provide investors with robust disclosures regarding their LIBOR transition, which may include the notional value of LIBOR exposure.[2]

Financial institutions should start by analyzing their LIBOR exposure beyond 2021. If you don’t expect significant exposure, further analysis may be unnecessary. However, if you do expect significant future LIBOR exposure, your institution should conduct stress testing using LIBOR as an isolated variable by running hypothetical transition scenarios and assessing the potential financial impact.
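What such a scenario exercise might look like, in its simplest possible form, is sketched below in Python. The balances and scenario shifts are invented, and a real stress test would also model repricing dates, optionality, and fallback terms.

  # Minimal sketch of isolating the reference rate as a stress variable.
  # Portfolio figures and scenario spreads are invented for illustration.

  libor_indexed_assets = 250_000_000       # hypothetical balances
  libor_indexed_liabilities = 180_000_000

  # Hypothetical shifts (in basis points) between LIBOR and its replacement
  scenarios_bps = [-25, -10, 0, 10, 25]

  for shift in scenarios_bps:
      delta = shift / 10_000
      # Change in annual net interest income if all LIBOR positions
      # reprice at LIBOR + shift
      nii_impact = (libor_indexed_assets - libor_indexed_liabilities) * delta
      print(f"Replacement rate at LIBOR {shift:+d} bps: "
            f"NII impact ${nii_impact:,.0f}/year")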

Closely examine and assess fallback language in loan agreements. For existing loan agreements, you may need to make amendments, which could require consent from counterparties.[2] For new loan agreements maturing beyond 2021, lenders should consider selecting an alternate reference rate. New contract language for financial instruments and residential mortgages is currently being drafted by the International Swaps and Derivatives Association and the Federal Housing Finance Agency, respectively[3]—both of which may prove helpful in updating loan agreements.

Lenders should also consider their underwriting policies. Loan underwriters will need to adjust the spread on new loans to accurately reflect the price of risk, because volatility and market tendencies of alternate loan reference rates may not mirror LIBOR’s. What’s more, SOFR lacks abundant historical data for use in analyzing volatility and market tendencies, making accurate loan pricing more difficult.

Conclusion: Start assessing your LIBOR risk soon

The cessation of LIBOR brings challenges and opportunities that will require in-depth analysis and difficult decisions. Financial institutions and consumers should heed the advice of regulators and start assessing their LIBOR risk now. Those that do will be not only better prepared, but also better positioned, to capitalize on the opportunities the transition presents.

Need help assessing your LIBOR risk and preparing to transition? Contact BerryDunn’s financial services specialists.

[1] https://www.washingtonpost.com/business/2017/07/27/acdd411c-72bc-11e7-8c17-533c52b2f014_story.html?utm_term=.856137e72385
[2] Thomson Reuters Checkpoint Newsstand, April 10, 2019
[3] https://www.fdic.gov/regulations/examinations/supervisory/insights/siwin18/si-winter-2018.pdf
[4] https://bankingjournal.aba.com/2019/04/libor-transition-panel-recommends-fallback-language-for-key-instruments/
[5] https://www.reuters.com/article/us-usa-fed-libor/fed-urges-u-s-financial-industry-to-accelerate-libor-transition-idUSKCN1RM25T

Article
When one loan rate closes, another opens

Who has the time or resources to keep tabs on everything that everyone in an organization does? No one. Therefore, you naturally need to trust (at least on a certain level) the actions and motives of various personnel. At the top of your “trust level” are privileged users—such as system and network administrators and developers—who keep vital systems, applications, and hardware up and running. Yet, according to the 2019 Centrify Privileged Access Management in the Modern Threatscape survey, 74% of data breaches occurred using privileged accounts. The survey also revealed that of the organizations responding:

  • 52% do not use password vaulting—password vaulting can help privileged users keep track of long, complex passwords for multiple accounts in an encrypted storage vault.
  • 65% still share the use of root and other privileged access—when the use of root accounts is required, users should invoke commands that inherit the privileges of the account (e.g., via sudo) without actually logging in as the account. This ensures that who used the account can be tracked.
  • Only 21% have implemented multi-factor authentication—the obvious benefit of multi-factor authentication is stronger authentication security, but in many sectors it is also becoming a compliance requirement.
  • Only 47% have implemented complete auditing and monitoring—thorough auditing and monitoring are vital to securing privileged accounts.

So how does one even begin to trust privileged accounts in today’s environment? 

1. Start with an inventory

To best manage and monitor your privileged accounts, start by finding and cataloguing all assets (servers, applications, databases, network devices, etc.) within the organization. This will benefit all areas of information security, such as asset management, change control, and software inventory tracking. Next, inventory all users of each asset and ensure that privileged user accounts:

  • Have privileges granted based on roles and responsibilities
  • Require strong and complex passwords (exceeding those of normal users)
  • Have passwords that expire often (30 days recommended)
  • Implement multi-factor authentication
  • Are not shared with others and are not used for normal activity (the user of the privileged account should have a separate account for non-privileged or non-administrative activities)

If an account is only required for a service or application, disable the account’s ability to log in from the server console and from across the network.
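As an illustration of how such a checklist review might be automated, here is a minimal Python sketch. The account records and field names are hypothetical; in practice, this data would come from your directory services and password vault exports.

  # Minimal sketch: check a privileged-account inventory against the
  # policy points above. Account records and field names are invented.
  from datetime import date, timedelta

  MAX_PASSWORD_AGE = timedelta(days=30)

  accounts = [
      {"name": "dba_admin", "role_based": True, "mfa": True,
       "shared": False, "service_only": False, "interactive_login": True,
       "password_set": date(2020, 1, 15)},
      {"name": "svc_backup", "role_based": True, "mfa": False,
       "shared": True, "service_only": True, "interactive_login": True,
       "password_set": date(2019, 11, 2)},
  ]

  def audit(account, today=date(2020, 3, 4)):
      findings = []
      if not account["role_based"]:
          findings.append("privileges not tied to role/responsibility")
      if today - account["password_set"] > MAX_PASSWORD_AGE:
          findings.append("password older than 30 days")
      if not account["mfa"]:
          findings.append("multi-factor authentication not enabled")
      if account["shared"]:
          findings.append("account is shared")
      if account["service_only"] and account["interactive_login"]:
          findings.append("service account allows interactive login")
      return findings

  for acct in accounts:
      for finding in audit(acct):
          print(f"{acct['name']}: {finding}")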

2. Monitor—then monitor some more

The next step is to monitor the use of the identified privileged accounts. Enable event logging on all systems and aggregate the logs to a log monitoring or Security Information and Event Management (SIEM) system that alerts in real time when privileged accounts are active. Configure the system to alert you when privileged accounts access sensitive data or alter database structures. Report any changes to device configurations, file structure, code, and executable programs. If these changes do not correlate to an approved change request, treat them as incidents and investigate.

Consider software that analyzes user behavior and identifies deviations from normal activity. A privileged account accessing data or systems outside its normal routine could indicate malicious activity or a database attack from a compromised privileged account.
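A simple statistical baseline is one way to flag such deviations. The minimal Python sketch below is illustrative only—the event counts and the three-standard-deviation threshold are invented, and commercial user-behavior analytics are far more sophisticated.

  # Minimal sketch: flag deviations from an account's normal activity.
  # Event counts are invented; a SIEM would supply real data.
  from statistics import mean, stdev

  # Daily counts of sensitive-data accesses by one privileged account
  baseline = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3, 5, 4, 3, 4]
  today = 19

  mu, sigma = mean(baseline), stdev(baseline)
  z = (today - mu) / sigma
  if z > 3:  # arbitrary illustrative threshold
      print(f"ALERT: {today} events is {z:.1f} standard deviations above "
            f"the {mu:.1f}-event baseline; open an incident.")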

3. Secure the event logs

Finally, ensure that none of your privileged accounts have access to the logs used for monitoring, or the ability to alter or delete those logs. In addition to real-time monitoring and alerting, the log management system should be able to produce reports for periodic review by information security staff. The reports should also be archived for forensic purposes in the event of a breach or compromise.

Gain further assistance (and peace of mind) 

BerryDunn understands how privileged accounts should be monitored and audited. We can help your organization assess your current event management process and make recommendations if improvements are needed. Contact our team.

Article
Trusting privileged accounts in the age of data breaches

“The world is one big data problem,” says MIT scientist and visionary Andrew McAfee.

That’s a daunting (though hardly surprising) quote for many in data-rich sectors, including higher education. Yet blaming data is like blaming air for a malfunctioning wind turbine. Data is a valuable asset that can make your institution move.

To many of us, however, data remains a four-letter word. The real culprit behind the perceived data problem is our handling and perception of data and the role it can play in our success—that is, the relegating of data to a select, responsible few, who are usually separated into hardened silos. For example, a common assumption in higher education is that the IT team can handle it. Not so. Data needs to be viewed as an institutional asset, consumed by many and used by the institution for the strategic purposes of student success, scholarship, and more.

The first step in addressing your “big” data problem? Data governance.

What is data governance?

There are various definitions, but the one we use with our clients is “the ongoing and evolutionary process driven by leaders to establish principles, policies, business rules, and metrics for data sharing.”

Please note that “IT” does not appear anywhere in this definition.

Why is data governance necessary? For many reasons, including:

  1. Data governance enables analytics. Without data governance, analytics initiatives will produce inconsistent results, making it difficult to gain value from them. A critical first step in any data analytics initiative is to make sure that definitions are widely accepted and standards have been established. This step allows decision makers to have confidence in the data being analyzed to describe, predict, and improve operations.
     
  2. Data governance strengthens privacy, security, and compliance. Compliance requirements for both public and private institutions constantly evolve. The more data-reliant your world becomes, the more protected your data needs to be. If an organization does not implement security practices as part of its data governance framework, it becomes easier to fall out of compliance. 
     
  3. Data governance supports agility. How many times have reports for basic information (part-time faculty or student FTEs per semester, for example) been requested, reviewed, and returned for further clarification or correction? And that’s just within your department! Now add multiple requests from different departments, and you’re surely going through multiple iterations to create that report. That takes time and effort. By strengthening your data governance framework, you can streamline reporting processes by increasing the level of trust you have in the information you are seeking.

Understanding the value of data governance is the easy part. The real trick is implementing a sustainable data governance framework that recognizes that data is an institutional asset and not just a four-letter word.

Stay tuned for part two of this blog series: The how of data governance in higher education. In the meantime, reach out to me if you would like to discuss additional data governance benefits for your institution.

Article
Data is a four-letter word. Governance is not.

Modernization means different things to different people—especially in the context of state government. For some, it is the cause of a messy chain reaction that ends (at best) in frustration and inefficiency. For others, it is the beneficial effect of a thoughtful and well-planned series of steps. The difference lies in the approach to the transition, and states will soon discover this as they begin using the new Comprehensive Child Welfare Information System (CCWIS), a case management information system that helps them provide citizens with customized child welfare services.

The benefits of CCWIS are numerous and impressive, raising the bar for child welfare and providing opportunities to advance through innovative technology that promotes interoperability, flexibility, improved management, mobility, and integration. However, taking advantage of these benefits will also present challenges. Gone are the days of the cookie-cutter, “one-size-fits-all” approach. Here are five facts to consider as you transition toward an effective modernization.

  1. There are advantages and challenges to buying a system versus building a system internally. CCWIS transition may involve either purchasing a complete commercial off-the-shelf (COTS) product that suits the state, or constructing a new system internally with the implementation of a few purchased modules. To decide which option is best, first assess your current systems and staff needs. Specifically, consider executing a cost-benefit analysis of options, taking into account internal resource capabilities, feasibility, flexibility, and time. This analysis will provide valuable data that help you assess the current environment and identify functional gaps. Equipped with this information, you should be ready to decide whether to invest in a COTS product, or an internally-built system that supports the state’s vision and complies with new CCWIS regulations.
     
  2. Employ a modular approach to upgrading current systems or building new systems. The Children’s Bureau—an office of the Administration for Children & Families within the U.S. Department of Health and Human Services—defines “modularity” as the breaking down of complex functions into separate, manageable, and independent components. Using this modular approach, CCWIS will feature components that function independently, simplifying future upgrades or procurements because they can be completed on individual modules rather than the entire system. Modular systems create flexibility, and enable you to break down complex functions such as “Assessment and Intake,” “Case Management,” and “Claims and Payment” into modules during the CCWIS transition (see the sketch after this list). This facilitates the development of a sustainable system that is customized to the unique needs of your state, and easily allows for future augmentation.
     
  3. Use Organizational Change Management (OCM) techniques to mitigate stakeholder resistance to change. People are notoriously resistant to change. This is especially true during a disruptive project that impacts day-to-day operations—such as building a new or transitional CCWIS system. Having a comprehensive OCM plan in place before your CCWIS implementation can help ensure that you assign an effective project sponsor, develop thorough project communications, and enact strong training methods. A clear OCM strategy should help mitigate employee resistance to change and can also support your organization in reaching CCWIS goals, due to early buy-in from stakeholders who are key to the project’s success.
     
  4. Data governance policies can help ensure you standardize mandatory data sharing. For example, the Children’s Bureau notes that a Title IV-E agency with a CCWIS must support collaboration, interoperability, and data sharing by exchanging data with Child Support Systems (Title IV-D), Child Abuse/Neglect Systems, Medicaid Management Information Systems (MMIS), and many others as described by the Children’s Bureau.

    Security is a concern due to the large amount of data sharing involved with CCWIS systems. Specifically, if a Title IV-E agency with a CCWIS does not implement foundational data security measures across all jurisdictions, data could become vulnerable, rendering the system non-compliant. However, a data governance framework with standardized policies in place can protect data and surrounding processes.
     
  5. Continuously refer to federal regulations and resources. With the change of systems comes changes in federal regulations. Fortunately, the Children’s Bureau provides guidance and toolkits to assist you in the planning, development, and implementation of CCWIS. Particularly useful documents include the “Child Welfare Policy Manual,” “Data Sharing for Courts and Child Welfare Agencies Toolkit,” and the “CCWIS Final Rule”. A comprehensive list of federal regulations and resources is located on the Children’s Bureau website.

    Additionally, the Children’s Bureau will assign an analyst to each state who can provide direction and counsel during the CCWIS transition. Continual use of these resources will help you reduce confusion, avoid obstacles, and ultimately achieve an efficient modernization program.
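As a concrete, if simplified, illustration of the modular principle from point 2, the Python sketch below defines a narrow interface that an intake workflow depends on, so that either an in-house or a purchased case management module can satisfy it without the workflow changing. The names and methods are invented for illustration and are not part of any CCWIS specification.

  # Minimal sketch of modularity: each function lives behind a narrow
  # interface, so a module can be replaced (bought or rebuilt) without
  # touching the rest of the system. Everything here is invented.
  from typing import Protocol

  class CaseManagementModule(Protocol):
      def open_case(self, family_id: str) -> str: ...

  class InHouseCaseManagement:
      def open_case(self, family_id: str) -> str:
          return f"case-{family_id}-001"

  class VendorCaseManagement:
      """A purchased COTS module satisfying the same interface."""
      def open_case(self, family_id: str) -> str:
          return f"VND:{family_id}"

  def intake_workflow(case_mgmt: CaseManagementModule, family_id: str) -> str:
      # Intake depends only on the interface, not on which module is installed
      return case_mgmt.open_case(family_id)

  print(intake_workflow(InHouseCaseManagement(), "F123"))
  print(intake_workflow(VendorCaseManagement(), "F123"))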

Modernization doesn’t have to be messy. Learn more about how OCM and data governance can benefit your agency or organization.

Article
Five things to keep in mind during your CCWIS transition

Artificial Intelligence, or AI, is no longer the exclusive tool of well-funded government entities and defense contractors, let alone a plot device in science fiction film and literature. Instead, AI is becoming as ubiquitous as the personal computer. The opportunities of what AI can do for internal audit are almost as endless as the challenges this disruptive technology represents.

To understand how AI will influence internal audit, we must first understand what AI is. The concept of AI—a technology that can perceive the world directly and respond to what it perceives—is often attributed to Alan Turing, though the term “Artificial Intelligence” was coined much later, in 1956, at Dartmouth College in Hanover, New Hampshire. Turing was a British scientist who developed the machine that cracked the Nazis’ Enigma code. He thought of AI as a machine that could convince a human that it, too, was human. Turing’s humble description of AI is as simple as it is elegant. Fast-forward some 60 years, and AI is all around us, applied in novel ways almost every day. Just consider autonomous self-driving vehicles, facial recognition systems that can spot a fugitive in a crowd, search engines that tailor our online experience, and even Pandora, which analyzes our tastes in music.

Today, in practice and in theory, there are four types of AI. Type I AI may be best represented by IBM’s Deep Blue, the chess-playing computer that made headlines in 1997 when it won a match against Russian chess champion Garry Kasparov. Type I AI is reactive. Deep Blue can beat a chess champion because it evaluates every piece on the chessboard, calculates all possible moves, and then predicts the optimal move among all possibilities. Type I AI is really nothing more than a super calculator, processing data much faster than the human mind can. This is what gives Type I AI an advantage over humans.
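The exhaustive look-ahead that characterizes Type I AI can be illustrated with a toy minimax search. The Python sketch below plays a trivial invented game (players alternately add 1 or 2 to a running total; whoever reaches 5 wins) rather than chess, and it omits the evaluation heuristics and pruning that made Deep Blue practical—but the reactive pattern is the same: enumerate every move, recurse on the opponent’s best reply, and pick the optimum.

  # Toy illustration of Type I "reactive" search: evaluate all possible
  # moves and pick the best. Not Deep Blue's actual algorithm.

  def minimax(state, maximizing):
      moves = available_moves(state)
      if not moves:
          return score(state), None
      best_move = None
      best_val = float("-inf") if maximizing else float("inf")
      for move in moves:
          val, _ = minimax(play_move(state, move), not maximizing)
          if (maximizing and val > best_val) or (not maximizing and val < best_val):
              best_val, best_move = val, move
      return best_val, best_move

  # Trivial game: players alternately add 1 or 2 to a running total;
  # whoever makes the total reach exactly 5 wins.
  def available_moves(state):
      total, _ = state
      return [m for m in (1, 2) if total + m <= 5]

  def play_move(state, move):
      total, to_move = state
      return (total + move, 1 - to_move)

  def score(state):
      _, to_move = state
      # The previous player reached 5; +1 if the maximizer (player 0) did
      return 1 if to_move == 1 else -1

  print(minimax((0, 0), True))  # optimal value and first move for player 0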

Type II AI, which we find in autonomous cars, is also reactive. For example, it applies the brakes when it predicts a collision; but it also has a limited form of memory. Type II AI can briefly remember details, such as the speed of oncoming traffic or the distance between the car and a bicyclist. However, this memory is volatile: once the situation has passed, Type II AI deletes the data from its memory and moves on to the next challenge down the road.

Type II AI’s simple form of memory management and its ability to “learn” from the world in which it resides are a significant advancement.
The leap from Type II AI to Type III AI has yet to occur. Type III AI will not only incorporate awareness of the world around it, but will also be able to predict the responses and motivations of other entities and objects, and understand that emotions and thoughts are the drivers of behavior. Taking the autonomous car analogy to the next step, Type III AI vehicles will interact with the driver. By conducting a simple assessment of the driver’s emotions, the AI will be able to suggest a soothing playlist to ease the driver’s tensions during the commute, reducing the likelihood of aggressive driving. Lastly, Type IV AI—a milestone that will likely be reached at some point over the next 20 or 30 years—will be self-aware. Not only will Type IV AI soothe the driver, it will interact with the driver as if it were another human riding along for the drive; think of “HAL” in Arthur C. Clarke’s 2001: A Space Odyssey.

So what does this all mean to internal auditors?
While it may be a bit premature to predict AI’s impact on the internal audit profession, AI is already being used to predict control failures in institutions with robust cybersecurity programs. When malicious code is detected and certain conditions are met, AI-enabled devices can either divert the malicious traffic away from sensitive data or shut off access completely until an incident response team has had time to investigate the nature of the attack and take appropriate actions. This may seem a rather rudimentary use of AI, but in large financial institutions or manufacturing facilities, minutes count—and minutes equal dollars. Allowing AI to cut off access to a line of business—a step that may cost the company money (and its reputation)—is a significant leap of faith, and not for the faint of heart. Next-generation AI-enabled devices will have even more capabilities, including behavioral analysis, to predict a user’s intentions before access to data is granted.

In the future, internal audit staff will no doubt train AI to seek out conditions that require deeper analysis, or even to predict when a control will fail. Yet AI will also be able to facilitate the internal audit process in other ways. Consider AI’s role in data quality. Advances in inexpensive data storage (e.g., the cloud) have allowed the creation and aggregation of enormous volumes of data subject to internal audit, making testing that data’s completeness, integrity, and reliability a challenging task given its sheer volume. Future AI will be able to continuously monitor this data, alerting internal auditors not only to the status of data at rest and in motion, but also to potential fraud and disclosures.
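A rule-based precursor to this kind of continuous monitoring already exists in audit analytics. The minimal Python sketch below screens transaction amounts against Benford’s law—a first-digit test long used by auditors to flag data that merits a closer look. The amounts and the review threshold are invented for illustration.

  # Minimal sketch of an automated data screen of the kind AI could run
  # continuously: compare the first-digit distribution of transaction
  # amounts against Benford's law. Amounts and threshold are invented.
  import math
  from collections import Counter

  amounts = [4821.10, 1203.55, 1890.00, 3100.25, 1150.75, 9420.00,
             1725.40, 2304.80, 1042.33, 5121.90, 1999.99, 1311.11]

  first_digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts]
  observed = Counter(first_digits)
  n = len(first_digits)

  for d in range(1, 10):
      expected = math.log10(1 + 1 / d)            # Benford expected share
      actual = observed.get(d, 0) / n
      flag = "REVIEW" if abs(actual - expected) > 0.15 else ""
      print(f"digit {d}: expected {expected:.2f}, actual {actual:.2f} {flag}")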

The analysis won’t stop there.
AI will measure the performance of the data in meeting organizational objectives, and suggest where efficiencies can be gained by focusing technical and human resources where the greatest risks to the organization exist, in near real-time. This will allow internal auditors to develop a common operating picture of the day-to-day activities in their business environments, alerting internal audit when something doesn’t quite look right and requires further investigation.

As promising as AI is, the technology comes with ethical considerations. Because AI is created by humans, it is not always free of human flaws. For instance, AI can become unpredictably biased: AI used in facial recognition systems has made racial judgments based on certain common facial characteristics. In addition, AI that gathers data spanning a person’s financial status, credit status, education, and individual likes and dislikes could be used to profile certain groups for nefarious purposes. Moreover, AI has the potential to be weaponized in ways we have yet to comprehend.

There is also the question of how internal auditors will be able to audit AI. Keeping AI safe from internal fraudsters and external adversaries is going to be paramount. AI’s ability to think and act faster than humans will challenge all of us to create novel ways of designing and testing controls to measure AI’s performance. This, in turn, will likely make partnerships with consultants that can fill knowledge gaps even more valuable. 

Challenges and pitfalls aside, AI will likely have a tremendous positive effect on the internal audit profession by simultaneously identifying risks and evaluating processes and control design. In fact, it is quite possible that the first adopters of AI in many organizations may not be the cybersecurity departments at all, but rather the internal auditor’s office. As a result, future internal auditors will become highly technical professionals and perhaps trailblazers in this new and amazing technology.

Article
Artificial intelligence and the future of internal audit