Key Takeaway

Past performance can be useful—but only when it’s credible, comparable, and explainable. A correct evaluation approach starts with data integrity (GIGO), then tests comparability (same methods/standards), and finally validates repeatability drivers (philosophy, process, people). Without these, performance numbers can mislead more than they inform.

Post Content

Past performance is not a guarantee of future results.

We all see the disclaimer. Yet most investors still overweight past returns when deciding who will manage their money.

So should you ignore performance history entirely? No. Past performance can be useful—but only if you evaluate it correctly.

A more honest framing is:
Past superior performance does not guarantee future superior performance—but weak or persistently poor outcomes can be a warning sign, especially when paired with weak controls, inconsistent process, or unclear reporting. (Source: CIPM curriculum reference listed below.)

Before you rush to compare returns, there are three traps you must avoid—and a simple checklist you can use immediately.

The 3 Traps That Break Most Performance Evaluations

1) Garbage In, Garbage Out (GIGO)

If the inputs are unreliable, the outputs are misleading—no matter how impressive the chart looks.

What to check:

  • Are valuations timely and consistent?
  • Are cash flows handled correctly?
  • Are fees treated consistently (gross vs net)?
  • Are returns calculated using a method that matches the strategy?

Investor translation: If the data is messy, the performance is noise. The short sketch below shows how much a single mishandled cash flow can distort a reported return.
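To make that concrete, here is a minimal sketch in Python using hypothetical numbers (a 100 starting value, a 50 mid-period deposit, and illustrative ending values, not data from any real portfolio). It shows how an ignored deposit can turn a roughly 8% investment result into a headline 60% figure.

  # Minimal sketch with hypothetical values: why cash-flow handling matters.
  # A portfolio starts at 100, receives a 50 deposit mid-period, and ends at 160.
  begin_value = 100.0
  value_before_deposit = 104.0   # portfolio value just before the deposit arrives
  deposit = 50.0                 # external cash flow (the client adds money)
  end_value = 160.0

  # Naive return that ignores the deposit: most of the "growth" is new money,
  # not investment skill, so this figure overstates the manager's result.
  naive_return = end_value / begin_value - 1                    # 60.0%

  # Time-weighted return: chain the sub-periods around the cash flow so the
  # deposit itself is not counted as performance.
  sub_period_1 = value_before_deposit / begin_value             # 1.040
  sub_period_2 = end_value / (value_before_deposit + deposit)   # about 1.039
  time_weighted_return = sub_period_1 * sub_period_2 - 1        # about 8.1%

  print(f"Naive return:         {naive_return:.1%}")
  print(f"Time-weighted return: {time_weighted_return:.1%}")

Same portfolio, same period, very different numbers; that gap is exactly why the return-calculation question belongs in the checklist.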

2) Numbers Without the “Repeatability Drivers”

Even real performance can be accidental. What matters is whether the firm can repeat outcomes through time.

Repeatability drivers:

  • Philosophy: What edge do they claim and why should it persist?
  • Process: How is that philosophy executed day-to-day?
  • People: Who makes decisions, and what happens if key people leave?

Investor translation: A strong process can survive a bad quarter. A weak process can’t be trusted after a good one.

3) Comparing Managers Without Comparable Methods

Comparing two managers is meaningless if their performance numbers weren’t measured and presented the same way.

What to check:

  • Are results presented consistently across time?
  • Are composites or peer groupings defined consistently?
  • Are disclosures complete enough to interpret the numbers?

A practical solution is to favor managers that follow a recognized performance reporting standard such as GIPS® (Global Investment Performance Standards), which is designed to support fair representation and full disclosure.

The Practical “Do This Now” Checklist (Manager Selection)

Use this as your quick performance evaluation workflow:

Step 1 — Confirm credibility of the numbers

Ask the manager:

  • “How do you calculate returns (time-weighted vs money-weighted) and why?”
  • “Are returns shown gross and/or net of fees, and which fees are included?” (See the short example at the end of this step.)
  • “How do you value portfolios and handle cash flows?”

If answers are unclear or inconsistent, stop and dig deeper.
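For the gross-versus-net question, here is a small sketch with hypothetical figures (an 8% gross annual return and a 1.5% all-in fee, subtracted directly from the return as a simplification). None of these numbers describe a real manager; they only show how much the fee treatment changes a ten-year growth figure.

  # Hypothetical sketch: why "gross vs net of fees" changes the picture.
  gross_annual_return = 0.08   # 8% per year before fees (illustrative)
  annual_fee = 0.015           # 1.5% all-in annual fee (illustrative)
  years = 10

  # Simplification: the fee is subtracted directly from the annual return.
  gross_growth = (1 + gross_annual_return) ** years
  net_growth = (1 + gross_annual_return - annual_fee) ** years

  print(f"Gross of fees, $1 grows to about ${gross_growth:.2f}")  # ~$2.16
  print(f"Net of fees,   $1 grows to about ${net_growth:.2f}")    # ~$1.88

If a presentation never says which of those two figures you are looking at, it cannot be compared with anything else.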

Step 2 — Confirm comparability

Ask for:

  • A consistent track record presentation, not cherry-picked periods (see the short example after this list)
  • The benchmark approach and its rationale
  • Disclosures that explain what’s in the strategy and what’s excluded
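Here is that short example on period selection, using a made-up five-year return history (the yearly figures are purely illustrative). Dropping the one bad year moves the annualized number from roughly 4% to over 11%.

  # Hypothetical sketch: how a cherry-picked window flatters a track record.
  yearly_returns = [-0.20, 0.12, 0.10, 0.15, 0.09]   # illustrative 5-year history

  def annualized(returns):
      # Geometric (compound) average annual return for a list of yearly returns.
      growth = 1.0
      for r in returns:
          growth *= 1 + r
      return growth ** (1 / len(returns)) - 1

  full_record = annualized(yearly_returns)          # includes the -20% year: ~4.3%
  last_four_years = annualized(yearly_returns[1:])  # drops the bad year: ~11.5%

  print(f"Full 5-year record: {full_record:.1%}")
  print(f"Last 4 years only:  {last_four_years:.1%}")

Same manager, same history, two very different stories, which is why a consistent presentation period matters.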

If they mention GIPS, ask:

  • “Do you claim GIPS compliance?”
  • “Have you been independently verified?” (Verification is optional under GIPS, but it adds credibility.)
  • “Can you share a compliant presentation for the relevant composite?”

Step 3 — Test repeatability

Ask:

  • “What’s your investment philosophy in one sentence?”
  • “What must be true for your strategy to work?”
  • “What are your risk limits and sell discipline?”
  • “How do you ensure consistency across analysts/PMs?”
  • “What is your succession plan / key person risk?”

Step 4 — Look for alignment and operational strength

Because performance doesn’t exist in a vacuum, also ask:

  • “Who oversees performance reporting and error correction?”
  • “How do you handle mistakes when they occur?”
  • “What controls ensure the numbers stay consistent over time?”

The Bottom Line

Performance history is useful only when it’s:

  1. built on reliable inputs (GIGO check),
  2. comparable across managers (methods + disclosure), and
  3. supported by repeatable drivers (philosophy, process, people).

If you skip those steps, you’re not evaluating performance—you’re evaluating marketing.

What’s Next

In upcoming posts, I’ll break down:

  • the most common performance “red flags” investors miss,
  • questions to ask in due diligence meetings, and
  • how standards like GIPS® reduce comparability risk.

Discussion prompt

  • As an investor, what factors do you rely on when selecting a manager or mutual fund, and have they actually worked for you over time?
  • As an investment manager, what questions do allocators ask most often, and how do you answer them?

Reference

CIPM 2014 Principles Curriculum – Chapter 2 in Essays on Manager Selection by Scott D. Stewart, PhD, CFA (pp. 601 & 604).
