USPAP Evidence Requirements: A Practical Checklist for M&E Appraisers

The Call You Don't Want to Get

It's a Tuesday afternoon, six months after you submitted a machinery appraisal for a manufacturing lender. Your phone rings. The bank's review appraiser wants to discuss Asset #23 — a 2018 Mazak INTEGREX i-200 CNC turning center you valued at $185,000 OLV.

"Can you walk me through your comps for that one?"

You remember the machine. You remember the value felt right. But can you pull up the three auction results you referenced, the screenshots you captured, the dates those listings were active, and the adjustments you made for condition and configuration?

If you can produce that in five minutes, you're in great shape. If it takes you two hours of digging through old folders, you have a workfile problem.

What USPAP Actually Requires

Standards Rule 7-2 requires personal property appraisers to develop an opinion of value through accepted valuation approaches and methods. Standards Rule 8-2 governs reporting. But the workfile requirement comes from USPAP's Record Keeping Rule, not the Standards Rules, and it's where most M&E appraisers have gaps, often because it's overlooked until it matters.

USPAP defines the workfile as documentation necessary to support the appraiser's analyses, opinions, and conclusions. The standard is clear: another appraiser should be able to review your workfile and understand what you did, how you did it, and why.

"Credible" isn't just about arriving at a defensible number. It's about the trail you leave behind.

The Checklist

Print this out. Tape it to your monitor. Run through it before you submit every report.

Photography Standards

  • Minimum 3 photos per significant asset: overall view, nameplate/data plate, condition detail
  • Nameplate photos must be readable: if you can't read the serial number in the photo, reshoot it or supplement with a typed note
  • EXIF data intact: don't strip metadata from your photos — it proves when and where they were taken
  • Wide shots of each area/building: establishes context for asset locations
  • Condition evidence: capture wear patterns, damage, modifications, or upgrades that informed your rating
  • Photo-to-asset mapping: every photo should be traceable to a specific asset in your workfile

How many photos are enough? More than you think. A reviewer won't question why you took 8 photos of a $300,000 press. They will question why you have zero photos of a $50,000 compressor you valued based on "inspection."
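The photo-to-asset mapping bullet above is easiest to satisfy at the point of capture, by giving each photo a name that traces back to an asset. A minimal Python sketch; the asset IDs, make/model strings, and filenames are hypothetical examples, not a prescribed convention:

```python
# Sketch: build traceable workfile names from camera originals.
# Asset IDs, make/model strings, and shot types are illustrative.
from pathlib import Path

def asset_photo_name(asset_id: str, make_model: str, shot: str, original: str) -> str:
    """Return a traceable workfile filename, keeping the original extension."""
    ext = Path(original).suffix.lower()
    return f"{asset_id}_{make_model}_{shot}{ext}"

# Camera file -> workfile name (copy under the new name; keep the
# original, EXIF-intact camera file untouched)
print(asset_photo_name("Asset23", "Mazak_INTEGREX", "nameplate", "IMG_4872.JPG"))
```

Generate the name, copy the original under it, and keep the untouched camera file alongside it so the EXIF metadata survives.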

Serial Number Documentation

  • Photograph every nameplate you can access: even partial serial numbers are better than none
  • Record format exactly as shown: don't "clean up" serial numbers by removing dashes or spaces — transcribe what's on the plate
  • Note when serial numbers are inaccessible: "S/N plate obscured by guarding" is a legitimate workfile entry. A blank field raises questions
  • Cross-reference against manufacturer records when possible: this catches transcription errors and can confirm year of manufacture
  • Flag duplicate or suspicious serial numbers: identical serials on two assets mean something went wrong
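Duplicate-serial flagging is simple to automate over an asset list. A sketch in Python; the asset data below is illustrative, and serials are compared exactly as transcribed, dashes and all:

```python
# Sketch: flag serial numbers that appear on more than one asset.
# Serials are compared exactly as transcribed from the plate.
from collections import Counter

def duplicate_serials(assets: list[dict]) -> set[str]:
    """Return serial numbers shared by more than one asset."""
    counts = Counter(a["serial"] for a in assets if a.get("serial"))
    return {serial for serial, n in counts.items() if n > 1}

assets = [
    {"id": "Asset23", "serial": "258471-B"},
    {"id": "Asset24", "serial": "258471-B"},  # same serial: investigate
    {"id": "Asset25", "serial": None},        # plate obscured by guarding
]
print(duplicate_serials(assets))
```

Assets with no recorded serial (like the obscured plate above) are skipped rather than treated as matches.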

Comp Source Documentation

This is where most workfiles fall apart. You found three comps six months ago. Can you prove it today?

  • Screenshot every comp at time of use: URLs die, listings expire, auction results get archived. A screenshot is permanent
  • Record the date you accessed each comp: a comp from a 2024 auction used in a 2026 appraisal needs context
  • Document the source: auction house name, dealer name, listing platform, private sale (and how you verified it)
  • Note condition and configuration differences: "Subject has 4th axis, comp did not — adjusted +12%" is exactly what a reviewer wants to see
  • Save full listing details, not just the price: specs, photos, seller descriptions all support your adjustment rationale
  • Maintain at least 2-3 comps per significant asset: a single comp is a data point, not a market analysis
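One way to make the comp checklist systematic is to capture a fixed record per comp at the time you use it, whatever tool you store it in. A sketch; the field names and data are illustrative, not a USPAP-mandated schema:

```python
# Sketch: one structured record per comp, captured at time of use.
# Field names and values are illustrative examples.
import json
from dataclasses import dataclass, asdict, field

@dataclass
class CompRecord:
    source: str           # auction house, dealer, platform, private sale
    sale_date: str        # when the comp sold or was listed
    accessed: str         # when YOU pulled it
    price: int
    screenshot: str       # path to the saved screenshot, not just a URL
    adjustments: dict = field(default_factory=dict)

comp = CompRecord(
    source="Hypothetical Auction Co.",
    sale_date="2024-09-12",
    accessed="2025-03-01",
    price=165_000,
    screenshot="comps/asset23_comp1.png",
    adjustments={"subject has 4th axis, comp did not": "+12%"},
)
print(json.dumps(asdict(comp), indent=2))
```

The screenshot path and access date are the two fields that rescue you when the live listing disappears.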

Condition Rating Consistency

  • Use a defined rating scale and stick to it: whether it's 1-5, Poor/Fair/Good/Excellent, or percentage remaining useful life — pick one and use it everywhere
  • Document your scale in the report: reviewers shouldn't have to guess what "Good" means to you
  • Rate consistently across assets: if two machines show similar wear, they should get similar ratings regardless of which one you inspected first or last
  • Note specific observations that drove the rating: "Good — minor surface rust on base, all axes move freely, recent way cover replacement" beats "Good"
  • Photograph condition evidence: your rating is an opinion, but the photo is a fact
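Defining the scale in one shared place makes the consistency bullets enforceable across a team. A sketch; the labels and definitions are illustrative, not a standard:

```python
# Sketch: one defined condition scale, documented once, applied everywhere.
# Labels and definitions are illustrative examples.
CONDITION_SCALE = {
    "Excellent": "like new; minimal wear; recently reconditioned",
    "Good": "normal wear; fully functional; maintenance current",
    "Fair": "heavy wear; functional but nearing major service",
    "Poor": "deferred maintenance; repairs required for reliable use",
}

def rate(label: str, observation: str) -> str:
    """Pair a rating with the specific observation that drove it."""
    if label not in CONDITION_SCALE:
        raise ValueError(f"undefined rating: {label!r}")
    return f"{label}: {observation}"

print(rate("Good", "minor surface rust on base, all axes move freely"))
```

Rejecting undefined labels is the point: an associate can't quietly introduce a second scale mid-engagement.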

Chain of Custody for Digital Files

  • Original photos preserved: keep the originals, even if you crop or resize copies for the report
  • File naming convention: something traceable — Asset23_Mazak_INTEGREX_nameplate.jpg beats IMG_4872.jpg
  • Backup strategy: if your laptop dies tomorrow, can you reconstruct your workfile?
  • Version control on the report: if you issued a revision, both versions should exist in your workfile with clear dating
  • Engagement documentation: scope of work letter, client correspondence, any limiting conditions discussed
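For the backup and chain-of-custody questions, a checksum manifest is a lightweight way to prove later that nothing in the engagement folder changed. A sketch using Python's standard library, against a hypothetical folder layout:

```python
# Sketch: an integrity manifest for an engagement folder, mapping each
# file's relative path to its SHA-256 digest. Folder layout is hypothetical.
import hashlib
import tempfile
from pathlib import Path

def build_manifest(folder: Path) -> dict[str, str]:
    """Map each file's relative path to its SHA-256 hex digest."""
    return {
        str(p.relative_to(folder)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(folder.rglob("*"))
        if p.is_file()
    }

# Demonstrate against a throwaway folder
with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    (root / "photos").mkdir()
    (root / "photos" / "Asset23_nameplate.jpg").write_bytes(b"fake image bytes")
    manifest = build_manifest(root)
    print(manifest)
```

Save the manifest with the engagement; rerunning it years later and comparing digests shows whether the evidence is byte-for-byte what you submitted.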

Report Version Control

  • Date every draft: not just the final report — if you shared a preliminary value with the client, that communication is part of your workfile
  • Track changes between versions: if Asset #23 changed from $185,000 to $172,000 between draft and final, document why
  • Archive the final as-submitted version: preferably as a locked PDF with the submission date in the filename
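Tracking value changes between versions can be as simple as an append-only log. A sketch; the figures echo the Asset #23 example above, and the stated reason is hypothetical:

```python
# Sketch: an append-only log of value changes between report versions.
# Figures match the Asset #23 example; the reason is a hypothetical.
version_log: list[dict] = []

def log_change(asset: str, old: int, new: int, reason: str, when: str) -> None:
    """Record what changed, from what to what, why, and when."""
    version_log.append(
        {"asset": asset, "old": old, "new": new, "reason": reason, "date": when}
    )

log_change("Asset #23", 185_000, 172_000,
           "adjustment to Comp 2 revised after condition re-review",
           "2025-03-10")
print(version_log[-1])
```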

Common Gaps That Trip Up Appraisers

After years in this business, I've seen the same gaps come up repeatedly during reviews:

Comp documentation decay. You had solid comps when you wrote the report. But you saved a URL instead of a screenshot, and the listing is gone. Now you have a value with no visible support.

Inconsistent condition ratings across team members. You rate condition on a 1-10 scale. Your associate uses Poor/Fair/Good/Excellent. You combine the workbooks and now the report has two different systems.

Missing photos for lower-value assets. You photographed every major machine but skipped the $5,000 items. A reviewer doesn't know which assets you actually inspected versus which you valued from a list.

No methodology documentation for grouped items. You valued 200 hand tools as a lot at $15,000. How? What methodology? What sampling approach? This is where many workfiles go silent.

Future-Proof Your Evidence Trail

USPAP's Record Keeping Rule requires retaining the workfile for at least five years after preparation, or at least two years after final disposition of any judicial proceeding in which you provided testimony, whichever period expires last. Many states require longer, and some lenders specify seven or ten years. The question isn't just "can I produce this today?" but "can I produce this in 2031?"

Three practices that help:

Capture at the point of work. Don't plan to "organize everything later." Link photos to assets, save comp screenshots, and note condition observations while you're in the field or actively researching. Later rarely comes.

Store evidence with the engagement, not scattered across your system. One folder per engagement, everything inside it. Not photos in one place, comps in another, and correspondence in your email.
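The one-folder-per-engagement habit is easy to standardize with a small scaffold script run at kickoff. A sketch; the subfolder names are a suggestion, not a standard:

```python
# Sketch: create a standard engagement folder with everything inside it.
# Subfolder names and the engagement ID format are illustrative.
import tempfile
from pathlib import Path

SUBFOLDERS = ["photos", "comps", "correspondence", "report", "notes"]

def scaffold_engagement(root: Path, engagement_id: str) -> Path:
    """Create the engagement folder and its standard subfolders."""
    base = root / engagement_id
    for sub in SUBFOLDERS:
        (base / sub).mkdir(parents=True, exist_ok=True)
    return base

# Demonstrate against a throwaway root
with tempfile.TemporaryDirectory() as d:
    base = scaffold_engagement(Path(d), "2025-014_AcmeManufacturing")
    created = sorted(p.name for p in base.iterdir())
    print(created)
```

Starting every engagement from the same scaffold means there's exactly one place evidence can live, and one folder to archive when the report ships.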

Always capture more than you think you need. You will never regret having an extra photo of a nameplate. You will absolutely regret not having one when a reviewer asks for it.

The goal isn't perfection. It's a workfile that tells a complete story — one that holds up whether someone asks about it next week or next decade.


Jared Lukes is the founder of Appraisal Dream, building AI tools to help M&E appraisers work faster without cutting corners.
