Most requests for proposals (RFPs) for electronic monitoring and GPS ankle monitor programs still wave at “NIJ compliance” the way older solicitations once waved at “best commercial practices”—as a phrase that signals seriousness without pinning down measurable acceptance criteria. Having sat on both sides of the table—as a former corrections procurement reviewer and as a technology advisor to agencies modernizing supervision stacks—I have watched too many awards go to whoever had the glossiest brochure, not whoever could produce a test report that matched the language in the solicitation.

This article is different. It is a practical procurement checklist derived from the testable performance themes in NIJ Standard 1004.00 for offender tracking systems (OTS). It is not a substitute for reading the official standard, which remains the authoritative document; it is a working document you can paste into evaluation spreadsheets, include as an RFP attachment, and use in oral presentations when vendors hand-wave about “meeting NIJ.” For program context on certification mechanics, start with our NIJ Standard 1004.00 certification guide; for architecture tradeoffs, see one-piece vs. multi-piece GPS ankle monitors under NIJ.

Agencies should consider that future procurements shall meet, or exceed, the most recent version of this standard.

— NIJ Standard 1004.00, Foreword

According to the National Institute of Justice (NIJ), Standard 1004.00 establishes minimum criminal justice performance requirements for OTS used in community supervision. When your evaluation committee treats those requirements as line items—not slogans—you reduce the risk of buying hardware that fails in the first heat wave, floods your command center with ambiguous tamper traffic, or cannot produce court-ready exports when defense counsel requests discovery.

Understanding “NIJ compliant” vs. “NIJ certified”

Procurement counsel should insist on precise language. In practice, vendors use these phrases interchangeably; they are not equivalent.

“NIJ compliant” usually means the manufacturer self-declares that its design meets the standard’s requirements. Self-attestation has a place in early market scans, but it is not forensically equivalent to independent verification. Ask what evidence supports each claim—test method, lab accreditation, date of test, firmware revision, and sample size.

“NIJ certified” (or certification-aligned language) implies that an independent third-party testing laboratory has evaluated the product against the standard’s methods and that the outcome is documented in a formal report. The companion document NIJ CR-1004.00 defines certification program requirements; buyers should request the NIJ certification number or the full lab report package, not a one-page marketing summary.

If a vendor refuses to share lab documentation under a nondisclosure agreement, negotiate read-only access for your technical evaluator or an independent expert. If they still refuse, score that requirement as “not demonstrated.” Our certification guide walks through how certification fits alongside agency pilot testing.

The complete procurement checklist (by performance category)

Use the lists below as pass/fail or scored criteria. Map each item to the vendor’s test report section; where the standard allows passive vs. active reporting modes, capture which mode your deployment will use before you lock response-time requirements.
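One way to keep the mapping auditable is to hold the evaluation sheet as structured data, with each requirement tied to a test-report section and "not demonstrated" scored as zero. A minimal sketch in Python; the requirement IDs, weights, report-section labels, and scores below are entirely hypothetical:

```python
# Sketch of a scored evaluation sheet: each row maps one checklist item to
# the vendor's test-report section and an evaluator score. Weights, IDs,
# and scores are illustrative, not drawn from the standard itself.

def weighted_score(rows):
    """Return the weighted percentage score across scored requirements.

    Each row: (requirement_id, report_section, weight, score_0_to_1).
    A score of None means 'not demonstrated' and counts as zero.
    """
    total_weight = sum(w for _, _, w, _ in rows)
    earned = sum(w * (s if s is not None else 0.0) for _, _, w, s in rows)
    return 100.0 * earned / total_weight

rows = [
    ("B-1 cold-start fix <= 2 min", "lab report sec. 4.2", 3, 1.0),
    ("B-2 outdoor accuracy <= 10 m @ 90%", "lab report sec. 4.3", 3, 0.5),
    ("D-1 CSV evidence export", "live demo", 2, None),  # not demonstrated
]
print(f"{weighted_score(rows):.2f}")  # prints 56.25
```

The useful discipline is not the arithmetic but the row shape: every score must point back to a specific report section or demo artifact, which is exactly the evidence trail the sections below ask vendors to supply.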

A. Safety (five requirements)

  • Radio regulatory compliance: Request the FCC certification grant (or equivalent for your jurisdiction) for intentional radiators and any modular approvals tied to the exact SKU you will receive.
  • Product safety: Require evidence of UL 60950-1 (or successor harmonized safety standards where applicable) for information technology equipment, including any charging accessories bundled in the award.
  • Battery safety: Battery packs should be tested to IEC 62133, UL 1642, or UL 2054 as appropriate to cell and pack construction—especially important for devices charged daily on kitchen counters and patrol desks.
  • Sharp edges: Confirm the UL 1439 sharp edge evaluation passed for wearable surfaces that contact skin during installation, wear, and removal.
  • Emergency removal: Demonstrate emergency removal within one minute using medical scissors without proprietary tools—your medical and field-safety reviewers care about this even when IT does not.

These items rarely appear in flashy demos, yet they surface in liability reviews and county risk pools. Treat them as gate criteria, not “appendix material.”

B. Technical operation (twelve requirements)

  • GPS cold-start acquisition: ≤ 2 minutes to first fix under the standard’s defined cold-start condition.
  • Outdoor positioning accuracy: ≤ 10 m at 90% confidence in outdoor test geometry—see our deep dive on GPS accuracy standards (10 m / 30 m) for how agencies should interpret confidence intervals in real cities.
  • Indoor positioning accuracy: ≤ 30 m at 90% confidence where indoor scenarios apply; align expectations with auxiliary positioning (Wi-Fi, LBS) if used.
  • Movement detection: Pass the movement-detection scenario (e.g., 75 m displacement as specified) with alerts generated per the standard’s timing rules.
  • On-demand location: For active reporting configurations, on-demand fixes within 3 minutes.
  • Low-battery alerting: Local participant alert and data-center alert within 3 minutes of the defined low-battery condition.
  • Zone violations: Alerts within 4 minutes for active modes or 15 minutes for passive modes, per the standard’s split—details in our zone violation alerts and response time article.
  • Data retention: Store at least 10 days of data at one point per minute, recoverable after battery exhaustion (non-volatile behavior as specified).
  • Battery endurance profile: Demonstrate capture of 95% of 1,440 daily points across defined three-temperature operating conditions.
  • Cycle life: Battery qualified for 365 charge/discharge cycles while maintaining required runtime—see battery performance and 365-cycle testing.
  • Charging indication: Clear indication of charging and charge complete states for both participant and supervisor workflows.
  • Participant alerts: Audible alert ≥ 40 dBA at 1 m or vibratory alert ≥ 5 mm/s peak particle velocity, as specified.
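Several of the numeric items above (data retention at one point per minute, 95% daily capture) reduce to arithmetic an evaluator can run against a vendor's raw log counts. A minimal sketch, using integer math so the comparison is exact at the 95% boundary; the log counts are hypothetical:

```python
# Sketch: check daily capture and retention counts against the checklist's
# numeric floors. One point per minute -> 1,440 points/day; the endurance
# profile requires capturing at least 95% of them (1,368); retention requires
# at least 10 days (14,400 points) recoverable after battery exhaustion.

POINTS_PER_DAY = 24 * 60   # 1,440 fixes at one per minute
RETENTION_DAYS = 10

def daily_capture_ok(points_captured):
    # Cross-multiplied integer comparison (captured/1440 >= 95/100) to
    # avoid floating-point rounding exactly at the 95% boundary.
    return points_captured * 100 >= 95 * POINTS_PER_DAY

def retention_ok(points_stored):
    return points_stored >= RETENTION_DAYS * POINTS_PER_DAY

print(daily_capture_ok(1368), daily_capture_ok(1367))  # True False
print(retention_ok(14400), retention_ok(14399))        # True False
```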

C. Circumvention resistance (core plus multi-piece extensions)

  • Strap cutting: Detection within 5 seconds; supervisory alert path within 3 minutes—cross-check methodology with anti-tamper: strap cutting and stretching.
  • Strap stretching: Resistance to 245 N with alerting on excess stretch per the standard’s criteria.
  • Loss of location: Detection and alerting when valid fixes are not available under defined conditions.
  • Communications loss: Detection and alerting after ≥ 1 hour of loss, per the standard’s framework.
  • Multi-piece systems — wireless link: Encrypted wireless using FIPS 197 and FIPS 140-2 validated modules where applicable to the architecture.
  • Multi-piece systems — separation: Local separation alert within 5 minutes; agency notification within 4 minutes of the local alert.
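The circumvention timings above are stated in two different units on purpose: detection in seconds, alert paths in minutes. A small sketch of how an evaluator might replay a tamper log against the strap-cut thresholds; the timestamps and log shape are hypothetical:

```python
# Sketch: evaluate one strap-cut event from a tamper log against the
# checklist thresholds: detection within 5 seconds of the cut, and the
# supervisory alert path completing within 3 minutes (180 s) of detection.

CUT_DETECT_MAX_S = 5
ALERT_PATH_MAX_S = 180

def strap_cut_timings_ok(cut_t, detect_t, alert_t):
    """Timestamps in seconds; return (detection_ok, alert_ok)."""
    return (detect_t - cut_t <= CUT_DETECT_MAX_S,
            alert_t - detect_t <= ALERT_PATH_MAX_S)

print(strap_cut_timings_ok(0, 4, 120))  # (True, True): 4 s detect, 116 s path
print(strap_cut_timings_ok(0, 9, 120))  # (False, True): detection too slow
```

Separating the two checks matters in scoring: a vendor can pass the alert-path bound while failing detection, and collapsing them into one "tamper alert within X minutes" requirement hides exactly the distinction the standard draws.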

D. Software (eight requirements)

  • Evidence exports: CSV export including location histories, alerts, offender ID, and agency metadata sufficient for discovery and audit.
  • Collection rate: Configurable reporting interval down to at least one point per minute.
  • Upload cadence: Active upload at least every 15 minutes under defined operating modes.
  • Geofence geometry: Support circular, rectangular, and free-form polygons with 40+ nodes where free-form is offered.
  • Zone templates: Ability to maintain 50+ zones per template for complex schedules (school, work, treatment).
  • Audit trail: Immutable or append-only audit trail for configuration and case changes with employee identification.
  • Access control: Role-based access control aligned to least-privilege supervision workflows.
  • Authentication: Secure login meeting Appendix A criteria (password policy, lockout, session controls as specified).

Software requirements are where “NIJ” RFPs often fall apart: the hardware team scores the bracelet while the SaaS portal ships without CSV exports or without per-officer audit trails. Evaluate the hardware and the system of record as a single bid. Our software, security, and encryption under NIJ article expands on encryption documentation.
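Export requirements are also mechanically verifiable before award. A minimal sketch that sanity-checks a sample CSV export for required fields and a one-point-per-minute collection interval; the column names and sample rows are hypothetical, not a schema the standard mandates:

```python
# Sketch: validate a vendor's sample CSV evidence export. Two checks:
# (1) required columns are present; (2) consecutive fixes are no more
# than 60 seconds apart (one point per minute).
import csv
import io
from datetime import datetime

REQUIRED_COLUMNS = {"timestamp", "latitude", "longitude",
                    "offender_id", "agency_id", "alert_type"}

def validate_export(csv_text):
    """Return (missing_columns, interval_ok) for a CSV export string."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        return missing, False
    times = [datetime.fromisoformat(row["timestamp"]) for row in reader]
    interval_ok = all((b - a).total_seconds() <= 60
                      for a, b in zip(times, times[1:]))
    return missing, interval_ok

sample = """timestamp,latitude,longitude,offender_id,agency_id,alert_type
2024-05-01T10:00:00,40.0,-105.0,X1,A9,none
2024-05-01T10:01:00,40.0,-105.0,X1,A9,none
"""
print(validate_export(sample))  # (set(), True)
```

Asking each bidder to run their sample export through a script like this during orals takes minutes and forces the timestamp and time-zone handling conversation early.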

E. Robustness (nine environmental and mechanical themes)

  • Temperature: Operation/storage extremes across the standard’s −20 °C to +50 °C framework (exact clauses per official test methods).
  • Condensing humidity: No moisture ingress; functional after exposure.
  • Water spray: No insulation breakdown; functional after exposure and after the standard’s one-week latency window.
  • Immersion: No insulation breakdown; functional after exposure and after the one-week latency window.
  • Impact shock: No casing breach under defined impact protocol.
  • Dynamic shock: Remains functional after defined drop/shock exposure.
  • Sinusoidal vibration: Functional after MIL-STD-810G-class sinusoidal profile.
  • Random vibration: Functional after random vibration profile.
  • EMC: Functional during and after IEC 61000-4 family immunity exposures as cited in the standard.

Environmental claims should be read alongside ingress ratings and logistics reality; our environmental durability testing article explains spray, immersion, and EMC in procurement language.

F. Optional advanced features (require explicit vendor declaration)

NIJ 1004.00 includes optional capabilities that may matter intensely for high-risk caseloads. If you need them, write them as scored requirements—not footnotes.

  • Metallic shielding detection: Detection within 5 minutes with no false alerts under the standard’s test framing.
  • Cellular jamming detection: The same 5-minute detection expectation, with the same no-false-alert discipline.
  • GPS jamming detection: Same pattern for GNSS interference scenarios.

Operational nuance and test limits are discussed in GPS and cellular jamming detection (optional NIJ features).

Questions to ask every vendor (and what a good answer sounds like)

  • “Has your device been tested by an independent lab to NIJ 1004.00?” Good answer: yes, with lab name, accreditation, report date, firmware version, and certification identifier if applicable. Weak answer: “we follow NIJ guidelines internally.”
  • “What is your outdoor GPS accuracy in real-world deployments?” Good answer: distribution plots or third-party validation, not just open-sky CEP. Tie answers to the 90% confidence framing your RFP uses.
  • “What is your false tamper alert rate in the field?” Good answer: definition of “false,” numerator/denominator, cohort size, and mitigation firmware roadmap.
  • “How many charge cycles does your battery support before replacement?” Good answer: data supporting 365-cycle qualification or honest alternative with service interval math.
  • “Do you support FIPS-validated encryption for all data in transit?” Good answer: module certificates, cipher suites, and where endpoints terminate TLS.
  • “Can we export all historical data as CSV for court evidence?” Good answer: sample export, chain-of-custody features, and timestamp/time-zone handling.

Red flags in vendor responses

  • Claiming “NIJ compliant” without providing test reports or mapping claims to specific clauses.
  • Quoting accuracy only in ideal open-sky conditions when your county includes urban canyons and dense multifamily housing.
  • Inability to state strap-cut detection times in seconds and alert times in minutes.
  • No FIPS documentation for cryptographic modules where the standard requires validated cryptography in multi-piece links.
  • No evidence of 365-cycle battery qualification when your solicitation assumes a multi-year cost model.

Beyond NIJ: additional procurement considerations

NIJ 1004.00 is a criminal-justice performance floor, not a complete international type-approval package. Most agencies still layer additional evidence:

  • EU CE marking and RED for radio equipment placed on the European market—often bundled with structured EMC and RF exposure documentation.
  • Ingress protection such as IP68 for continuous wear through showers and weather exposure (map IP claims to IEC 60529 depth/duration tables).
  • Multi-constellation GNSS (GPS, Galileo, GLONASS, BeiDou) for urban availability—not always spelled out in older RFPs but operationally decisive.
  • Cellular IoT such as LTE-M and NB-IoT for power-balanced reporting in markets phasing legacy 2G/3G service.
  • Cybersecurity type evaluation such as EN 18031-series expectations for connected radio equipment in European regulatory contexts—complementary to Appendix A login controls.
  • Architecture choice: One-piece integrated anklets vs. multi-piece hub/bracelet designs trade battery volume, tamper complexity, and FIPS scope; revisit one-piece vs. multi-piece before you finalize weighting in your scoring model.
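Ingress claims in particular benefit from being decoded rather than taken at face value. A small sketch that expands an IP code per the IEC 60529 digit scheme; the descriptions are abbreviated summaries for evaluator reference, not the standard's full test text:

```python
# Sketch: decode an IP (ingress protection) rating per the IEC 60529
# two-digit scheme, so "IP68" on a spec sheet maps to concrete levels.

SOLID = {"5": "dust-protected", "6": "dust-tight"}
LIQUID = {
    "4": "splashing water",
    "5": "water jets",
    "6": "powerful water jets",
    "7": "temporary immersion (up to 1 m, 30 min)",
    "8": "continuous immersion (depth/duration per manufacturer)",
}

def decode_ip(code):
    """Return (solid, liquid) protection descriptions for a code like 'IP68'."""
    digits = code.upper().removeprefix("IP")
    return (SOLID.get(digits[0], "see IEC 60529"),
            LIQUID.get(digits[1], "see IEC 60529"))

print(decode_ip("IP68"))
```

Note the second "8" digit: the depth and duration are manufacturer-declared, which is exactly why the checklist item above says to map IP claims to the IEC 60529 tables rather than accept the bare code.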

For equipment due diligence beyond the standard itself, commercial teams often pair this checklist with third-party buyer guides—see GPS ankle monitor buyer’s guide and, for one-piece hardware context, CO-EYE ONE product overview on our primary equipment site.

NIJ 1004.00 series on Ankle Monitor

This checklist is the capstone to a nine-part technical series. Use the internal references linked throughout this article when briefing evaluators.

Frequently asked questions

Can we require NIJ certification instead of self-declared compliance?

Yes, provided your solicitation language matches program authority and funding rules. Specify whether you require full certification, independent lab testing to the standard’s methods, or agency witness testing. Always cite the standard version date your legal team approves.

How should we score passive vs. active reporting modes?

Score against the mode you will operate in production. NIJ 1004.00 uses different time bounds for some alerts (for example, zone violations). Mixing modes without documentation yields unfair vendor comparisons.

What if a vendor meets NIJ but lacks CSV exports?

They do not meet your operational needs for discovery and program audit. Treat export requirements as go/no-go if your counsel mandates flat-file production.

Are optional jamming features mandatory for every program?

No—they are optional under the standard. High-risk caseloads may still mandate them by policy; cite our jamming feature article when drafting justification memos.

How does this checklist relate to state contract vehicles?

State master agreements often pre-qualify vendors. Use this checklist as a gap analysis against the master contract’s technical attachment; do not assume the statewide award’s technical clauses were refreshed when NIJ updated the standard.