
OPEGA briefing finds gaps in Department of Juvenile Services' community services, data and oversight

January 15, 2025 | Judicial Proceedings Committee, Maryland Senate


This article was created by AI to summarize key points discussed. AI makes mistakes, so for full details and context, please refer to the video of the full meeting.

OPEGA presented its evaluation of the Department of Juvenile Services to the Judicial Proceedings Committee on Jan. 15, 2025, reporting that DJS lacks consistent oversight of community-based providers and that its case management systems limit the agency’s ability to measure outcomes.

The OPEGA director, Mike Powell, told the committee that his office was asked by the Joint Audit and Evaluation Committee to examine DJS performance. “We don't work for DJS. We're not contractors or consultants. We work for you,” Powell said, describing OPEGA as “nonpartisan, objective, sort of fact-based staff.”

OPEGA's report covered long-term juvenile referral trends, community-based nonresidential services, DJS oversight of contractors, use of CINS (children in need of supervision) referrals and the case histories of 15 youths later indicted in a series of armed robberies and carjackings. The office identified several shortcomings the department can address quickly, along with longer-term gaps that will require new data systems or additional resources.

Most important findings

OPEGA highlighted four areas of concern:
- Oversight and performance data: DJS’s contractor oversight is largely fiscal — ensuring invoices match contracts — with limited validated performance monitoring to show whether providers improve youth outcomes.
- Noncontractual referrals and tracking: OPEGA found many community placements are to programs with no contractual relationship to DJS and limited DJS visibility into whether those sites delivered services or produced results.
- Case management limits: The department’s ASSIST database stores narrative case notes but does not reliably support queries that would let analysts measure provider referrals, wait times or outcomes at scale.
- Intake decision inconsistency: The intake decision tool (IDT) is not followed consistently; in a sample of cases the tool’s guidance was followed a little more than half the time, and local practices varied widely by county.

Why it matters

OPEGA emphasized that roughly 6,000 DJS-involved youths a year receive community-based nonresidential services and that timely connection to services matters. The office found that only about 30% of sampled second-degree-assault intakes had evidence of a referral to a DJS-contracted community-based provider, and that delays from DJS intake to provider intake often ran 30 to 40 days. OPEGA warned those waits can blunt the effectiveness of early interventions.

Selected figures cited in the presentation included:
- Recent monthly referrals of around 1,000, down from as many as 3,000 a month 10 years ago.
- Roughly half of youths receiving community-based services are informal cases and half are probation cases.
- A long-term shift toward judges imposing probation rather than commitment, roughly an 80/20 split among adjudicated youth in recent years.

OPEGA recommendations

Powell told the committee OPEGA recommends DJS enhance ASSIST or adopt a modern case management system; replicate local best practices statewide; collect and validate performance data for contractual providers; improve visibility into providers’ staffing and capacity; and expand evidence-based providers in areas of unmet need (OPEGA specifically noted no evidence-based providers were available in Baltimore City for a top-tier service type).

Powell said his office had attempted an outcome analysis — whether youth who receive community services fare better than similar peers who do not — but the available data were insufficient to produce a defensible published finding. “Ultimately, we didn't come up with an answer to that question that we felt confident that we could publish,” Powell said.

What the committee asked

Members pressed OPEGA on its staff interviews, asking whether employees felt free to speak candidly and whether site-level staff were consulted. Powell replied that staff were "very transparent" with data and candid in conversations. Senators also asked about recidivism measures, geographic differences in intake decisions and the practical implications of departures from the intake decision tool.

Takeaway

OPEGA delivered a fact-focused inventory of process gaps and evidence shortfalls that it said DJS and the legislature can address. The report urges faster, validated data collection, clearer performance measures for providers and targeted expansion of evidence-based programs where children lack access to proven services.

View full meeting

This article is based on a recent meeting—watch the full video and explore the complete transcript for deeper insights into the discussion.


Sponsors

Proudly supported by sponsors who keep Maryland articles free in 2025

Scribe from Workplace AI