OpsIntelligence
From reactive firefighting to proactive decision-making.
Helping energy-intensive teams anticipate risks, act faster, and decide with confidence.
AI/ML
Operations
B2B

VISION
CONTEXT
Energy-intensive operations face constant volatility in market prices, energy inputs, and equipment health. Information is scattered across tools, and status tracking depends on manual follow-ups.
Result: delayed response, blind spots in situational awareness, and missed opportunities to act.
Modeled Outcomes + Impact
Based on workflow simulation and heuristic task-timing analysis.
Approach
Mapped the operational request journey to reveal signal gaps and handoff pain points.
Defined Jobs-to-be-Done: detect early, assign fast, stay informed.
Identified delays, redundant loops, and user frustrations across roles.
Mapped AI Opportunities
Pinpointed where prediction, explainability, and auto-drafting could drive measurable value.
PERSONAS
Understanding Key Users and Contexts
Synthesized insights from cross-role operational observations.

Energy Trader
Office-based. Monitors prices and submits operational requests.

Tech Ops
Validates requests against live telemetry and equipment health.

Site Ops
Executes approved requests under variable on-site conditions.
Mapping the Operational Request Lifecycle
Mapped the request journey to reveal delays, validation loops, and blind spots.
Energy Traders
Raise requests from partial signals.
Tech Ops
Manually re-validate against telemetry.
Site Ops
Lack unified, real-time context during execution.
This revealed where prediction, explainability, and auto-drafting could reduce friction and deliver measurable value.

The journey revealed signal gaps, manual revalidation loops, and fragmented execution clarity—showing where AI could close the loop with prediction, explainability, and automation.
Designing AI’s Role in the Workflow
Designed a flow that mirrors real-world handoffs to integrate AI meaningfully:
Detection
Detect anomalies using telemetry and historical patterns.
Prediction
Surface predictions with confidence scores and ‘Why this?’ reasoning.
Work Order Draft
Auto-draft work order with context prefilled for review.
Execution
Human reviews, edits, and approves (accountability preserved).
Feedback
Feedback (dismiss/accept) trains future drafts.
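The five steps above can be sketched as a minimal pipeline. This is an illustrative sketch only: the `Reading` and `Prediction` types, the baseline lookup, and the 1.5 deviation threshold are all assumptions for the example, not the production system.

```python
from dataclasses import dataclass

# Illustrative telemetry reading; fields are assumptions for this sketch.
@dataclass
class Reading:
    site: str
    metric: str
    value: float

@dataclass
class Prediction:
    site: str
    reason: str        # "Why this?" explanation shown to the user
    confidence: float  # 0.0-1.0, surfaced inline for prioritization

def detect(readings, baseline, threshold=1.5):
    """Detection: flag readings that deviate from the historical baseline."""
    preds = []
    for r in readings:
        expected = baseline[(r.site, r.metric)]
        deviation = abs(r.value - expected) / expected
        if deviation > threshold:
            # Prediction: attach a confidence score and plain-language reason.
            preds.append(Prediction(
                site=r.site,
                reason=f"{r.metric} deviates {deviation:.0%} from baseline",
                confidence=min(0.99, deviation / (deviation + 1)),
            ))
    return preds

def draft_work_order(pred):
    """Work Order Draft: auto-draft with context prefilled; status stays
    'draft' until a human reviews, edits, and approves (Execution)."""
    return {"site": pred.site, "summary": pred.reason,
            "confidence": pred.confidence, "status": "draft"}
```

The human stays in the loop: nothing leaves `draft` status without explicit approval, and the accept/dismiss decision feeds the Feedback step.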
Operational Frictions and System Gaps
Anomalies surfaced late, often after impact.
Managing multiple mining sites requires monitoring vast amounts of telemetry data.
AI ↔ Human Interaction Loop
Mapping how AI predicts, explains, and learns from user input.

Guiding Principles
Proactive by Design
Surface risks early. AI leverages telemetry and market signals to suggest the first action — cutting response time.
Work Order Draft
Approve & Create
User Control First
Dismiss Prediction

Help improve future predictions (optional)
Select Reason
Save
Trust through Transparency
Show the “why” behind predictions through visible confidence levels, telemetry, and reasoning.
View AI Analysis
Key Design Features
Core design features that make AI-powered operations more anticipatory, explainable, and human-centered.
1 Predictive Risk Detection
• Surface anomalies early with sparkle-badged site chips and list views.
• Confidence and severity indicators appear inline for quick prioritization.

Pre-filled requests cut manual drafting effort by ≈60%.
2 Explainable Reasoning
• Inline AI analysis explains why an issue was flagged — with clear telemetry references.
• Tooltips and expandable views balance quick scanning with deeper exploration.
Telemetry context makes AI reasoning instantly verifiable.
3 User Agency: Dismiss + Feedback
Captured reasons refine future predictions and reduce false positives.
Dismissed predictions can be restored, reinforcing transparency and user control.
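One way this loop can work is a simple per-type weighting: dismissals down-weight the confidence of similar future predictions, acceptances restore it. The `FeedbackStore` class and its 0.5 floor are assumptions for this sketch, not the actual model.

```python
from collections import defaultdict

class FeedbackStore:
    """Tracks accept/dismiss outcomes per anomaly type and down-weights
    confidence for types users repeatedly dismiss (illustrative only)."""

    def __init__(self):
        self.counts = defaultdict(lambda: {"accepted": 0, "dismissed": 0})

    def record(self, anomaly_type, accepted, reason=None):
        # The optional dismissal reason is what the UI's
        # "Help improve future predictions" step captures.
        key = "accepted" if accepted else "dismissed"
        self.counts[anomaly_type][key] += 1

    def adjust(self, anomaly_type, confidence):
        c = self.counts[anomaly_type]
        total = c["accepted"] + c["dismissed"]
        if total == 0:
            return confidence  # no feedback yet: show raw confidence
        # Scale by observed acceptance rate, never fully suppressing,
        # so dismissed prediction types remain restorable.
        acceptance = c["accepted"] / total
        return confidence * (0.5 + 0.5 * acceptance)
```

Keeping a floor on the adjusted score mirrors the restore affordance: a dismissed prediction type is de-emphasized, never silently deleted.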
4 Proactive Solution: AI-Drafted Work Orders
AI pre-drafts work orders using telemetry and past incident data.
Users edit, approve, or reject — maintaining speed and accountability.




5 Insights & Trends
Aggregates recurring anomalies to anticipate future risks.
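At its simplest, this aggregation is counting recurring (site, anomaly type) pairs over a window. A minimal sketch, assuming a flat event list rather than the shipped analytics:

```python
from collections import Counter

def recurring_anomalies(events, min_count=3):
    """Return (site, anomaly type) pairs seen at least min_count times,
    ordered by frequency: candidates for proactive maintenance."""
    counts = Counter((e["site"], e["type"]) for e in events)
    return [(pair, n) for pair, n in counts.most_common() if n >= min_count]
```

Pairs that clear the threshold repeatedly are exactly the "recurring anomalies" the trends view surfaces before they become incidents.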
Outcomes & Value
Predictive Risk Detection
↓ ≈ 60% drafting effort
Explainable Reasoning
↑ ≈ 35% user trust
Dismiss + Feedback Loop
Proactive Work Orders
↓ ≈ 40% response time
LEARNINGS FROM THE PROJECT
This project sharpened my ability to identify where AI truly adds value in enterprise workflows: not as a feature bolted on, but as a capability embedded in the process. The exercise balanced system-level design, explainable AI, and human oversight, building trust in automation while respecting operational expertise.