One Public Servant’s Guide to the AI Frontier
I didn’t set out to become an AI explorer. I supervise jury operations. I coordinate jurors, manage panel calendars, and help the public navigate their jury service. My world runs on logistics, policy, and people. Not algorithms. Not model weights.
Then I got an email.
Cal Poly. AWS. AI Summer Camp. I had applied on instinct after seeing a flyer that said "no AI experience required." That felt like a door quietly opening. A few weeks later, I learned I was one of 100 students selected from a pool of more than 1,300 applicants.
That email didn’t turn me into someone new. It reminded me I didn’t need to be. I could learn this in my own way, on my own terms, from the seat I already occupy. Right inside the public systems people rely on every day.
At camp, I worked on a real-world AI solution to improve accessibility for students with ADHD, autism, and other learning challenges. I learned to parse PDFs with Python, deploy a chatbot using Claude, and help wire a full-stack MVP with teammates I had just met. I also built my first solo hackathon project, learned Git, and shipped a working prototype. All in five days.
Now I bring that same mindset back to court operations. I combine AI curiosity with public service experience to ask better questions and build what people actually need.
This site is not a polished playbook. It is a working record of experiments and ideas in motion. Notes from the field. A place to learn in the open and create space for others who never thought this world was meant for them.
Latest Project:
I just published "Become a Builder," a reflection on how I went from unsure to a shipped MVP in one week at AI Camp.
This quick walkthrough shows a prototype AI chatbot answering juror eligibility questions using real California legal references. I built it over a weekend with AWS, Python, VS Code, ChatGPT, and public data. It's a proof of concept for how AI can support public service and access to justice.
Read all about it in Proof Of Possibility
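Under the hood, the idea is simple: gather the relevant legal reference text, then ask Claude (via Amazon Bedrock) to answer using only that context. Here is a minimal sketch of that call, not the production code; the file name, region, and model ID are placeholders.

```python
# Minimal sketch: answer a juror eligibility question with Claude on Bedrock,
# grounded in a local file of California legal reference excerpts.
# File path, region, and model ID are placeholders, not my exact setup.
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

# Reference text gathered ahead of time (e.g., excerpts on juror eligibility).
with open("ca_juror_eligibility_refs.txt", encoding="utf-8") as f:
    references = f.read()

question = "Can I serve on a jury if I'm not a California resident?"

prompt = (
    "You are a helpful assistant for a county jury services office.\n"
    "Answer the question using ONLY the reference text below. "
    "If the answer is not in the references, say so.\n\n"
    f"References:\n{references}\n\nQuestion: {question}"
)

response = bedrock.invoke_model(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",
    body=json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 400,
        "messages": [{"role": "user", "content": prompt}],
    }),
)

answer = json.loads(response["body"].read())["content"][0]["text"]
print(answer)
```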
Lessons, patterns, and prompts from the AI trail
🧭 Navigating unfamiliar systems with familiar instincts.
🧠 Thinking like a public servant, not a product manager.
📎 Translating messy real-world workflows into clean prototypes.
🧩 Finding the edge where analog meets digital.
✍️ Writing to understand, not just to explain.
🧵 Following the thread when something feels off.
🗣️ Asking, “Who is this for?” before building anything.
📓 Visit the Blog for longer posts and project reflections.
ChatGPT – Core to my workflow for writing, idea development, and Python support
Amazon Bedrock – Used to access and run foundation models (Claude) in live AI prototypes
Claude via Bedrock – Integrated into my tutoring and jury chatbot projects
Google Colab – Python scripting, experimentation, and AI notebook development
GitHub – Project versioning, collaboration, and public documentation
VS Code – Main environment for editing scripts and debugging
Git Bash – Command-line tool for Git version control and repo setup
LangChain – Used for retrieval-augmented generation (RAG) architecture (a rough sketch follows this list)
FAISS – Vector similarity search library used for semantic search in AI prototypes
Boto3 – AWS SDK for Python (learning in progress for deeper integrations)
Streamlit – Used to create public-facing interfaces for AI prototypes
Power Automate – Automating public-sector workflows like jury ops and data routing
Excel – My go-to tool for real-world court and jury data analysis, cleanup, and tracking
AWS Skill Builder – My foundation for AWS, generative AI, and cloud tools
Google Sites – Where this portfolio lives and evolves
Lovable – Low-code builder used for UI-based tools and my first hackathon entry
DynamoDB – Backend database for student-tutor matching logic
Amazon S3 – Storage for documents, tutor strategies, and AI reference files
PyMuPDF – Used to parse and clean uploaded PDFs for model input
Matplotlib – Used for data storytelling (featured in my Google Data Analytics capstone)
Supernote Nomad – My offline thinking and digital journaling tool
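For the curious, here is the bare-bones shape of the RAG flow those tools add up to: PyMuPDF pulls text out of a PDF, LangChain splits and indexes it in FAISS, and semantic search surfaces the passages closest to a question. This is a sketch, not my exact project code; the file name, package layout, and embedding model are assumptions.

```python
# Bare-bones RAG indexing sketch: parse a PDF with PyMuPDF, chunk it,
# embed the chunks, and retrieve the most relevant passages with FAISS.
# The PDF name and embedding model ID are placeholders.
import fitz  # PyMuPDF
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_community.embeddings import BedrockEmbeddings
from langchain_community.vectorstores import FAISS

# 1. Extract raw text from a source document.
doc = fitz.open("jury_handbook.pdf")
raw_text = "\n".join(page.get_text() for page in doc)

# 2. Split the text into overlapping chunks so retrieval stays focused.
splitter = RecursiveCharacterTextSplitter(chunk_size=500, chunk_overlap=50)
chunks = splitter.split_text(raw_text)

# 3. Embed the chunks and build a FAISS index for semantic search.
embeddings = BedrockEmbeddings(model_id="amazon.titan-embed-text-v2:0")
index = FAISS.from_texts(chunks, embeddings)

# 4. Retrieve the passages closest in meaning to a user question.
question = "Who is eligible to serve on a jury in California?"
for chunk in index.similarity_search(question, k=3):
    print(chunk.page_content[:200], "...")
```

From there, the retrieved passages get handed to Claude the same way as in the juror example above.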
Where data storytelling meets Lakers basketball:
theforumfiles.substack.com
Uses real NBA data and statistical tools (Google Sheets, Colab, Matplotlib); a small example script follows this list.
Applies concepts from Google’s Data Analytics certificate and Calbright College coursework.
Focuses on visual storytelling, performance trends, and player dynamics.
Explores the emotional and narrative side of sports analytics.
Reflects a growing skillset in analysis, pattern recognition, and audience communication.
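To give a feel for it, here is the kind of small Matplotlib script behind those trend pieces; the CSV path and column names are placeholders, not real Lakers data.

```python
# Illustrative only: a rolling scoring-trend chart of the sort I build for
# The Forum Files. The CSV export and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

games = pd.read_csv("lakers_game_log.csv")               # hypothetical game log
games["pts_rolling"] = games["pts"].rolling(10).mean()   # 10-game scoring trend

plt.plot(games["game_number"], games["pts"], alpha=0.3, label="Points per game")
plt.plot(games["game_number"], games["pts_rolling"], label="10-game rolling average")
plt.xlabel("Game number")
plt.ylabel("Points")
plt.title("Scoring trend across the season")
plt.legend()
plt.show()
```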
This site is a living notebook. Expect drafts, questions, and the occasional detour.