Hi, I’m CK Zhang

I’m a high school cognition researcher turning prototypes into stubbornly practical workflows for clinics and classrooms.

Portrait of CK Zhang
2,095 Webcam sessions in the EyeCI dataset
60 Field recordings gathered onsite
3 Open projects I maintain

About

I build clinic-ready tools that stay personal.

I’m CK (Kevin) Zhang, a designer-engineer focused on cognition tech that has to work in crowded clinics and shared classrooms. I take scrappy prototypes, steady them through fieldwork, and hand them back as repeatable workflows.

EyeCI started because a volunteer trip in Wencheng County showed me how quickly care teams lose trust when screens expect the wrong behaviour.

Where “just look” came from

I joined a community health team running MMSE screenings and watched seemingly healthy neighbours screen positive for cognitive decline. It hit me how many clinics miss cases simply because screening is expensive and rare.

Over a weekend I hacked together a webcam tracker on a tablet, assuming touch would feel natural. The next day the line stalled—participants hovered, pressed too hard, and calibrations fell apart. I returned with one instruction: “just look.” Switching to gaze-only calibration relaxed the room and produced clean signals. By summer’s end we recorded about 60 sessions—proof that low-cost screening could work.

That moment now echoes through everything else: I open-sourced the EyeTrax toolkit, grew EyeCI to 2,095 webcam sessions, and partnered with clinics under Project Argus to keep that gaze-only flow alive.

The throughline: tools earn trust when they honour the people using them. Scroll on for the proof.

Highlights

Snapshot wins at a glance.

A quick look at the collaborations and artifacts that admissions officers and partners ask about first.

Clinic pilots

Project Argus is deploying EyeCI in outpatient clinics.

Co-leading a bilingual rollout with a healthcare analytics partner, complete with onboarding kits, reliability scripts, and release conference coverage.

See collaboration timeline →

Open research

EyeCI’s dataset now spans 2,095 webcam sessions.

Built on EyeTrax, the open-source tracker, and validated across 60+ field tests — the same lessons referenced in my origin story above.

Revisit the story →

Research origins

The Pioneer Academics EyePy paper sparked the journey.

My first research program taught me to frame affordability questions, leading directly to EyeTrax and later EyeCI.

View the Pioneer paper →

A timeline of learning by doing.

Each milestone marks a shift in how I understand people and systems — from that first clinic visit to the plugins I wrote to care for my own writing practice.

STEM club afternoons

Figuring things out at STEM club

After school I stayed for our LEGO engineering club. We tried small builds, showed quick demos to classmates, and I helped younger kids plug in sensors.

Most afternoons were spent tweaking LEGO rovers and simple Scratch games, then seeing who else wanted to try them.

Young CK at a LEGO engineering club table during an after-school meeting
Role: Student · Curious tinkerer
EyePy · Pioneer Academics

EyePy — First research sprint on affordable gaze tracking

During the Pioneer Academics program I built EyePy, a webcam eye-tracking prototype using dlib.

I tested low-cost webcams, logged calibration drift, and kept the pipeline reproducible.

The write-up captured the method and set up the follow-up questions for EyeTrax and EyeCI.

Preview of the Pioneer Academics EyePy research paper
Pioneer Academics EyePy paper · Aug 2024
Role: Student researcher · Pioneer Academics
EyeTrax

EyeTrax — Open-source webcam gaze toolkit

EyeTrax is the follow-up toolkit for classmates and researchers who need reproducible webcam gaze tracking without special hardware.

It combines MediaPipe landmarks with an Elastic Net calibration loop, live plots, and documentation that teams can follow.

Those pieces became the base that EyeCI now uses for screening work.
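If you're curious what that calibration loop looks like in miniature, here's a rough sketch: an Elastic Net regression mapping landmark-derived features to screen coordinates. This uses scikit-learn and synthetic data — the feature construction is illustrative, not EyeTrax's actual code or API.

```python
# Minimal sketch of an Elastic Net gaze calibration, in the spirit of
# EyeTrax. Feature extraction here is synthetic and illustrative only.
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)

# Pretend calibration data: 9 fixation targets on a normalised screen,
# 20 gaze samples per target (as in a nine-point calibration).
targets = np.array([(x, y) for y in (0.1, 0.5, 0.9) for x in (0.1, 0.5, 0.9)])
screen_xy = np.repeat(targets, 20, axis=0)                       # (180, 2)

# Stand-in "landmark features": a noisy nonlinear view of the true gaze.
landmarks = np.hstack([screen_xy, screen_xy ** 2])
landmarks += rng.normal(0, 0.01, landmarks.shape)

# One Elastic Net per screen axis, fit on the landmark features.
model = MultiOutputRegressor(ElasticNet(alpha=1e-3, l1_ratio=0.5))
model.fit(landmarks, screen_xy)

pred = model.predict(landmarks)
rmse = float(np.sqrt(((pred - screen_xy) ** 2).mean()))
print(f"calibration RMSE (normalised screen units): {rmse:.3f}")
```

The appeal of Elastic Net here is that the L1 term prunes uninformative landmark features while the L2 term keeps the fit stable on the small sample a nine-point calibration yields.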

Role: Maintainer · Open source
EyeCI

EyeCI — Webcam cognitive screening research

EyeCI turns the EyeTrax toolkit into a screening workflow. I designed tasks, ran sessions, cleaned data, and built the model pipeline.

We ran the VisMET and VPC-5 tasks with nine-point calibration for 2,095 participants, with scripts covering the data cleanup.

A lightweight model converts the trajectories into screening probabilities, with reliability checks and plain-language onboarding for clinics.
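To make the "trajectories in, probabilities out" step concrete, here's a hypothetical sketch: summarise each gaze trajectory into a couple of features, then fit a logistic model. The features, labels, and data are invented for illustration — this is not the EyeCI pipeline itself.

```python
# Hypothetical sketch of the screening step: gaze trajectories are
# summarised into features, then mapped to probabilities. The features
# and toy cohort below are invented, not EyeCI's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def trajectory_features(traj: np.ndarray) -> np.ndarray:
    """Summary stats for one (n_samples, 2) gaze trajectory."""
    steps = np.diff(traj, axis=0)
    path_len = np.linalg.norm(steps, axis=1).sum()   # total scan path
    dispersion = traj.std(axis=0).mean()             # how spread out gaze is
    return np.array([path_len, dispersion])

# Toy cohort: group-1 trajectories are tighter than group-0's.
labels = (rng.random(200) < 0.5).astype(int)
trajs = [rng.normal(0, 0.20 if y == 0 else 0.05, size=(300, 2)) for y in labels]
X = np.stack([trajectory_features(t) for t in trajs])

clf = LogisticRegression().fit(X, labels)
proba = clf.predict_proba(X)[:, 1]   # one screening probability per person
print(f"mean probability in group 1: {proba[labels == 1].mean():.2f}")
```

Reporting a probability rather than a hard label is what lets the clinic-facing reports carry confidence notes instead of a bare yes/no.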

Role: Solo creator · Research to product
Project Argus

Project Argus — Clinic collaboration founded on EyeCI

Project Argus takes the EyeCI workflow into outpatient clinics. I’m working with a healthcare partner on product fit, documentation, and hardware-light deployment.

We put collaboration agreements in place and rewrote the calibration flow, onboarding scripts, and caregiver reports for the pilots.

I’m keeping the open dataset available while adding the privacy integrations partners ask for.

Role: Founder · Research translation
Latest
Reddix

Reddix — A lightweight Reddit client that lives in the terminal

I wanted Reddit in the same terminal pane as my tools, so I built Reddix.

It handles login, sorting, and saved threads with Rust async pipelines for paging and caching.

The launch picked up 283 GitHub stars in 24 hours, and the repo now sits at 500 stars.

Role: Solo builder · Rust experiments

Build journals — projects with room to grow.

Case studies are in the works. Until then, here’s how I’m framing the work and where I’d love feedback.

Reddix

Terminal Reddit client I use for quick navigation, offline reading lists, and sharing threads without leaving Kitty.

It keeps feed and detail panes, saved collections, and quick search filters in the terminal so research threads stay organised.

Rust async handles pagination, caching, and flaky connections, and the launch grabbed 283 GitHub stars in a day — the repo now sits at 500.

Project Argus

Project Argus is the clinic version of EyeCI that I’m building with a healthcare partner. It adds reliability checks, guided setup, and reports caregivers can read.

Current work is user testing with nurses and community health volunteers to make sure the flow holds up in clinics.

I’m packaging screening results with confidence notes, next steps, and bilingual documentation so reports make sense quickly.

Let’s build the next chapter.

I’m assembling college applications and open to opportunities that combine research, design, and community impact. If you’re exploring similar questions — or need a collaborator — I’d love to hear from you.