Tesla FSD: Human-AI Collaboration in Decision Making

Overview

This exploration is based on my everyday experience with Tesla Full Self-Driving, but rather than evaluating the system itself, it looks at how decision-making shifts when control is shared between a person and an AI system. Through moments of hesitation, intervention, and familiarity, it reflects on how judgment forms through actual use rather than just assumptions. Ultimately, it is less about what the system can automate and more about how people stay involved in the decisions being made.

Why I Finally Tried It

I have owned a Tesla for more than two years, but I only started using Full Self-Driving about two months ago. Before that, people had recommended it to me many times, and my response was always no. It just did not feel safe to me. I also thought it was too expensive, because I assumed you had to buy it outright and there was no monthly option. Even though I work in the tech industry, I was very resistant to it.


A close friend who recently bought a Tesla kept telling me I should try it. At some point, I caught myself thinking that it was strange: I work in tech, yet I was refusing to try something without really knowing how it worked. That was when I decided to give it a try and be more open-minded. After I started using FSD, I found out that another close friend’s uncle had been using it since it first came out in the US and had not had any major problems. That surprised me. It made me realize how slow I had been to respond to the technology, and how much of my hesitation was driven by perception rather than experience.

How It Feels to Use It

Using Full Self-Driving does not make me feel clearly like a driver or a passenger. That feeling shifts depending on where I am and how familiar I am with the road. When I drive somewhere unfamiliar, especially in areas where I do not know the traffic patterns well, I feel more like a passenger. On familiar routes, like going back and forth between home and work, I still feel like the driver.

During the first week of using FSD, I took over a lot. That phase felt less about trust and more about learning how far the system could actually go. Sometimes it would signal a lane change before I even thought about changing lanes. Those moments made me aware that it was constantly observing its surroundings, even when its timing did not match mine.

Learning Where to Step In

At the beginning, I took over very often. Most of the time, it was not because something had gone wrong, but because I was still figuring out how much I should let it do on its own. During that first phase, taking over felt more like part of the learning process. I would let the system try in different situations, intervene when I felt uneasy, and gradually learn when I actually needed to step in and when I did not. As I kept using FSD, I started to recognize patterns. I noticed that what mattered more was not whether the system made the right decision, but whether I understood what it was likely to do next. Over time, I began taking over less frequently, not because I trusted it completely, but because its behavior became familiar.

Before I started using FSD myself, I used to notice some Teslas hesitating or shaking slightly in the middle of the road. It always looked confusing to me. Now I realize those moments are probably just people like me, negotiating control with the system. I do that almost every day.

When the System Asks for Feedback

Every time I intervene, the system asks if I want to submit an anonymous recording to report what happened. I do it very often. I understand that these recordings are being collected so the system can investigate specific situations and improve over time. Because I know that, submitting feedback feels almost automatic to me. Sometimes when I forget to do it, I even find myself hoping I will run into the same situation again so I can report it.


A close friend of mine also uses FSD, but she has never submitted a report. It is not that she does not know the feature exists. She simply has no reason to engage with it in that way, and it never crosses her mind to do so. Seeing that difference made me realize who actually ends up contributing feedback. The system is learning from a very specific group of users, and I happen to be one of them. Who chooses to respond, and who does not, shapes what gets reported and what never does.

The Moments I Take Over

I take over most often during lane changes on busy roads. These situations require very quick decisions, and I notice that FSD tends to wait until it is fully sure there is enough space. In heavy traffic, that hesitation makes lane changes harder to complete. What feels awkward is when the system signals first but cannot follow through; the turn signal stays on, the gap never quite opens, and I end up cancelling it and taking over myself. There are also moments where it brakes later than I expect. Sometimes it hesitates at yellow lights at large intersections, and I step in to either accelerate through or brake earlier to avoid a last-second decision.


Some of my takeovers are about safety. Others are simply about driving style. There are also moments that feel more human than technical. Sometimes I want to yield to another car that is clearly trying to change lanes. It is not because I have to, but because it feels like the right thing to do in that situation. The system does not do this; it waits, follows its own rules, and does not account for that kind of informal coordination. Those moments stand out to me because they are not about efficiency or correctness. They are about reading intent and responding to it.


Everyone drives a little differently. Some decisions feel very personal, like when to yield, how assertive to be, or what feels comfortable in tight situations. Right now, the system only offers three modes: chill, standard, and hurry. I have tried all three, but I cannot really tell the difference between them. Because of that, some takeovers feel unnecessary. It is not that the system is doing something unsafe. It is just not driving the way I would.

What This Changed About How I Think

Using FSD made me realize how far these systems can already go, especially in environments that are still shaped by human behavior. In unfamiliar situations, I rely on it more, while in familiar ones, I stay more involved. There are moments when I am not sure what the right decision is, and in those cases, I sometimes fall back on what I believe is the system’s baseline, which is avoiding collisions. Leaving control to the system in those moments felt less like trust, and more like choosing not to overreact.


I also started thinking more about who adapts more in this relationship. People with strong opinions about how they drive may struggle to adjust, while others may become more dependent on it. It makes me wonder about the long-term impact on our own skills and judgment as we continue to share more of our daily decisions with AI.

Final Reflections

What stands out to me is not just how advanced the technology is, but how much openness it asks from the people using it. I avoided FSD for a long time without fully understanding why. Using it changed how I think about trust, responsibility, and what it means to make decisions alongside an intelligent system. Not everyone will have the opportunity or willingness to experience that shift, and that gap matters; it shapes who participates, who gives feedback, and whose habits and expectations are reflected as these systems evolve.


Working alongside AI systems has shifted my focus away from automation itself. It has made me more aware of how responsibility never fully disappears, and how human judgment, emotion, and interpretation remain part of the system. I now think of driving with FSD more as a collaboration. The system and I each take on different roles, stepping in where we are better at certain decisions. It feels less like handing control over, and more like working alongside each other.

serenakuo@hotmail.com

Vancouver, Canada

2026 Serena Kuo · Designed & built in Framer
