Comma is my favorite “AI” company. Really incredible piece of tech in a tiny package, and it truly improves your life to have it.
I wish it worked with my Mitsubishi Outlander, but just having it on my Corolla is enough. Their supported brand list will definitely factor into my next car buying decision.
I like Comma. I like geohot (I wouldn't want to work for him; I imagine it is intense), but I like the contrary attitude, and also the product as demoed on YouTube.
I used to joke about how I used LinkedIn as a dating site, but in the current year this just isn't that funny anymore. The professional managerial class I was mocking is quickly losing its grasp on power, so it's unclear if I'm still punching up.
How many of the messages here even come from real people anymore? How many of those people have pronouns in their bio? Are they GPT/GPT?
We have a big task ahead of us to define what the "new business" looks like. Comparatively, the revolution is the easy part.
> Currently, openpilot performs the functions of Adaptive Cruise Control (ACC) and Automated Lane Centering (ALC). openpilot can accelerate, brake automatically for other vehicles, and steer to follow the road/lane. [1]
[Some of the] Cars that are currently supported already have "smart cruise" and "lane follow". Why then use a third-party self-driving system?
> My <device> already comes with built in <software> why would I install anything else?
Top voted comment on hacker news btw.
OK, that was probably unnecessarily snarky; I hope you don't take offense. But it seems the hacker spirit has been fading from this site. We used to replace stuff with inferior versions just to see if we could.
What? It's literally open source, you can ssh into the thing and change whatever you want. I am running a fork of a fork of the code right now. I change things all the time.
It's infinitely better than HDA2 at tracking and maintaining lanes.
HDA2 cuts out if there is a break in lines more than 50ft or so.
Openpilot can track the slightest of roads, even able to follow off-road the tracks in grass from a leading car.
It does basically everything HDA2 does and then some, and does it much better.
It has a driver-monitoring camera that you control, which watches for inattentiveness; that is much more effective than simple wheel-torque-based sensors.
Big one, because all those cars require you to touch and move the steering wheel every X seconds. All the ones that let you go hands free cost a subscription of around $500 a year (Ford BlueCruise, GM SuperCruise). And even those only let you use hands free mode on pre-mapped roads, typically only interstates.
Because most cars with lane follow still lose lock on the lane when the lines are hard to see (rain, snow, etc.) or missing due to exits and other things.
Because most cars with lane follow fail to hold the lane when the turn gets too sharp.
Comma.ai lets you go completely hands free with no wheel nags. It also works just fine when there are no lane lines or poorly visible lines. It also supports lane change by signaling, and then nudging the wheel when it's clear to move.
There is also an experimental mode which stops and goes at stop signs and stop lights.
If the driver monitoring camera in the comma detects you fell asleep or something, it will slow the car down and pull over. All the stock lane keep that I have used in cars, if you fail to nudge the wheel they just disengage and you keep going at full speed in a straight line...
Then we delve into OpenPilot forks like SunnyPilot that let you do things like decouple gas/brake control from steering control, so you can handle the gas/brake yourself and let comma just always steer for you. Comma can also steer more aggressively in turns than any lane keep I have seen, and when it can't, you will see the limit being reached on the little display, so you know you will need to help out on that tight curve.
Experimental mode isn't the best all the time, and SunnyPilot allows hybrid mode which uses regular mode and dynamically switches to experimental mode for stop signs and stop lights.
With SunnyPilot it can even read your car's blind spot monitors to automatically make the lane change when clear, without you having to nudge the wheel.
Some have been playing with concepts of auto navigation too where the car will take exits and turn through intersections for you.
The comma.ai devices have 10W of compute power and the current driving models only use 1W, so there is room to scale to better models on the current comma devices. There is also talk of supporting more cameras for side views, and of external GPU add-ons with 100W of compute for potential FSD-level models.
> With all the stock lane keep systems I have used in cars, if you fail to nudge the wheel they just disengage and you keep going at full speed in a straight line...
is this true for current EVs as well? My 2015 Tesla S brings the car to a controlled stop with hazard warnings on.
AP is definitely the exception and would safely stop you. But AP is also gone from new Teslas now, and the comma costs what it costs to subscribe to Tesla's driving features for just one year.
All EVs probably stop fairly quickly because they brake when you aren’t pressing the gas or cruising. But I don’t know any that keep steering for you when they disengage when you fail to maintain attention.
Commercial implementations back when this launched were vastly inferior to it, if users' accounts are to be believed. Obvious signs of too-high P in the PID and such.
Tesla Autopilot was always available, but it was as sketchy as it had always been: shoving the nose into road barriers and into the rear ends of fire trucks that looked less car-like, especially to pre-LLM image-recognition models.
OpenPilot also allows retrofits. People who own 2017-2023ish cars, shipped after the self-driving hype took off but before command signature enforcement was widely implemented, can DIY self-driving without re-buying the whole car, setting aside whether it's legal or whether you should.
It beeps at you if you stop paying attention, which is superior. Hands-on-wheel is an arbitrary design decision, more likely there to placate what a layman would think is necessary to ensure safe AI steering.
My car judges whether I have put in any manual inputs over the past 10 or so seconds, and if not, it starts complaining. Which seems reasonable, except there are plenty of nearly perfect straightaways where there's nothing for it or me to do.
It would be nice if, when it isn't doing anything, it didn't assume I'm not doing anything either.
Except those straight, boring roads that require no input are also exactly where and when I most want to use autopilot. This means I have to manually adjust to keep the car happy, instead of letting the well-aligned car just carry on. Autopilot ends up being more work, and more annoying, than just driving myself.
I wondered the same thing but after trying a few oem attempts, there’s definitely huge room for improvement. Lane following isn’t very ‘smart’ and doesn’t take context into account (ie. changing position in the lane based on clearance from other vehicles, potholes, upcoming curves etc.)
Lane follow? Does it have lane discovery? There was snow on my commute this morning. The 4-lane highway was basically follow the leader: pick some line where you think there is the most traction and stick with it. I have yet to see footage of an autodrive system in such a situation.
What's hilarious to me is that this scenario gets framed as something impossibly difficult for self-driving technology to accomplish. Detecting a car in front of you and maneuvering left or right is doable without even using advanced models, never mind the fact that we have them now. The other supposedly impossible feat is for the self-driving car to create a lane when there's been so much snow that the lines, and thus the lanes, aren't visible. Given high-quality sensor data, does it really seem impossibly hard for a computer that is already competently driving on real roads in SF, Phoenix, and LA to take the full width of the remaining road, divide it in two (assuming it's a bidirectional road), create as many lanes plus safety margin as can safely exist, and then pick one? Proof is in the pudding and all, so here's a 4-year-old video showing that comma.ai is capable of that, in sunny winter weather but on a snowy road.
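To make that arithmetic concrete, here is a toy sketch in Python (purely my own illustration with made-up lane and margin widths, not anything comma ships):

    # Toy illustration only: split an unmarked, snow-covered road into candidate
    # lanes, given an estimated drivable width from whatever perception you trust.
    def candidate_lane_centers(road_width_m: float,
                               lane_width_m: float = 3.0,
                               margin_m: float = 0.5) -> list[float]:
        half = road_width_m / 2.0   # stay on our side of an assumed centerline
        usable = half - margin_m    # keep a safety margin from the edge/ditch
        n_lanes = max(1, int(usable // lane_width_m))
        # lane centers measured outward from the assumed centerline
        return [margin_m + lane_width_m * (i + 0.5) for i in range(n_lanes)]

    print(candidate_lane_centers(14.0))  # ~14 m of visible road -> [2.0, 5.0]

The hard part is the perception (estimating the drivable width), not the division.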
No cars to follow. No lane markings. Just a sheet of white with ditches on both sides. Worse yet, no prior practice. Knowing where the lanes are means little when the other cars are not respecting them. (As if lanes matter when nobody wants to drive that close together.) It isn't impossible, but it's certainly a far more difficult problem than navigating an LA suburb.
Nobody is claiming that this will work in that situation, but there are thousands of much more common situations where it does work and makes driving more safe and enjoyable.
I have a car with smart cruise, but there's plenty of room for improvement. It isn't very smart at determining when it can avoid braking, such as when a car well ahead has slowed for a right turn. It also brakes too aggressively when someone cuts in front of me on the highway, in situations where just lifting off the gas would be better.
It also times out very quickly when traffic comes to a complete standstill, requiring manual intervention to get going again, and it doesn't give any indication to the driver when that occurs.
If these things bothered me much more than they do, I'd be interested in comma.ai as a possible solution. As it stands, the OEM radar cruise control is "Eh, good enough, I guess."
- InsureCo, how may I help you?
- Hey, I want to ask about installing a self driving module in my car...
- Sure, you mean Tesla upgrade?
- No, another one.
- Another one?
- Yeah, you remember that kid that hacked Playstation?
Perhaps you're thinking of Bunnie Huang, who wrote a book about reversing the Xbox. I love Bunnie because he seems to be in it for the joy and the sharing of information.
Geohot (IIUC) hacked the iPhone because Apple didn't allow devs to run their own code at launch, and the PlayStation because Sony removed the ability to run Linux on the console. I love geohot because he seems to be in it to stick it to the man.
At the moment in every jurisdiction I’m aware of the driver is always considered as “in charge” of the vehicle no matter what assistance functions are being used. It’s the driver’s responsibility to avoid collisions in all cases.
If you have a collision and your vehicle is judged at fault by whatever authority does that in your area, then you are liable.
Mercedes Drive Pilot ("SAE Level 3") is certified on some very specific stretches of interstate in California to not require the driver to be responsible.
That's really dumb of Mercedes to take on that liability for so little benefit - sell more cars, make more profit? My prediction is MB drops this or goes bankrupt in the next 10 years.
It's a marketing gimmick. The conditions under which it can be used are so restrictive that it's really not useful, which means it will be rarely used, so Mercedes' exposure to liability is really quite small.
Not sure you understand how "The Formula" works. The profit generated by adding this feature will outweigh the cost of any resulting accidents that they take liability for.
A less pessimistic way of phrasing it is that within the boundaries they've defined, their self driving system is so much better than a human that they're willing to assume responsibility for crashes deemed "at-fault" while using the system.
Not intentionally trying to compare it with other automakers, but Mercedes is the only "you can buy now" vehicle (ignoring robotaxis/Waymo/others) whose maker assumes liability with those capabilities. Until other automakers provide that legal guarantee, these systems are parlor tricks at best that will continue to get folks killed in scenarios where they otherwise wouldn't have been, had they actually been paying attention.
"Your honor, I don't know how to explain this to you any more simply. I wasn't driving, there was a brick on the gas pedal. It's not my responsibility, not my fault!"
Well that will depend on your local laws, but to my knowledge except for certain authorised pilot programs all cars on the road must have a driver.
Where I live if you are in the driver’s seat no matter if you were actually actively driving you are considered to be the driver. This has been well established here in drink-driving cases, but you’d have to ask a lawyer for your area.
In an accident, culpability cannot transfer to a computer ostensibly running under your supervision. As a driver, you likely sign away all claims to blaming CommaAI when you accept the EULA & ToS updates.
I mean, just like with a Tesla, the driver is responsible for the actions taken by the car, which means you do need to be paying attention, hands on the wheel, ready to take over at all times.
We don't yet have the legal framework to say 'Sue company x, it wasn't my fault!' You get sued, then you have a very uphill battle to turn around and try to sue the company that provided the 'self driving' functionality because companies put all sorts of 'I totally accept liability for using this' in the T&C of their products.
No idea how Fridman manages to bring on the type of high-profile guests that he does. The guy does not ask good questions and has the charisma of a wet rag.
Huh, I'm the exact opposite. With the exception of Hannah Fry's work at DeepMind (where she acts as a charismatic proxy for the more nerdy guests), he is by far the best interviewer on technical stuff (mostly AI, but some early robotics as well). He knows the field, he asks pertinent questions, and more importantly he knows when to just let the speaker speak.
Compared to someone like Dwarkesh, it's night and day. There's a fine line between pushing the guest and just interrupting them every 2nd thought to inject your own "takes".
I think similar to Joe Rogan that's the main value he provides to listeners. He identifies guests that have some veil of intellectualism and provides them with a platform to speak.
However I don't think that makes for an interesting interviewer. There are no challenging questions, only ones he knows will fit into the narrative of what the guest wants to say. I might as well read a 2-3 hour PR piece issued by the guests.
Guests don't care about charisma, they care who your previous guests were. He early on got Elon Musk as a guest (AFAIK by writing a paper that was overly favorable to Tesla) and managed to snowball that into a big podcast.
Also guests agreeing to go on your show means they already want to talk about something, so in a way it's more important to shut up than ask good questions.
Seeing things like, "<h2 id="new-driving-model">New driving model</h2>" on their list of latest releases does not inspire a lot of confidence. Yes, the HTML tags are displayed on the page. Some basic quality assurance on the website would help me trust the quality assurance applied to their product offering.
Ability to decouple gas/brake control from steering, so you can control gas/brake yourself and let comma steer at all times. With OpenPilot, when you touch the gas/brake, the steering also stops.
Ability for the comma to read your car's blind spot monitors and automatically change lanes without you having to nudge the wheel.
Ability to use dynamic mode which dynamically switches between chill mode and experimental mode, so you get the best of both worlds.
Ability to fine tune many settings related to gas/brake and steering sensitivity and control that you can't play with in OpenPilot.
Those are the main differences I'm aware of, but there are more.
When Consumer Reports tested ADAS systems in 2020 [1], they gave their highest rating to the Comma Two. I'm sure it's only gotten better since then.
I really like my 2017 Chevy Bolt except that it doesn't have ACC. I wish I was comfortable installing a Comma on it, but it requires a gas pedal interceptor [2], and I'm not willing to do that to a car that I transport my family in.
I'm sure the technology is great, but what it would take for me to actually use it would be for the company to be liable if it caused an at-fault accident. I don't know much about the law around this, but I comfortably get in a Waymo all the time because I have some intuition that their lawyers are scared shitless of killing someone. It's a hard sell for me when it says "self-install at your own risk", but I appreciate the effort.
I got my Comma 4 a week and a half ago and absolutely love the thing. Is it perfect? No. But I can install a fork (Sunnypilot) and tweak the settings and/or code until I get something that I like.
Waymo put a ton of sensors, including lidar and cameras, to create a truly driverless experience. Tesla put a ton of cameras to make a mostly driverless experience (but when you need a driver you NEED a driver). Comma strapped a cellphone cam to the windshield to make a semi driverless experience on straightish roads, and a driver must take over when anything complex happens. Source: very happy comma.ai user for many years now
It’s NOT self driving. It’s level 2 driving assist. Really good one, but that has nothing to do with self driving. You are driving the car all the time, it’s only assist that can (and will) try to kill you (and others) with 0 notice if you don’t pay attention.
Ironically - bad ones will effectively try to kill you less.
That's the thing about any automation that is just an aid. Humans are extremely bad at monitoring machines, and if the aid is good enough to trick you into thinking it's actually standalone and in control, you get complacent very fast and stop paying attention, convincing yourself the automation has got it.
So bad level 2 driver assists are so bad that no one will get complacent, since they give you only very minor help. Really good ones (like comma) can trick you into thinking they can do much more than they're designed to do.
Everyone I have heard from who uses one says that it reduces the monotonous mental load and frees up their brain to focus more on being aware of other drivers and what's happening on the road, rather than worrying about their speed and exact steering.
People report being more alert and more aware of things about to go wrong.
I've had a similar experience. Not having to focus on my exact speed all the time (using adaptive cruise control) lets me watch the mirrors more. It also lets me keep my attention up over multiple hours.
On a kinda unrelated note: lately I see more and more people watching streams or TikTok while driving. Of course, if you use your newly acquired freedom for that, it will lead to more accidents.
One thing I really appreciate about cruise control is that my foot can be floating above the brake at all times.
Covering the brake cuts an entire second off braking time in motorcycling, where the move is just a quick hand motion that you MUST do anyway (well, 80% of it, the throttle-release part), because if you leave the throttle on you will wipe out. So I'd assume it cuts even more in a car, where the move is a large lateral movement of a leg.
Only issue is that you need to be careful with that foot to avoid keeping your brake lights on at all times.
Note that this might only apply to driving stick. I'm not sure if left foot on brakes is best practice in automatics.
It would be smart but as I understand they are doing the opposite, taking measures to lock down the electronics in future vehicles. Many (most?) of them have already been thwarting owner access to certain diagnostics for years.
We badly need right-to-repair and right-to-tinker laws. Or better yet, a "thou shalt not employ DRM against the legal owner of a device" commandment.
Anyone who hates driving or being stuck in traffic, anyone who benefits when the cost of transportation is cheaper, anyone who hates insurance - all of society benefits. In some parallel universe, self-driving and fusion are both crash projects receiving well-administered, social-scale funding. Lots of things are. It is somewhat miraculous that we don't all live in that outcome, realizing these huge value-creation opportunities by investing aggressively in the upstream tech. There is a horse I would very much like to drown for only a few million USD.
None of that is solved by automated driving. You want public trains, BRT bus systems, trams, etc. The ideal universe is you stepping onto public transit, not piloting various Rube Goldberg-esque machines that are far more dangerous, will always contribute to traffic, and for which "one more lane" does not work.
Public transit is great but it is not a catch all solution. Farmers need trucks, people in rural areas need to get places at odd hours. Drunk people need to be ferried to distant places. Automated cars and good transit are not mutually exclusive.
Sure, public transit can be nice. But so is owning my own vehicle that isn't subject to routes, schedules, and minimal luggage constraints. I'd much rather hang out and read a book or play a video game than babysit a vehicle in stop and go traffic for half an hour. Even if traffic is moving at a decent clip I'd still rather do something else.
Flexray is supported only on Audi, on a specific Flexray-CAN bus integration board that was just end-of-lifed, and Comma has no desire to support other boards. It was a proof of concept for anyone wishing to implement the same.
Wonder if it will be able to work with the Slate pickup when that comes out. Seems like it would be a perfect pairing if the Slate has enough control exposed to it.
8 years later, comma.ai is still standing and operational, despite several VC-backed competitors raising significantly more than Comma; those competitors (except for Tesla) are now no longer in business.
People here have no idea they are looking at a robotics and AI company, which is what Comma.ai is.
Some cars it does not support because nobody has been interested in testing it out. Some cars will never be supported because they use an encrypted CAN bus. The list of supported cars grows every year. Eventually no current models will be supportable.
It requires steering and cruise control to be able to be controlled by the random pieces of code downloaded from GitHub.
So 1920s Fords are out, and 2035 BYD flying cars with post-quantum cryptographic command signature enforcement are out too. The Toyota bZ sits somewhere in the middle. IIRC they got past some types of Toyota security keys, but not all.
What it has access to depends on the car. It may leverage radar data, for example. On some cars (the Ford Lightning, for example) it only does lateral (steering) control; longitudinal control stays with the OEM adaptive cruise.
This was back around 2017-19 (can't remember the exact year), but it was definitely pre-Covid and left a bad taste in my mouth, just like Jane Street and FTX at the time - especially given that your then-competitor DeepScale didn't have that style of interview.
I was in the market for this for my Pacifica but I couldn't figure out what this does exactly.
Is it FSD basically?
Is it just lane assist?
Can I put an address in a map and it takes me there?
Very hard to just get these concrete answers, maybe they just take the newbie experience for granted and assume people know these answers. Anyone who owns one of these can answer? Thank you!
Generic Openpilot out of the box is just super nice cruise control right now. It can do longitudinal and lateral control, so it lane keeps, stays behind the car in front of you, etc.
If you use Sunnypilot or one of the other friendly forks, you can do more, but it's not (currently) to the state of Tesla's FSD.
Personally, I recommend buying it if you do a lot of road trips. It's amazing for that. In/around town it's only useful if you have a lot of stop and go traffic, like if you live in LA or other large car-centric city with a big commute.
No it’s not FSD. There is no navigation at all, you’re correct that it’s “just lane assist”. But the lane assist is next level.
I take a few 1,000 mile plus road trips every year and the comma pays for itself every time. Using the stock lane assist, I’m constantly correcting it. The stock assist tries to take an exit, doesn’t handle curves well at all, and any construction or unusual road conditions it won’t work at all.
With the Comma, on the highway it’s basically FSD. On my last 1000 mile trip I never had to disengage, only to pass and make turns.
The biggest advantage is that Comma allows you to be completely hands-off the wheel, whereas stock lane assist forces you to hold the wheel at all times.
I still run an old comma branch on a OnePlus phone in my Subaru. It works really, really well, even on snowy northern roads. The code, from the C firmware to the Python, is very well written too, which makes it easy to tune to your driving habits.
It’s open source bro, just read the code if you don’t trust it.
/s
That said, I am intimately familiar with the code, and it's pretty well designed with safety in mind. Plus your vehicle has safety parameters that limit the ability of the software to do something insane. That said, there are a few stories of openpilot running into a curb, hitting a car in the neighboring lane, etc.
Comma is anal about safety. A few years ago they went as far as to ban anyone tinkering with Ford trucks/vans via park-assist commands, since those had unlimited steering torque/angle (and speed spoofing).
Will this be deployed mostly by those with the worst judgment?
For example, that video is implied to be of some open source self-driving project, run on an active public road, at 42mph. A lot of sensible people would say that's irresponsible or unsafe, and not do it. Move-fast-and-break-things bros and narcissists, however, wouldn't see a problem.
I read at least one thread per day criticizing Tesla self-driving (which has hundreds of highly-paid engineers working on it) as unreliable vaporware, meanwhile I'm supposed to hack my car with some code off a GitHub repo?
I'll be adding this to my list of 101 creative ways to die, behind basement apartment in Venice, Italy.
Those companies worth billions like GM and Tesla perform extensive testing to prove to regulators their software isn't going to kill people and does not pose an unacceptable risk to other drivers on the road. Do you get to sidestep all that if you post your code to GitHub?
Why not? You are free to modify your vehicle in almost any way you want as a consumer. Should someone putting rain shields on their windows require licensing and government testing because they might break off? Should generic brake pads or tie-rod ends require independent government testing or approval to be purchased and used?
Regulations don't exist to save people from their own stupid mistakes, they exist to prevent systemic abuses and dangers to the public in the pursuit of profit. And we already know from endless examples that corporations will knowingly let people die if their decision will increase profit margins. Not to mention the public doesn't have the ability to properly test or verify corporate designed and sold devices. Unless corporations provide all documentation related to the design and materials and code used, they should have special restrictions and regulations beyond what the average person does.
States have window tinting laws to help police pull over, harass, identify, and/or profile people in the pursuit of profit and control. The main justification for window tint laws is "protecting law enforcement", which is itself a bad excuse, but everything beyond that is definitely complete bullcrap, because the tinting laws differ so completely between states, with some states not caring at all.
Many states with tinting laws have zero laws about maintenance or inspection of vehicles, which is all but proof that the safety of the public is not even a real consideration. You can literally have sharp, jagged chunks of rusty metal hanging off your car and it's not illegal.
Yup, because you get to be personally responsible for any outcomes just like you would be if you were driving without ai assistance. If you aren’t comfortable building and testing an open source project then it isn’t for you.
People cry daily that cybertrucks should not be street legal because they do not meet EU safety regulations but gluing plastic gadgets to your window yourself and calling it "AI assistance" is okay because the driver is ultimately responsible?
I'm imagining it... marathon meetings, everyone worried about code standards, someone made Claude rewrite the whole thing in Prolog and is zealously arguing for it in a 900-comment PR.
And somehow half the time invested in the project is arguing about a code of conduct.
Tesla FSD is awesome. I use it almost all the time now, it feels safer than me driving. It's like having a private chauffeur. My disengagements are mostly nav related.
Are you really expecting me to read this paragraph all by myself? What am I supposed to do, load some text off of a Hacker News comment section? I only read paragraphs written by teams of highly paid experts.
geohot, the infamous person who cracked the PS3 back in the day. I've been following him since then, and this project since he started it. His blogs have always teetered on the edge of unprofessional while remaining incredibly knowledgeable and insightful. I truly enjoy all his work.
I've been using a comma 2 and now a 3 for over 5 years, and it's my favorite thing that I own. I would never buy an incompatible car going forward, and I got my 2024 Tucson specifically for use with comma. I did once think I really wanted a Tesla, but I realized I just wanted self-driving. I routinely tell everyone about comma + openpilot and am surprised I've never seen another driver on the road with one. People are still mostly in the stone age with respect to driving. Granted, I think you are always solely responsible for your car when behind the wheel and should only treat it as an assistant, but it sure does make driving chill.
FYI, if I get hit by someone and I find out they are using comma, everyone is getting a lawsuit.
Is there a hard safety check for an insane steering angle? Full brake or throttle? An ECC error? What happens!? That's what a safety standard checks and certifies for.
It is incredibly dangerous, irresponsible, and illegal to be using this around other people. At least Tesla vaguely pretends to work with regulators. The cute "download your own firmware" trick so they aren't shipping an illegal device? Encouraging hands-off, inattentive driving? Let's see how civil court sees that.
You aren't going to like the answer, but yes, there are firmware safety checks + CAN checksums. And a driver-facing camera tracking your attention so you aren't asleep behind the wheel. Something that your Tesla apparently failed to consider: https://driving.ca/auto-news/crashes/sleeping-tesla-driver-r...
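For a concrete picture of what "firmware safety checks + CAN checksums" means, here is a minimal Python sketch; the real enforcement lives in the panda firmware (written in C), and the limit values below are made up:

    # Illustrative sketch only, not openpilot's actual safety code.
    MAX_TORQUE = 300  # hypothetical absolute steering-torque limit
    MAX_DELTA = 10    # hypothetical max change per control cycle

    def clamp_torque(desired: int, last: int) -> int:
        """Refuse 'insane' steering commands via absolute and rate limits."""
        lo = max(-MAX_TORQUE, last - MAX_DELTA)
        hi = min(MAX_TORQUE, last + MAX_DELTA)
        return max(lo, min(desired, hi))

    def checksum_ok(payload: bytes, expected: int) -> bool:
        """Toy additive checksum; real cars use per-manufacturer CAN schemes."""
        return (sum(payload) & 0xFF) == expected

The idea is that a command outside those bounds never makes it onto the bus.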
Your attitude reminds me of how Microsoft fans talked down Linux and GNU; blah blah open source can never be as good/secure/stable as our billion-dollar commercial product because money/certificate
You have an auto industry lawyer on retainer to ask hypotheticals on Sunday evening?
First answer your own question (and ask your lawyer while you're at it): is every change pushed by Tesla reviewed by EU/CA/US regulators? And then explain to me how your Tesla still allowed the driver to fall asleep. FYI, that was not a singular accident.
Your argument went from "cute GitHub project has no safety" to "I get money if tesla maims me".
I must ask you again: if Tesla, an example of your choosing, works with regulators and has every change certified and reviewed, then how did they fail a basic safety check?