
It's not just the added cost, it's the supply chain. Putting cameras into cars requires processors, RAM, and all manner of chips and components that a car didn't need before.

There was the chip shortage during COVID, which held car production back because the automakers couldn't source their chips fast enough. I'm waiting to see if the current supply issue for RAM chips and modules will produce a similar effect.




> Putting cameras into cars requires processors, RAM, and all manner of chips and components that a car didn't need before.

Was there a single mass-market consumer car sold in the United States in this millennium that didn't already have processors and RAM in it?

I would be absolutely shocked if there were a single car for which the relatively recent backup camera requirement meant introducing processors and RAM for the first time.


I’m pretty sure that you can buy aftermarket backup cameras. The car can be a dumb bunny, and still have a good camera.

Yeah, my 2005 beater has both CarPlay and a backup camera. Cost me $40 and an hour of labor.

Oh yeah. I once bought a $10ish one on Amazon out of curiosity.

There's the yellow composite plug, a 12V input, and a small bit of wire to cut to rotate the image 180 degrees, all at the other end of a 30ft cable from the camera. The composite goes into the existing infotainment. There would be a wire from the shifter to the infotainment that switches the display to the external composite video when the gear lever is in reverse. I think it even came with a miniature hole saw sized for the camera module.

$10 and one afternoon later, I could have upgraded a dumb car to have one, complete with an automatic switch to the backup view in reverse. No software hacking needed. It's fundamentally an extremely simple thing.
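In the $10 kit that switching is literally a wire, but even done in software it's a few lines. A hypothetical sketch (the GPIO and mux helpers are made-up stand-ins, not any real head unit's API):

    #include <stdbool.h>

    /* Hypothetical I/O helpers -- stand-ins for whatever GPIO and
       video-mux interface a real head unit exposes. */
    extern bool gpio_read_reverse_gear(void);     /* wire from the shifter  */
    extern void video_mux_select_composite(void); /* external camera input  */
    extern void video_mux_select_main_ui(void);   /* normal infotainment UI */

    /* Poll the reverse-gear line and pick the video source accordingly. */
    void update_display_source(void)
    {
        if (gpio_read_reverse_gear())
            video_mux_select_composite();
        else
            video_mux_select_main_ui();
    }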


I believe that in some vehicles the backup camera actually runs on a separate (possibly real-time, otherwise certainly heavily nice'd) system. Tesla had a recall where they had to nice the backup camera software. The problem was that if the display freezes or lags while the driver is backing up, the driver isn't aware that he can't see where he's going (he thinks that what he sees reflects the area around the car right now).
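The software fix for that failure mode is basically a staleness watchdog on the video path. A minimal sketch, assuming a made-up driver callback and display API (not Tesla's actual code):

    #include <stdbool.h>
    #include <stdint.h>

    extern uint32_t millis(void);          /* monotonic millisecond tick  */
    extern bool in_reverse(void);          /* gear selector state         */
    extern void show_camera_fault(void);   /* blank screen + warning text */

    #define FRAME_DEADLINE_MS 200          /* a few missed frames at 30fps */

    static uint32_t last_frame_ms;

    /* Called by the camera driver for every frame that reaches the display. */
    void on_camera_frame(void)
    {
        last_frame_ms = millis();
    }

    /* Called periodically from a watchdog task: a frozen image is worse
       than no image, so replace a stale feed with an explicit fault. */
    void check_camera_liveness(void)
    {
        if (in_reverse() && (millis() - last_frame_ms) > FRAME_DEADLINE_MS)
            show_camera_fault();
    }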

In Hyundais and Renaults I've seen firsthand that it's a separate subsystem that works even when the infotainment is dead/unresponsive/glitchy (it's probably like that everywhere; these two are just the sample I have).

Stability control, pre-collision braking, lane departure warnings: the complexity is pretty inevitable as we improve the safety of vehicles.

> Putting cameras into cars requires processors, RAM, and all manner of chips and components that a car didn't need before.

Call me old-fashioned, but in my opinion processors/RAM/chips/components are a good trade-off versus squished children.


All cars have required "chips" since OBD-II was mandated in the mid '90s. That ship has sailed around the world, returned to port, and sailed again.

All of that is worth the extra safety.

I mean, you can buy add-on third-party backup cameras for like $20. There's no cost excuse for not including backup cameras; camera sensors and display screens are literally cheaper than dirt.

Legacy automakers still use these for upselling trims.

It's so silly when they make some "Advanced Technology Package" with a VGA camera and a 2-inches-bigger infotainment screen that's still worse than junk from Aliexpress, and charge $3000 extra for it.

I know it's just a profit-maximizing market segmentation, but I like to imagine their Nokia-loving CEO has just seen an iPad for the first time.


That's great for cars built before the regulation was put into place. Without that regulation, you'd be dependent on the end user purchasing an aftermarket part and installing it. The vast majority of them won't. So if it's important enough to have, you make it part of the car. They didn't leave seat belts up to owners to install as aftermarket add-ons.

My point is that if a third-party manufacturer can produce and sell a combination screen and camera for $20 at a profit, an automotive manufacturer has no reason to complain about the "expense" of such a setup. It's even cheaper for them than for a third-party add-on supplier, since they buy in larger bulk and can integrate mounts for those devices into the car, rather than devising some sort of one-size-fits-all mounting system like the add-on manufacturers need.

They might as well be complaining about the cost of a rear-view mirror; it's nonsense from the start. If a $20 gadget breaks the bank on a vehicle that starts at $30,000, they're a shitty business to begin with and we should all be clapping when they go out of business.


The third-party guy isn't paying someone $40/hour to install the $20 unit. The $20 unit won't be as integrated into the car and will have the look of an aftermarket part. Does the $20 part only come on when the car is in reverse, or is it on all the time? There are a lot of reasons the aftermarket thing can be $20 and a lot of reasons the auto manufacturer's is not. It's not all down to greed.

Was it ever a problem to get the kind of phone SoC or camera chips you'd need for a backup camera if you were willing to pay an extra $20? I thought the issue was with more specialized parts. And you need one gigabyte of RAM or less.

A gigabyte!?

You shouldn't need any dedicated RAM. A decent microcontroller should be able to handle transcoding the output from the camera to the display and run infotainment software that talks to the CAN bus or Ethernet.

And the bare minimum is probably just a camera and a display.

Even buffering a full HD frame would only require a few megabytes.

Pretty sure the law doesn't require an Electron app running a VLM (yet) that would justify anything approaching gigabytes of RAM.
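The arithmetic checks out: even uncompressed, a double-buffered 1080p frame in a 2-byte-per-pixel format is about 8 MB. A trivial sketch of the math:

    #include <stdio.h>

    int main(void)
    {
        /* Uncompressed 1080p in a 16-bit format (e.g. YUV422 or RGB565):
           2 bytes per pixel. */
        long frame = 1920L * 1080L * 2;  /* 4,147,200 bytes, ~4 MB */
        long both  = 2 * frame;          /* double-buffered, ~8 MB */
        printf("one frame: %ld bytes, double-buffered: %ld bytes\n",
               frame, both);
        return 0;
    }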


I just went on Amazon, and a 1GB stick of DDR3 RAM is about 30% cheaper than a 128MB stick. Why would any RAM company make tiny RAM chips when they can make standard-sized chips that work for every application that needs less?

I really feel like a lot of the people objecting in this thread have only ever written web apps in Python, and their closest experience with the audio-visual space is WebRTC.

Tech for cars is its own kind of "standard-sized". Not everything revolves around datacenters and tech; the car industry easily predates the computer industry and operates on much tighter margins under much stricter regulations.

So a smaller, simpler chip that uses fewer physical resources at scale and is simpler to test is better when you're planning on selling millions of units and you need to prove that it isn't going to fail and kill somebody. Or, if it does fail and kill somebody, it's simpler to analyze to figure out why that happened. You've also got to worry about failure rates for things like a separate RAM module not being seated properly at the factory and slipping out of its socket someday while the car is bouncing around.

Now, yes, modern cars have gotten more complex and are more likely to run some software on Linux rather than an RTOS or ASIC. But the original complaint was that a backup camera adds non-negligible complexity/cost.

For a budget car where that would even make sense, that means you're expecting to sell at high volume and basically nothing else requires electronics. So sourcing 1GB RAM chips and a motherboard you can slot them into would be complete overkill and probably a regulatory nightmare, when you could just buy an off-the-shelf industrial-grade microcontroller package that gets fabbed en masse, dozens or hundreds of units to a single silicon wafer.


Your CPU's L4 cache is normally DRAM, and it's cheaper to shove some RAM into a microprocessor than to have a separate chip.

I simply refuse to believe that a CPU with hundreds of megs of DRAM is enough cheaper than the same chip with a gig of RAM to be the appealing choice. We're not talking about a disposable vape with 3KB of RAM; this is a car that needs to power a camera and sensors and satellite radio and matrix headlights or whatever. If it's got gigahertz of compute, there's no reason it should still have RAM sized for a computer from 30 years ago.

The original comment was complaining about backup cameras seemingly adding significant electronics requirements.

In practice, you're not going to tie intimate knowledge of the matrix headlights into the infotainment system; that's just bad engineering. At most it would know how to switch them on and off, maybe a few coarse settings like brightness or color or some kind of frequency adjustment, without worrying about every single LED, and I can't imagine a budget car ever exposing all that to the end user. Even if you did, it would take a legendarily bad implementation to require a gigabyte of RAM to manage dozens of LEDs. Like, is it launching a separate Node instance exposing a separate HTTPS port for every LED at that point?
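For scale, the sane version of that state is tiny. A hypothetical sketch, with the lamp and LED counts made up for illustration:

    #include <stdint.h>

    /* Say 2 lamps x 84 addressable LEDs, one brightness byte each:
       168 bytes of state for the whole matrix-headlight feature. */
    #define LAMPS         2
    #define LEDS_PER_LAMP 84

    static uint8_t led_brightness[LAMPS][LEDS_PER_LAMP];

    void set_led(int lamp, int led, uint8_t level)
    {
        led_brightness[lamp][led] = level;
    }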

Ditto for the satellite radio. That can be, and probably is, a separate module, and that's more of a radio/AV-domain piece of tech that operates in a world that historically hasn't had the luxury of gigabytes of RAM.

Sensors - if this is a self-driving car with 3D LIDAR and 360-degree image sensors, the backup camera requirement is obviously utterly negligible.

Remember, we had TV for most of the 20th century, before integrated circuits even existed, let alone computers and RAM. We didn't magically lose the ability to send video around without the luxury of storing hundreds of frames' worth of data.

Yeah, at some point it makes more sense to make or grab a chip with slightly more RAM so it has more market reach, but cars are manufactured at a scale where they actually are drivers of microcontroller technology. We are talking about a few dollars for a chip in a car being sold for thousands of dollars used, or tens of thousands of dollars new.

There is just no way that adding a backup camera is an existential issue for product lines.


Not all of those systems will be running from the same hardware controllers.

Back in the mists of time, we used to do realtime video from camera to display with entirely analog components. Not that I'm eager to have a CRT in my dashboard, but live video from a local camera is a pretty low bar to clear.

Yeah, I can't understand why people are imagining a gigabyte of RAM in this context, unless they're picturing a Python HTTPS server streaming video via WebRTC to an Electron GUI running out of local Docker containers or something. Because that ought to be enough memory for an hour of compressed video.

It’s like saying your family of four is going to take a vacation, so you might need to reserve an entire Hyatt for a week, rather than a single room in a Motel 6.


> I can't understand why people are imagining a gigabyte of RAM in this context, unless they're picturing

Who's 'people'? It isn't me; I was rounding to the nearest positive integer. And bastawhiz is arguing in the abstract about RAM prices, so I don't see how they fit this complaint either.

> It’s like saying your family of four is going to take a vacation, so you might need to reserve an entire Hyatt for a week, rather than a single room in a Motel 6.

From my point of view, it's more like each room only holds one person so you can't just say "a room" (megabyte), and renting a whole hotel would only be 0.1% of the total vacation budget, so I simplify it and just say "rent a hotel" (gigabyte). It doesn't mean I think it's necessary; it means I'm pointing out how cheap it is and don't need to go deeper.


I tried to think of a wording that wouldn't get this response; I guess I failed. RAM is generally bought in gigabytes, and "1 or less" is as low as numbers go without getting overly detailed.

So what microcontroller do you have in mind that can run a 1-2 megapixel screen on internal memory? I would have guessed that a separate RAM chip would be cheaper.


https://wiki.st.com/stm32mpu/wiki/How_to_display_on_HDMI

But mostly it’s the fundamental problem space from an A/V perspective. You don’t need iPhone-grade image processing - you just need to convert the raw signal from the CMOS chip to some flavor of YUV or RGB, and get that over to the screen via whatever interface it exposes.

Broadcast HD (ATSC) was designed to be compatible with pretty stateless one-way transmission over the air. And that was a follow-on to analog encodings that were laid down based on the timing of the scanning CRT gun, derived by dividing the power-line frequency, in an era where 1GB of RAM would be sci-fi. We still use 29.97/59.94 fps from shimming the color signal into 30 fps B&W when color TV was introduced in the early 1950s; that's how tight this domain is.
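For concreteness, the per-pixel conversion really is microcontroller-sized. A sketch for one common raw format (UYVY in, RGB565 out, integer BT.601-ish coefficients; an illustration, not STM32-specific code):

    #include <stdint.h>

    static uint8_t clamp8(int v)
    {
        return (uint8_t)(v < 0 ? 0 : (v > 255 ? 255 : v));
    }

    /* Convert one UYVY pixel pair (4 bytes) into two RGB565 pixels. */
    void uyvy_to_rgb565(const uint8_t in[4], uint16_t out[2])
    {
        int u = in[0] - 128;        /* chroma is shared by both pixels */
        int v = in[2] - 128;
        int y[2] = { in[1], in[3] };

        for (int i = 0; i < 2; i++) {
            int r = clamp8(y[i] + ((351 * v) >> 8));
            int g = clamp8(y[i] - ((179 * v + 86 * u) >> 8));
            int b = clamp8(y[i] + ((443 * u) >> 8));
            out[i] = (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
        }
    }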


> https://wiki.st.com/stm32mpu/wiki/How_to_display_on_HDMI

That board has a DDR3 chip on it. Is there one with HDMI that doesn't?

> But mostly it’s the fundamental problem space from an A/V perspective. You don’t need iPhone-grade image processing - you just need to convert the raw signal from the CMOS chip to some flavor of YUV or RGB, and get that over to the screen via whatever interface it exposes.

> Broadcast HD (ATSC) was designed to be compatible with pretty stateless one-way transmission over the air. And that was a follow-on to analog encodings that were laid down based on the timing of the scanning CRT gun, derived by dividing the power-line frequency, in an era where 1GB of RAM would be sci-fi. We still use 29.97/59.94 fps from shimming the color signal into 30 fps B&W when color TV was introduced in the early 1950s; that's how tight this domain is.

If you're getting a signal that's already uncompressed and TV-like, then you probably don't need a processor at all. But I didn't want to assume you're getting that, since it means running a multi-Gbps signal over a wire in a very hostile environment.

The more generic solution needs the ability to hold a couple of frames in memory, which probably means a RAM chip. Please don't focus so hard on the way I rounded the number. The point was that it's a negligible number of dollars. You can use a much smaller chip than a gigabyte, but that doesn't save a proportional amount of money, and the conclusion is the same: a negligible number of dollars.

I guess I could have said "gigabit". Anything that got into specific numbers of megabytes would have been pointless detail. And it's megabytes minimum if there's a frame buffer.



