Hacker News

Alloca is a fundamentally insecure way of doing allocations. Languages that promote alloca will find themselves stuck in a morass of security messes and buffer overflows. If Zig were to adopt alloca, it would repeat the catastrophic mistake that plagued C for decades and introduce permanently unfixable security issues for another generation of programming languages.


alloca is a significantly better and safer allocation strategy than the thing it supersedes (fixed-size buffers). It's not great but it's definitely better.


Any thoughts on the use of strdupa()? I do not use it, but I wonder if that is dangerous too, considering it uses alloca().


I’ve been defending alloca() here, but no, strdupa() (not to be confused with shlwapi!StrDupA on Windows) is a bad idea. In cases that I think are acceptable, the size of the allocation is reasonably small and does not come from outside the program. Here you’re duplicating a string that you probably got somewhere else and don’t really control. That means you don’t really know if and when you’re going to overflow the stack, which is not a good position to be in.
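One way to see the problem: glibc defines strdupa() as a macro that allocas strlen(s)+1 bytes in the caller's frame, with no upper bound. A sketch of a bounded variant (my own illustration; the 256-byte cap and the macro name are arbitrary choices, and it uses a GCC statement expression the same way glibc's own strdupa does, since alloca cannot live inside a helper function's frame):

```c
#include <alloca.h>
#include <stddef.h>
#include <string.h>

#define STACK_DUP_MAX 256  /* illustrative cap, not from the thread */

/* Like strdupa(), but refuses inputs whose length -- and therefore
   whose stack allocation -- exceeds a known-safe bound. A real program
   might instead fall back to strdup() for big inputs. Must be a macro:
   alloca'd memory lives only until the *calling* function returns. */
#define STRDUPA_BOUNDED(s)                                    \
    (__extension__ ({                                         \
        const char *_old = (s);                               \
        size_t _len = strlen(_old) + 1;                       \
        char *_new = NULL;                                    \
        if (_len <= STACK_DUP_MAX)                            \
            _new = (char *)memcpy(alloca(_len), _old, _len);  \
        _new;                                                 \
    }))
```

With plain strdupa(), a caller handed a megabyte-long string from the network would silently try to carve a megabyte out of the stack; the bounded version at least makes that failure mode visible.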

(Once upon a time, MSLU, a Microsoft-provided Unicode compatibility layer for Windows 9x, used stack-allocated buffers to convert strings from WTF-16 to the current 8-bit encoding. That was also a bad idea.)


I don't have anything against alloca(), but then again, I don't use it at all. I stick to malloc() / free(), and in case of strings, asprintf().


Didn't stop Rust from using it internally.


I don't think Rust uses alloca internally for anything. You may be thinking of Swift, which I think uses alloca for ABI shenanigans.


How does it do that?


I don’t know why you’re downvoted, alloca is a mistake.




