
There's a huge difference in programming model. You can rely on C++ or Rust destructors to free GPU memory, close sockets, free memory owned through an opaque pointer obtained through FFI, implement reference counting, etc.

I've had the displeasure of fixing a Go code base where finalizers were actively used to free opaque C memory and GPU memory. The Go garbage collector obviously didn't consider it high priority to free these 8-byte objects which just wrap a pointer, because it didn't know that the objects were keeping tens of megabytes of C or GPU memory alive. I had to touch so much code to explicitly call Destroy methods in defer blocks to avoid running out of memory.





For GCed languages, I think finalizers are a mistake. They only serve to make it harder to reason about the code while masking problems. They also have negative impacts on GC performance.

Java is actively removing its finalizers: finalization was deprecated for removal in JEP 421.



