Just curious for other opinions... but wouldn't this sort of buffer over-read failure (Heartbleed) have been prevented by using a "safe" language like Lisp or SML?

I used to be an "unsafe" language bigot -- having mastered C/C++ over many years, and at one time actually producing C compilers for a living. I felt there should be no barriers between me and my machine: I was its master, not the other way around.

But today's software systems are so complex that keeping track of everything boggles the mind. During my transition years I found I could maintain code bases no larger than an absolute maximum of 500 KLOC, and that I actually started losing track of details around 100 KLOC. Moving to a higher-level language like SML or Lisp gave me greater productivity within those limits.
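
To make the failure mode concrete, here's a minimal sketch of the Heartbleed-style over-read in C. The names and structure are hypothetical, not the actual OpenSSL code; the essential bug is that the length field is attacker-controlled and never checked against the bytes actually received.

    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical heartbeat handler. claimed_len comes straight off
     * the wire; actual_len is how many payload bytes really arrived. */
    static unsigned char *heartbeat_reply(const unsigned char *payload,
                                          size_t actual_len,
                                          size_t claimed_len)
    {
        unsigned char *reply = malloc(claimed_len);
        if (reply == NULL)
            return NULL;

        /* BUG: copies claimed_len bytes even when claimed_len > actual_len,
         * so adjacent heap memory (keys, passwords, ...) leaks into the
         * reply. The fix is a bounds check:
         *     if (claimed_len > actual_len) return NULL;  */
        memcpy(reply, payload, claimed_len);
        return reply;
    }

In a bounds-checked language the equivalent copy would signal an out-of-range error at the read instead of silently echoing back whatever happened to sit next to the buffer -- which is exactly the class of failure a "safe" language rules out.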