We've been saying for years that 'developers need to learn to code securely.' Sure, that's great, but it essentially only reaches skilled professionals. This isn't to say we should stop teaching; rather than focusing solely on those who are paying attention, we should start babysitting the remaining majority.
So how do you watch what a developer is doing? One of the things that needs to happen is building better libraries and frameworks (yes, this statement sounds very marketecture, but bear with me). Java stopped the buffer overflow issues (aside from specific VM bugs), and Microsoft's .NET has followed in Java's tracks and done the same. Microsoft's .NET has also gone one better and made it harder to develop vulnerable ASP.NET web applications. ASP.NET's request validation inspects user-modifiable input for HTML markup before the application processes it. If it detects HTML injection (usually an XSS attack), it prevents the application from behaving 'vulnerably' by halting its execution and displaying a warning message.
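To illustrate, here's a minimal sketch of how that protection is surfaced in ASP.NET's configuration (this is the documented `validateRequest` switch; the snippet is an assumed example, not from the original post):

```xml
<configuration>
  <system.web>
    <!-- Request validation is on by default in ASP.NET 1.1+;
         shown explicitly here. Requests containing markup like
         <script> in form fields or the query string are rejected
         with an HttpRequestValidationException. -->
    <pages validateRequest="true" />
  </system.web>
</configuration>
```

The same switch exists per-page as a `ValidateRequest="false"` attribute on the `@ Page` directive, which is exactly the kind of opt-out-for-people-who-know-better override discussed below.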
I always hear the argument 'people who write applications vulnerable to buffer overflows, SQL injection or cross-site scripting shouldn't be writing code!' and it's a nice fantasy! New people are always learning to code and being put into situations to develop things they maybe shouldn't be, and this isn't ever going to stop. The majority of skilled developers started out the same way, and faulting them for 'learning the ropes' is just plain stupid. We need to start hand-holding developers by preventing them (by default) from making common security mistakes. Just as important, we need to provide overrides for those who 'know what they're doing', because hindering application development isn't going to fly. As mentioned above, Java and Microsoft's .NET Framework let you drop down to unmanaged or native code if there's a need, but by default they manage memory to prevent those darn buffer overflows from 'magically appearing'.
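To make the managed-memory point concrete, here's a small Java sketch: writing one byte past the end of a buffer, which in C could silently corrupt adjacent memory, is caught by the runtime's bounds check and turned into an exception instead.

```java
public class BoundsDemo {
    public static void main(String[] args) {
        byte[] buffer = new byte[8];
        try {
            // Note the off-by-one: i <= buffer.length walks one slot past
            // the end. In C this write could smash the stack or heap; the
            // JVM checks every array access and throws instead.
            for (int i = 0; i <= buffer.length; i++) {
                buffer[i] = 0x41;
            }
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("Overflow stopped by the runtime: " + e);
        }
    }
}
```

The override is still there for those who need it (JNI on the Java side, `unsafe` blocks and P/Invoke in .NET), but the safe behavior is the default, which is the whole point.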