How people use a tool is not the fault of the tool - there is an underlying issue that drives that behavior. It would be like mandating that hammers be soft enough that they can't damage a skull because some people use them to bash in other people's heads. Yes, that would prevent hammers from being used as weapons, but it would also render them ineffective at their original purpose.
I don't think that's entirely true; some tools have only one purpose. It would not be ethical to manufacture nukes and sell them to people, for example.
Even with a messaging app: imagine you created a new one and then found that, for some reason, 90% of your user base was hitmen communicating with their clients. Maybe that's not your fault, but I think you would be ethically obligated to shut it down, or at least modify it significantly to stop enabling hitmen.
Obviously these are contrived examples, and in real life it's often impossible to make a tool that can't be used for evil. But I don't think you're devoid of responsibility just because you didn't intend for your creation to be abused. If you accidentally created something dangerous, you have an obligation to take reasonable measures to mitigate the danger.
I think a large factor here is the range of intended uses. A nuke can only be used for one thing, which is evil, so there is no downside to banning it or mandating changes to its inherent properties. But tools like private messaging and hammers have huge potential for good (thanks to the same properties that make them useful for evil), and targeting those properties to reduce their viability for evil also reduces the amount of good they can do.
All that being said, I do agree that in some cases there is a definite ethical burden on a creator to consider the impact of their creation. I just think that in many cases the best solution is not to change the tool to prevent misuse, but to figure out why the misuse occurs (or would occur) in the first place and try to solve that. I would conjecture that misuse more often than not points to a deeper social issue - one that is for some reason not being properly dealt with, but which is actually a really big deal that no one wants to confront. I can think of a few examples, but that level of exploration is probably better suited to a blog post than a comment.