Mick Ashby's "Ethical Regulator Theorem" from cybernetics implies the "Law of Inevitable Ethical Inadequacy", which states:
If you do not specify that you require a secure ethical system, what you get is an insecure unethical system.
It is essential to consider and specify an ethical framework because, if we don't, we are guaranteed to behave unethically.
Almost anything can be abused to harm someone, but I will not design anything intended to harm or do damage, or that would inevitably lead to harm.
This means I don't make weapons, but it also means that in anything I do, I consider the consequences and make my best effort to minimise foreseeable adverse consequences.
Usability
Anything we make should be as usable as possible by as many potential users as possible.
Among other things, this encompasses what many would call accessibility, with the key distinction that it is not a tacked-on afterthought but designed in at every level, from the beginning.
Autonomy
As much as possible, things should get out of the way and not limit choice or constrain options. Automation is great when it works, but there must always be a way for any automation to get out of the way quickly and effortlessly when it doesn't.
Privacy
Unless essential to their function, systems should be designed to maximise the privacy of anyone who uses or interacts with them.
Delight
If we go to the trouble of making something, we should aim to make it a joy to use.
It should look beautiful and feel right. It should be as easy as practical to use, helping you accomplish your goal without getting in your way.
Respect
The common thread among all of these, and what guides me, is respect: by default, I respect you, I expect you to respect me, and anything we work on or make together should respect all stakeholders involved.
Respect is not fear, deference, or obsequiousness; sometimes respect means being up-front and telling people things they don't want to hear but need to.