Maybe I'm thinking too simply, but I believe that not giving AI a survival instinct would solve a lot of problems. Give it no personal goals at all.
We cannot teach it ethics, since we don't understand ethics ourselves and cannot agree on the matter.