trustworthiness of ai systems
The term "trustworthiness of AI systems" refers to the degree to which artificial intelligence systems can be relied upon to perform as expected while adhering to ethical norms and avoiding undue bias and risk.
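One facet of this definition, the absence of undue bias, can be operationalized as a measurable check. The sketch below is a minimal illustration, not a standard API: it assumes binary predictions and a group label per example, and `demographic_parity_gap` is a hypothetical helper name chosen for this example.

```python
from typing import Sequence

def demographic_parity_gap(predictions: Sequence[int],
                           groups: Sequence[str]) -> float:
    """Absolute difference in positive-prediction rates across groups.

    A gap near 0 suggests the model treats groups similarly on this one
    metric; a large gap is a signal of potential undue bias. This is one
    of several common fairness metrics, not a complete trust assessment.
    """
    rates = {}
    for g in set(groups):
        members = [p for p, gg in zip(predictions, groups) if gg == g]
        rates[g] = sum(members) / len(members)
    values = sorted(rates.values())
    return values[-1] - values[0]

# Hypothetical binary loan-approval predictions for two groups.
preds  = [1, 0, 1, 1, 0, 1, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(demographic_parity_gap(preds, groups))  # 0.75 - 0.25 = 0.5
```

A check like this covers only one dimension of trustworthiness; reliability, robustness, and transparency (see the related concepts below) each call for their own evaluations.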
Similar Concepts
- accountability and transparency in ai
- accountability in ai systems
- accountability of ai systems
- bias and fairness in ai system control
- bias in ai systems
- fairness and accountability in ai
- morality of ai systems
- robustness and reliability of ai systems
- transparency and explainability in ai systems
- transparency in ai systems
- trust and accountability in ai systems
- trust and responsibility in ai applications
- trustworthiness and reliability of ai systems
- trustworthiness and robustness in ai control
- trustworthy computing in autonomous systems