Trustworthiness of AI systems

The term "trustworthiness of AI systems" refers to the degree to which artificial intelligence systems can be relied upon to perform according to expectations while adhering to ethical norms and avoiding undue biases or risks.
