Air Canada chatbot
CTV News Vancouver reports that Air Canada’s chatbot gave a B.C. man the wrong information, and now the airline has to pay for the mistake:
Air Canada has been ordered to compensate a B.C. man because its chatbot gave him inaccurate information.
Good.
“In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions. This is a remarkable submission.”
Anything to dodge their responsibilities. If you deploy technology that is proven not to work properly in order to reduce your workforce, you should be held responsible for its mistakes.
Is it time for better laws? Ones that make companies responsible for this kind of “mistake”.