submitted 10 months ago by Evkob@lemmy.ca to c/canada@lemmy.ca

Jake Moffatt was booking a flight to Toronto and asked the bot about the airline's bereavement rates – reduced fares provided in the event someone needs to travel due to the death of an immediate family member.

Moffatt said he was told that these fares could be claimed retroactively by completing a refund application within 90 days of the date the ticket was issued, and submitted a screenshot of his conversation with the bot as evidence supporting this claim.

The airline refused the refund because it said its policy was that bereavement fare could not, in fact, be claimed retroactively.

Air Canada argued that it could not be held liable for information provided by the bot.

[-] SamuelRJankis@lemmy.world 43 points 10 months ago

It's amazing that a 7-billion-dollar company goes to court to fight someone over $800, aside from obviously being in the wrong.

...awarding $650.88 in damages for negligent misrepresentation.

$36.14 in pre-judgment interest and $125 in fees

[-] nova_ad_vitum@lemmy.ca 40 points 10 months ago* (last edited 10 months ago)

They're not fighting for the $800. They're fighting for the right to continue to use their shitty chatbot to reduce their support staff costs while not being liable for any bullshit it tells people.

There will be cases like this in every jurisdiction.

[-] CanadianCorhen@lemmy.ca 8 points 10 months ago

Exactly.

If the court had found any other way, then any time the chatbot made a mistake, they could just wash their hands of it and let the consumer take the hit.

This means they are responsible for what the chatbot says, which is at least moderately sane.

[-] nova_ad_vitum@lemmy.ca 4 points 10 months ago

If the court had found any other way, then any time the chatbot made a mistake, they could just wash their hands of it and let the consumer take the hit.

It would have been just a matter of time before the chatbot started making "mistakes" that financially benefited the company more and more.

This means they are responsible for what the chatbot says, which is at least moderately sane.

Does this decision carry any precedent? It was a tribunal, not a court.

[-] ahal@lemmy.ca 16 points 10 months ago

Nothing to do with the money and everything to do with the precedent. Glad it didn't work out for them.

this post was submitted on 15 Feb 2024
268 points (98.6% liked)
