No, not racist. But a reality for a 91-year-old with hearing aids.

This almost sounds racist
Where we'll probably end up is having multiple AI agents behind the scenes that co-operate: the first helps you with high-level planning about where to go and for how long, a second fills in the details, and a third verifies things and makes sure the first two weren't hallucinating.

Yeah, fair points about double-checking the details. I always verify anything important, like flight times or bookings, through the actual airline/hotel site anyway. I don't use it for flight or hotel pricing, as I find it's not reliable. AI is brilliant for the planning stage, though, like helping me figure out rough itineraries or what areas to stay in.
The AI blocker thing isn't really how it works, from what I understand. Those are more about stopping companies from scraping websites for training data, not about blocking us from using ChatGPT or Claude to access their website info as a personal travel assistant. But I get the caution around accuracy, that's definitely real and worth keeping in mind.
I find it's a massive time-saver for the initial research phase. Then I treat it like I would any other travel blog or forum post; helpful info, but always verify the specifics yourself before committing to anything.
AI, like many things in life, is overrated. Google is more interested in harvesting data about your spending habits than in helping you plan anything. I tried to use Google AI to plan for an upcoming holiday next month; it kind of worked, but I'll do my own research as well.
I also tried using it to search for award flight availability, but no luck.
I’ve found the same. I occasionally give it another go, hoping it’s improved, but it seems to consistently come up with incorrect information. I think the smug confidence with which the misinformation is delivered is the most infuriating part of these AI systems. If you point out an error, there’s a grovelling apology and then more misinformation is spat out. You really have to already be knowledgeable about the subject to pick up on the errors, so it seems pretty pointless.

I'm regularly finding that ChatGPT and other AI tools still routinely spit out incorrect travel-related information. Just the other day, Google's AI Overview suggested choosing Qantas status over Velocity status because Qantas offers family pooling.
The annoying thing is that the incorrect information is often presented with an unjustified tone of confidence, and people believe it. I regularly get questions from people saying they read something from an AI and asking why the thing it told them to do isn't working. The reason is usually that the AI gave them bad advice.
I personally wouldn't rely on AI for travel advice, at least not yet.
And so it all becomes a merry-go-round: the incorrect, inaccurate, or misleading information in an article automatically becomes a training source for the AI models, so the models eat it up and use it to further train themselves, but they are feeding on wrong information and don't know it. They then reproduce this new learning as fact in a new article, which goes back into the loop to feed and train the models again.

I’ve found the same. I occasionally give it another go, hoping it’s improved, but it seems to consistently come up with incorrect information. I think the smug confidence with which the misinformation is delivered is the most infuriating part of these AI systems. If you point out an error, there’s a grovelling apology and then more misinformation is spat out. You really have to already be knowledgeable about the subject to pick up on the errors, so it seems pretty pointless.
Sadly, so many publications are using AI to generate travel stories now, so those errors are creeping into publications that were traditionally more trustworthy. I read an article in a magazine recently announcing new accommodation at Cradle Mountain, making a big deal about how no new accommodation had opened at Cradle for a long time. And yet when you read further, it said the accommodation was on Lake St Clair, which is either a six-day walk away or a three-hour winding drive down the West Coast. It was a very AI-style error, and I was surprised that an established lifestyle magazine had published such slop.
