• 0 Posts
  • 61 Comments
Joined 1 year ago
Cake day: June 10th, 2023




  • Exactly. The big problem with LLMs is that they’re so good at mimicking understanding that people forget they don’t actually understand anything beyond language itself.

    The thing they excel at, and should be used for, is exactly what you say - a natural language interface between humans and software.

    Like in your example, an LLM doesn’t know what a cat is, but it knows what words describe a cat based on training data - and for a search engine, that’s all you need.
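
    The "words that describe a cat are all you need" idea is essentially embedding-based search. Here is a minimal sketch of it, using hand-made toy vectors for illustration; a real system would get these vectors from an embedding model, but the ranking logic is the same.

```python
# Toy sketch of embedding-based search: the model never has to "know"
# what a cat is -- it only needs vectors where cat-related words land
# close together. The vectors below are made up for illustration.
import math

embeddings = {
    "cat":    [0.90, 0.80, 0.10],
    "kitten": [0.85, 0.75, 0.15],
    "dog":    [0.70, 0.20, 0.30],
    "car":    [0.10, 0.10, 0.90],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def search(query, k=2):
    """Return the k words whose vectors sit closest to the query's."""
    q = embeddings[query]
    ranked = sorted(embeddings, key=lambda w: cosine(q, embeddings[w]),
                    reverse=True)
    return [w for w in ranked if w != query][:k]

print(search("cat"))  # "kitten" ranks closest, "car" furthest
```

    That's the whole trick: the search engine never reasons about cats, it just measures which stored vectors point in roughly the same direction as the query's.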















  • I feel like in most cases if a product has such bad reviews that it kills the company that made it, there’s a good reason for that.

    Of course there are exceptions, and a reviewer is expected to do their due diligence to make sure they’re giving an honest, accurate, and reasonable review. But no company should be shielded from being told its product isn’t good if it isn’t.