• ben16w@lemmy.world · 5 months ago

    How was it supposed to work? Was it supposed to scan received dick pics for anything gross? Because people do have eyes they could use…

  • rmuk@feddit.uk · 5 months ago

    I read “daters” as “dealers” and I ran the whole gamut of emotions in about a half second.

  • Snapz@lemmy.world · 5 months ago

    I think this is a valuable app… Not the app itself, but an API that other dating apps could link to, allowing you to filter out anyone with poor enough judgement to have sent photos of their crotch to this company.

  • andallthat@lemmy.world · 5 months ago

    I have to admit it was a solid idea, though. Dick pics should be one of the best training sets you can find on the internet, and you can assume that the most prolific senders are the ones with the lowest chance of having an STI (or any real-life sexual activity).

  • AutoTL;DR@lemmings.world · 5 months ago

    This is the best summary I could come up with:


    HeHealth’s AI-powered Calmara app claimed, “Our innovative AI technology offers rapid, confidential, and scientifically validated sexual health screening, giving you peace of mind before diving into intimate encounters,” but now it’s shut down after an inquiry by the Federal Trade Commission (FTC).

    The letter lays out some of the agency’s concerns with the information HeHealth relied on for its claims, including its claim that it could detect more than 10 sexually transmitted infections with up to 94 percent accuracy.

    Given that most STIs are asymptomatic, according to the World Health Organization, medical professionals have questioned the reliability of the app’s tactics.

    One Los Angeles Times investigation found that Calmara couldn’t even recognize inanimate objects for what they were and failed to identify “textbook images” of STIs.

    The FTC issued a civil investigative demand (similar to a subpoena) seeking information about Calmara’s advertising claims and privacy practices and put HeHealth on notice that it’s illegal to make health benefit claims without “reliable scientific evidence.”

    The FTC said it would not pursue the investigation further since HeHealth agreed to shut Calmara down and because of “the small number of Calmara users and sales in the U.S.” But, it warned, “The Commission reserves the right to take such further action as the public interest may require.”


    The original article contains 523 words, the summary contains 207 words. Saved 60%. I’m a bot and I’m open source!