Using Kohler's Poop-Analysis Camera? Double-Check This Key Privacy Setting First

In October, Kohler launched Dekoda, a camera that attaches to a toilet and uses AI to examine your poop. Some say you can’t put a price on good gut health, but the Dekoda costs $599 for the device, plus a subscription fee that ranges from $70 to $156 per year.
But after a blog post published this week raised questions about Kohler’s data practices for its new toilet gadget, the company was forced to explain what it means by “encrypted” data for customers, and what its policy is for training its algorithms on their… uh… waste information. And it’s not as straightforward as it initially seemed.
On its website, Kohler says Dekoda “analyzes gut health and hydration and detects the presence of blood in the toilet bowl, providing data for building healthy habits.”
On the same webpage, Kohler highlights the gadget’s privacy features. It says that the camera only ever points down into the toilet bowl, that it offers optional fingerprint authentication via the Dekoda remote, and that, “our technology is designed to keep your personal data personal. It is end-to-end encrypted.”
But is “end-to-end” encryption as Kohler defines it what its customers might expect?
The blog post published by security researcher Simon Fondrie-Teitler raised questions about what that encryption entails and pointed out that Kohler would likely have access to the data and images collected by Dekoda.
“Responses from the company make it clear that — contrary to common understanding of the term — Kohler is able to access data collected by the device and associated application,” he wrote.
Kohler responds to privacy concerns
Kohler itself appeared to confirm this notion in a statement it shared with CNET.
“The term end-to-end encryption is often used in the context of products that enable a user (sender) to communicate with another user (recipient), such as a messaging application. Kohler Health is not a messaging application,” the statement said. “In this case, we used the term with respect to the encryption of data between our users (sender) and Kohler Health (recipient).”
The company went on to say: “We encrypt data end-to-end in transit, as it travels between users’ devices and our systems, where it is decrypted and processed to provide and improve our service. We also encrypt sensitive user data at rest, when it’s stored on a user’s mobile phone, toilet attachment and on our systems.”
In other words, the data Dekoda collects is encrypted in transit, but can be decrypted by the company on its end.
As for how the company uses the data to train its AI systems, Kohler said in the same statement: “If a user consents (which is optional), Kohler Health may de-identify the data and use the de-identified data to train the AI that drives our product. This consent check-box is displayed in the Kohler Health app, is optional and is not pre-checked.”
Based on Kohler’s statement, the company will remove information that pairs a user’s identity with the data before it’s used for optional AI model training.
The meaning of ‘encrypted’
This may cause confusion for people familiar with the kind of end-to-end encryption offered by services such as Signal or Apple. There, the expectation is that the company has no access to the data users transmit through its service, and no technological means of decrypting it.
What Kohler is doing differs from the expectation, as Fondrie-Teitler points out in his post: “What Kohler is referring to as E2EE here is simply HTTPS encryption between the app and the server, something that has been basic security practice for two decades now, plus encryption at rest.”
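The distinction the researchers are drawing comes down to who holds the decryption key. A minimal sketch of that idea, using a toy XOR cipher purely as a stand-in for real encryption (TLS or the Signal protocol), with hypothetical variable names:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR "cipher" standing in for real encryption.
    # The point illustrated is WHO holds the key, not the algorithm.
    return bytes(b ^ k for b, k in zip(data, key))

reading = b"gut-health scan #42"  # hypothetical sensor data

# --- Encryption in transit (what Kohler describes) ---
# The session key is shared between the app and the company's server,
# so the server can decrypt and process the plaintext on arrival.
transit_key = secrets.token_bytes(len(reading))
ciphertext = xor_cipher(reading, transit_key)
server_view = xor_cipher(ciphertext, transit_key)  # server holds transit_key
assert server_view == reading  # the company sees the data

# --- End-to-end encryption (the Signal-style expectation) ---
# The key lives only on the users' devices; the server relays
# ciphertext it has no key to decrypt.
device_key = secrets.token_bytes(len(reading))
e2e_ciphertext = xor_cipher(reading, device_key)
# Without device_key, the server only ever holds unreadable bytes.
assert xor_cipher(e2e_ciphertext, device_key) == reading
```

In both cases the bytes on the wire are scrambled; the difference is that under the first model the operator can recover the plaintext, and under the second it cannot.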
Security experts who spoke to CNET believe that the way Kohler describes “end-to-end” encryption might be confusing to customers.
Nico Dupont, the founder and CEO of the AI security company Cyborg.co, called the description “very misleading.”
“While (Kohler) clarifies that the data is encrypted from the device to their servers, this process is more commonly referred to as ‘encryption in transit,’” Dupont said. “End-to-end encryption usually suggests a sense of privacy which is characterized by servers not having access to the data, which is not the case here. While secure, it’s not private.”
Another executive in the security industry was even more blunt.
“End-to-end encryption literally has one job and one meaning: keep the company out of the middle. If the vendor can see it, analyze it or even take it to power AI features, then it is not at all ‘end-to-end,’” said Zbyněk Sopuch, CTO of data security company Safetica.
What Kohler is doing with the data, Sopuch said, isn’t unusual among internet-connected devices. But describing it the way Kohler has is problematic and could imply more privacy than users actually get, he said. “Encryption certainly helps prevent data interception, but it does not prevent internal or third-party access,” he said. “Data controls are really a separate issue.”
Kohler didn’t respond directly to questions about Fondrie-Teitler’s post beyond sharing the company statement.
Source: CNET