
Why Are Kindle Colorsofts Turning Yellow?



In physical books, yellowing pages are usually a sign of age. But brand-new users of Amazon’s Kindle Colorsoft, the tech giant’s first color e-reader, are already noticing yellow hues appearing at the bottoms of their displays.

Since the complaints began to trickle in, Amazon has reportedly suspended shipments and announced that it is working to fix the issue. (As of publication of this article, the US $280 Kindle had an average 2.6 star rating on Amazon.) It’s not yet clear what is causing the discoloration. But while the issue is new—and unexpected—the technology is not, says Jason Heikenfeld, an IEEE Fellow and engineering professor at the University of Cincinnati. The Kindle Colorsoft, which became available on 30 October, uses “a very old approach,” says Heikenfeld, who previously worked to develop the ultimate e-paper technology. “It was the first approach everybody tried.”

Amazon’s e-reader uses reflective display technology developed by E Ink, a company that started in the 1990s as an MIT Media Lab spin-off before developing its now-dominant electronic paper displays. E Ink is used in Kindles, as well as top e-readers from Kobo, reMarkable, Onyx, and more. E Ink first introduced Kaleido—the basis of the Colorsoft’s display—five years ago, though the road to full-color e-paper started well before.

How E-Readers Work

Monochromatic Kindles work by applying voltages to electrodes in the screen that bring black or white pigment to the top of each pixel. Those pixels then reflect ambient light, creating a paperlike display. To create a full-color display, companies like E Ink added an array of filters just above the ink. This approach didn’t work well at first because the filters lost too much light, making the displays dark and low resolution. But with a few adjustments, Kaleido was ready for consumer products in 2019. (Other approaches—like adding colored pigments to the ink—have been developed, but these come with their own drawbacks, including a higher price tag.)
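The light loss Heikenfeld describes is easy to see with back-of-the-envelope arithmetic. The sketch below uses illustrative numbers, not E Ink's actual specifications: each subpixel's color filter passes roughly a third of the ambient spectrum, and light crosses the filter twice, once on the way in and once after bouncing off the reflective ink layer.

```python
# Illustrative model of why a filter array darkens a reflective display.
# All numbers are assumptions for illustration, not E Ink specs.
pigment_reflectance = 0.45   # white-state reflectance of the monochrome ink layer
filter_transmission = 0.85   # per-pass transmission within a filter's own color band
band_fraction = 1 / 3        # each subpixel passes roughly one third of the spectrum

# Ambient light crosses the filter twice: once going in, once coming back out.
white_brightness = band_fraction * filter_transmission**2 * pigment_reflectance
print(f"Effective white reflectance: {white_brightness:.1%}")
```

Even with generous assumptions, only about a tenth of the incoming light makes it back out, which is why early filter-based color e-paper looked dark until designs were adjusted.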

Given this design, it initially seemed to Heikenfeld that the issue stemmed from the software, which determines the voltages applied to each electrode. This aligned with reports from some users that the issue appeared after a software update.

But industry analyst Ming-Chi Kuo suggested in a post on X that the issue is due to the e-reader’s hardware. Amazon switched the optically clear adhesive (OCA) used in the Colorsoft to a material that may not be so optically clear. In its announcement of the Colorsoft, the company boasted “custom formulated coatings” that would enhance the color display as one of the new e-reader’s innovations.

In terms of resolving the issue, Kuo’s post also stated that “While component suppliers have developed several hardware solutions, Amazon seems to be leaning toward a software-based fix.” Heikenfeld is not sure how a software fix would work, apart from blacking out the bottom of the screen.

Amazon did not reply to IEEE Spectrum’s request for comment. In an email to IEEE Spectrum, E Ink stated, “While we cannot comment on any individual partner or product, we are committed to supporting our partners in understanding and addressing any issues that arise.”

The Future of E-Readers

It took a long time for color Kindles to arrive, and reflective e-reader displays aren’t likely to improve much from here, according to Heikenfeld. “I used to work a lot in this field, and it just really slowed down at some point, because it’s a tough nut to crack,” Heikenfeld says.

There are inherent limitations and inefficiencies to working with filter-based color displays that rely on ambient light, and there’s no Moore’s Law for these displays. Instead, their improvement is asymptotic—and we may already be close to the limit. Meanwhile, displays that emit light, like LCD and OLED, continue to improve. “An iPad does a pretty damn good job with battery life now,” says Heikenfeld.

At the same time, he believes there will always be a place for reflective displays, which remain a more natural experience for our eyes. “We live in a world of reflective color,” Heikenfeld says.

This story was updated on 12 November 2024 to correct that Jason Heikenfeld is an IEEE Fellow.

This Eyewear Offers a Buckshot Method to Monitor Health



Emteq Labs wants eyewear to be the next frontier of wearable health technology.

The Brighton, England-based company introduced today its emotion-sensing eyewear, Sense. The glasses contain nine optical sensors distributed across the rims that detect subtle changes in facial expression with more than 93 percent accuracy when paired with Emteq’s current software. “If your face moves, we can capture it,” says Steen Strand, whose appointment as Emteq’s new CEO was also announced today. With that detailed data, “you can really start to decode all kinds of things.” The continuous data could help people uncover patterns in their behavior and mood, similar to an activity or sleep tracker.

Emteq is now aiming to take its tech out of laboratory settings with real-world applications. The company is currently producing a small number of Sense glasses, and they’ll be available to commercial partners in December.

The announcement comes just weeks after Meta and Snap each unveiled augmented reality glasses that remain in development. These glasses are “far from ready,” says Strand, who led the augmented reality eyewear division while working at Snap from 2018 to 2022. “In the meantime, we can serve up lightweight eyewear that we believe can deliver some really cool health benefits.”

Fly Vision Vectors

While current augmented reality (AR) headsets have large battery packs to power the devices, glasses require a lightweight design. “Every little bit of power, every bit of weight, becomes critically important,” says Strand. The current version of Sense weighs 62 grams, slightly heavier than the Ray-Ban Meta smart glasses, which weigh in at about 50 grams.

Because of the weight constraints, Emteq couldn’t use the power-hungry cameras typically used in headsets. With cameras, motion is detected by looking at how pixels change between consecutive images. The method is effective, but captures a lot of redundant information and uses more power. The eyewear’s engineers instead opted for optical sensors that efficiently capture vectors when points on the face move due to the underlying muscles. These sensors were inspired by the efficiency of fly vision. “Flies are incredibly efficient at measuring motion,” says Emteq founder and CSO Charles Nduka. “That’s why you can’t swat the bloody things. They have a very high sample rate internally.”
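To make the efficiency contrast concrete, here is a toy comparison in Python. The frame size and sensor count are assumptions for illustration (a VGA-resolution camera versus nine sensors that each report one 3D motion vector per sample):

```python
import numpy as np

rng = np.random.default_rng(0)

# Camera approach: difference two full frames to find motion.
# Most pixels carry no motion information, yet all get processed.
frame_a = rng.random((480, 640))
frame_b = frame_a.copy()
frame_b[100:110, 200:210] += 0.5           # small patch of facial movement
diff = np.abs(frame_b - frame_a)
print("values per camera comparison:", diff.size)

# Sparse-sensor approach: nine sensors, one 3D vector each.
vectors = rng.random((9, 3))
print("values per sensor sample:", vectors.size)
```

The camera comparison touches 307,200 values to locate a 100-pixel movement; the vector readout handles 27. That gap is the power and bandwidth argument for skipping cameras in lightweight eyewear.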

Sense glasses can capture data as often as 6,000 times per second. The vector-based approach also adds a third dimension to a typical camera’s 2D view of pixels in a single plane.

These sensors look for activation of facial muscles, and the area around the eyes is an ideal spot. While it’s easy to suppress or force a smile, the upper half of our face tends to have more involuntary responses, explains Nduka, who also works as a plastic surgeon in the United Kingdom. However, the glasses can also collect information about the mouth by monitoring the cheek muscles that control jaw movements, conveniently located near the lower rim of a pair of glasses. The collected data is transmitted from the glasses and run through Emteq’s algorithms, which translate the vector data into usable information.

In addition to interpreting facial expressions, Sense can be used to track food intake, an application discovered by accident when one of Emteq’s developers was wearing the glasses while eating breakfast. By monitoring jaw movement, the glasses detect when a user chews and how quickly they eat. Meanwhile, a downward-facing camera takes a photo of the food, and a large language model identifies what’s in the photo, effectively making food logging a passive activity. Currently, Emteq uses an instance of OpenAI’s GPT-4 to accomplish this, but the company plans to create its own algorithm in the future. Other applications, including monitoring physical activity and posture, are also in development.
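As a rough illustration of the jaw-monitoring idea, a chew counter can be as simple as counting rhythmic peaks in a motion signal. Everything below, including the signal shape, sample rate, and threshold, is a hypothetical sketch rather than Emteq's actual pipeline:

```python
import numpy as np

# Simulated jaw-movement trace: chewing shows up as a rhythmic
# oscillation (here ~1.5 Hz). All values are illustrative assumptions.
fs = 100                                 # samples per second
t = np.arange(0, 10, 1 / fs)             # ten seconds of data
signal = np.sin(2 * np.pi * 1.5 * t)     # simulated chewing motion

# Count chews as upward crossings of a threshold: each chewing
# cycle pushes the signal above the threshold exactly once.
threshold = 0.5
above = signal > threshold
chews = np.count_nonzero(above[1:] & ~above[:-1])
print("chews detected:", chews)          # 15 cycles in ten seconds
```

A real system would add filtering and adaptive thresholds to reject talking and head motion, but the core idea, counting periodic events in a 1D signal, is this simple.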

One Platform, Many Uses

Nduka believes Emteq’s glasses represent a “fundamental technology,” similar to how the accelerometer is used for a host of applications in smartphones, including managing screen orientation, tracking activity, and even revealing infrastructure damage.

Similarly, Emteq has chosen to develop the technology as a general facial data platform for a range of uses. “If we went deep on just one, it means that all the other opportunities that can be helped—especially some of those rarer use cases—they’d all be delayed,” says Nduka. For example, Nduka is passionate about developing a tool to help those with facial paralysis. But a specialized device for those patients would have high unit costs and be unaffordable for the target user. Allowing more companies to use Emteq’s intellectual property and algorithms will bring down cost.

In this buckshot approach, the general target for Sense’s potential use cases is health applications. “If you look at the history of wearables, health has been the primary driver,” says Strand. The same may be true for eyewear, and he says there’s potential for diet and emotional data to be “the next pillar of health” after sleep and physical activity.

How the data is delivered is still to be determined. In some applications, it could be used to provide real-time feedback—for instance, vibrating to remind the user to slow down eating. Or the data could go only to health professionals, collecting a week’s worth of at-home measurements for patients with mental health conditions, which Nduka notes largely lack objective measures. (As a medical device for treatment of diagnosed conditions, Sense would have to go through a more intensive regulatory process.) While some users are hungry for more data, others may require a “much more gentle, qualitative approach,” says Strand. Emteq plans to work with expert providers to appropriately package information for users.

Interpreting the data must be done with care, says Vivian Genaro Motti, an associate professor at George Mason University who leads the Human-Centric Design Lab. What expressions mean may vary based on cultural and demographic factors, and “we need to take into account that people sometimes respond to emotions in different ways,” Motti says. With little regulation of wearable devices, she says it’s also important to ensure privacy and protect user data. But Motti raises these concerns because there is a promising potential for the device. “If this is widespread, it’s important that we think carefully about the implications.”

Privacy is also a concern for Edward Sazonov, a professor of electrical and computer engineering at the University of Alabama, who developed a similar device for dietary tracking in his lab. Having a camera mounted on Emteq’s glasses could pose issues, both for the privacy of those around a user and for a user’s own personal information. Many people eat in front of their computer or cell phone, so sensitive data may be in view.

For technology like Sense to be adopted, Sazonov says questions about usability and privacy concerns must first be answered. “Eyewear-based technology has potential for a great future—if we get it right.”
