On Tuesday, Elwood Edwards, the voice behind America Online's iconic "You've got mail" greeting, died at age 74, one day before his 75th birthday, according to Cleveland's WKYC Studios, where he worked for many years. The greeting became a cultural touchstone of the early Internet era in the 1990s and early 2000s; hundreds of millions of users heard it when they logged in to the service and new email was waiting for them.
The story of Edwards' famous recording began in 1989 when Steve Case, CEO of Quantum Computer Services (which later became America Online—or AOL for short), wanted to add a human voice to the company's Quantum Link online service. Karen Edwards, who worked as a customer service representative, heard Case discussing the plan and suggested her husband Elwood, a professional broadcaster.
Edwards recorded the famous phrase (and several others) into a cassette recorder in his living room in 1989 and was paid $200 for the service. His voice recordings of "Welcome," "You've got mail," "File's done," and "Goodbye" went on to reach millions of users during AOL's rise to dominance in the 1990s online landscape.
Yelp, which made a name for itself giving restaurant recs, just bought an auto services website. In the company’s earnings report on Thursday, Yelp revealed that it agreed to buy RepairPal, a site for car repair estimates, for $80 million in cash. The acquisition is expected to close by the end of the year, subject […]
If you've been wanting an electric car but everything seems too expensive, there's some good news on the horizon. A whole lot of EV leases are due to expire in 2026, which should lead to something of a glut, according to data analyzed by JD Power.
We have the revised IRS clean vehicle tax credit to thank. It was revamped under the Inflation Reduction Act, and tough new battery sourcing rules plus a requirement for final assembly in North America mean far fewer EVs are eligible for the tax credit when bought new. But a loophole that treats a leased vehicle as a commercial sale makes any leased EV eligible for the $7,500 incentive, which can now be subtracted from the price of the EV at the time of sale or leasing.
Since there's also no price cap on the EV or income cap on the buyer, leasing is often a better idea than purchasing outright when it comes to new EVs, particularly for people who are worried about long-term battery degradation. (In fact, data from older EVs suggests that fear is overblown, with the exception of the early Nissan Leaf, which lacks active battery cooling.)
CaHill Resources and subsidiary CAHill TECH have a mission to solve the labor gap in the trade space, with an initial focus on heavy highway, road, and bridge construction.
The company is committed to the vision of making trade-based training available to anyone, anytime. Using a digital platform and mobile application, they provide risk reduction and operational savings to construction companies that employ millions of frontline workers.
aQuiRe™ offers over 350 modules in its library, empowering users with knowledge on subjects such as Site Operations, Machine Inspection & Maintenance, and OSHA & Field Safety, among others.
In addition, the aQuiRe Construction Academy serves the "entry organizations" side of the market and offers diverse learning materials, including module videos, resources, and quizzes, to cater to different learning styles. Whether a participant learns best through visual, auditory, or written means, the program provides an array of resources to support their unique needs. CaHill Resources is a certified WBE-DBE organization.
Currently, they support 29 municipal and private clients in New York State. They are also on the Eligible Training Provider List (ETPL), which will help recruit students to complete the aQuiRe Construction Academy and receive construction training.
Upon completing a set of modules, learners earn a badge or micro-credential signifying their achievement in the related modules of study. As participants progress and complete multiple micro-credentials within the same library, they can earn stackable credentials, which demonstrate that they have acquired valuable knowledge and skills in construction training.
For these reasons and more, CaHill Resources was named a Cool Tool Award finalist for "Best Badging/Credentialing Solution" in The EdTech Awards 2024, and aQuiRe™ was named a Cool Tool Award winner for "Best Mobile App Solution" in The EdTech Awards 2022 from EdTech Digest. Learn more.
A product leader for a major edtech shares her unique perspective—and a significant opportunity for schools today.
GUEST COLUMN | by Shivani Stumpf
Supporting students in navigating the complex landscape of career options and the various pathways to turn their dreams into reality remains a significant opportunity for schools today. According to a recent survey from the ECMC Group, only 13% of students feel fully prepared to choose their path after high school.
'…only 13% of students feel fully prepared to choose their path after high school. …How can we help students understand the spectrum of relevant postsecondary choices and empower every learner to choose their best path?'
This is particularly crucial now, as school counselors are managing an average caseload of over 400 students. Burdened by administrative duties and critical responsibilities like behavioral and mental health interventions, today’s counselors have less time to provide one-on-one college and career guidance.
How can we help students understand the spectrum of relevant postsecondary choices and empower every learner to choose their best path? Moreover, how can we equip school counselors with the resources to enhance their efforts?
A new wave of generative AI assistants is emerging to tackle these challenges. Natural language AI tools, built responsibly, can significantly empower students to make informed decisions about their futures in a way that is personalized to their individual circumstances and actionable—while allowing counselors to concentrate on providing high-impact support.
Starting young helps to break down barriers while students explore
Today's AI tools, when developed using Responsible AI guidelines, can significantly enhance this process. For elementary school students, play-based interactions allow them to explore careers while accessing age-appropriate assessments that help identify their interests. And when teachers incorporate career information into classroom activities, students become more aware of how their learning connects to future careers, which ultimately enhances academic achievement: students who understand the relevance of what they learn can see how it will benefit them in their careers.
High school students need more comprehensive support
As students advance through middle and high school, ongoing exploration of career pathways and participation in work-based learning opportunities, such as internships and career fairs, can help keep them on track. With an on-demand, personalized AI assistant, whenever they encounter a new career of interest, they can interact with the tool to gain a better understanding of the role, including its responsibilities, salary, demand, and advice on how to pursue that career path.
These tools not only assist students in discovering potential careers and colleges, but also empower them to apply for financial aid and identify scholarships that align with their achievements and aspirations.
For instance, PowerBuddy for College and Career, the responsibly built generative AI assistant integrated within Naviance, one of the most widely used college, career, and life readiness (CCLR) solutions in the country, provides students with personalized guidance based on a multitude of factors. These factors include GPA, assessments, career interests, location preferences, aptitudes, personal goals, military interests, and scholarship qualifications. With PowerBuddy, students can craft a personalized postsecondary plan that highlights their ideal careers; the necessary skills, certifications, training, and education; and specific pathways to achieve their goals. This comprehensive support not only enhances decision-making but also paves the way for their future success.
Improving access to school resources can also boost engagement
AI is redefining how districts interact with their communities and stakeholders. Today’s AI tools can easily integrate into a district or school website, including parent portals, communication platforms, student information systems, e-learning platforms, analytics tools, and community engagement sites. AI assistants can help students and caregivers find information about policies, athletic schedules, after-school programs, student handbooks, school calendars, lunch menus, job postings and more.
These tools represent a significant leap in empowering people with efficient, secure, and personalized access to critical information. Through natural language interactions, they can eliminate what has traditionally been a cumbersome barrier for students and families — time-consuming searches sifting through information online, or phone calls that tie up school staff.
AI also offers accommodations such as speech-to-text, text-to-speech, and speech-to-speech functionality — and the ability to operate in dozens of languages — which can help schools provide equitable access for all users.
One of our recent survey findings revealed that families, particularly mothers, play a significant role in their children’s post-secondary decisions. AI tools can increase access to the information available to parents, aiding them as they guide their children through various options. Furthermore, the capability of AI to provide this information in the languages spoken at home is crucial for increasing access and support.
AI can help students maximize their full potential
Achieving any goal is rarely a straightforward journey. When students are informed about a variety of career opportunities, they can pivot and explore different paths to discover the best fit for themselves. With an AI assistant that comprehends their specific educational and career journeys, students will receive enhanced, personalized support in evaluating their options and making informed decisions about their futures.
The power of AI is already making its way into schools as leaders realize its potential. According to our own 2024 Education Focus Report, 70% of district leaders now believe AI can enhance teaching and learning — up from 53% in 2023 — and 60% of school leaders and educators believe AI can enhance teacher practice and development.
These tools, when developed and used responsibly, hold remarkable potential to help young learners reach their goals—and often inspire them to aim even higher. In this sense, AI is not merely an accessory for a progressive school district; it is a fundamental element in improving educational outcomes and fostering meaningful engagement for everyone.
—
Shivani Stumpf is Chief Product and Innovation Officer at PowerSchool. Connect with Shivani on LinkedIn.
Archon Biosciences, a biotech startup putting AI to work designing novel biomolecules, has just emerged from stealth with an impressive $20 million in seed funding. The company aims to supercharge antibody treatments using specially designed protein “cages” that multiply their effects, opening up new opportunities in drug development. This is the first company to be […]
Several people pointed me to an interesting Instagram video AMA (ask me anything) by Meta CTO Andrew Bosworth on October 21, 2024, that appeared to challenge my October 6th article, Meta Orion AR Glasses (Pt. 1 Waveguide), which discussed both transparency and "Eye Glow" (what Bosworth referred to as "Blue Glow"). Challenge accepted.
On the right is a Google Search for "Meta" [and] "Orion" [and] "Eye Glow" OR "Blue Glow" from Sept 7th (before Orion was announced) through Oct 28, 2024. Everything pertinent to the issue was from this blog or was citing this blog. A Google Search for "Meta Orion" and "blue glow" returns nothing.
As far as I can find, this blog (and a few other sites citing this blog) has been the only one reporting on Meta Orion’s transparency or Eye Glow. So when Bosworth said, “Another thing that was kind of funny about the reviews is people were like, oh, you know you can see the blue glow well,” who else could he be referring to?
Housekeeping – Parts 2 and 3 of the Snap and Orion Roundtable are to be released soon.
The rest of the two-hour roundtable discussion about Snap Spectacles and Meta Orion should be released soon. Part 2 will focus on Meta Orion. Part 3 will discuss more applications and market issues, along with some scuttlebutt about Meta’s EMG wristband controller.
Bosworth’s Statement on Transparency and Eye Glow in Instagram AMA Video – Indirect Shout Out to this Blog
Below is a computer transcription (with minor edits to clean up the speech-to-text and add punctuation and capitalization) of Bosworth's October 21, 2024, AMA on Instagram, starting at about 14:22 into the video.
14:22 Question: What % of light does Orion block from your view of the world, how much is it darkened?
I don't know exactly. So, all glass limits transmission to some degree. So, even if you have completely clear glasses, you know, maybe they take you from 100% transmission at your eyes to like 97%, um, and normal sunglasses that you have are much darker than you think; they're like 17% transmissive is like a standard for sunglasses. Orion is clear. It's closer [to clear], I don't know what the exact number is, but it's closer to regular prescription glasses than any kind of glasses [in context, he sounds like he is referring to other AR glasses]. There's no tint on it [Orion]. We did put tint on a couple of demo units so we could see what that looked like, but that's not how they [Orion] work.
I won’t get into the electrochromic and that kind of stuff. Some people were theorizing that they were tinted to increase contrast. This is not uncommon [for AR] glasses. We’re actually quite proud that these were not. If I was wearing them, and you’re looking at my eyes, you would just see my eyes.
Note that Bosworth mentioned electrochromic [dimming] but “won’t get into it.” As I stated in Orion Part 1, I believe Orion has electrochromic (electrically controlled) dimming. While not asked, Bosworth gratuitously discusses “Blue Glow,” which in context can only mean “Eye Glow.”
Another thing that was kind of funny about the reviews is people were like, oh, you know, you can see the blue glow well. What we noticed was so funny was the photographers from the press who were taking pictures of the glasses would work hard to get this one angle, which is like 15 degrees down and to the side, where you do see the blue glow. That's what we're actually shunting the light to. If you're standing in front of me looking at my eyes, you don't see the glow; you just see my eyes. We really worked hard on that; we're very proud of it.
But of course, if you're the person who's assigned by some journalist outfit to take pictures of these new AR glasses, you want to have pictures that look like you can see something special or different about them. It was so funny, as every outlet included that one angle. And if you look at them all now, you'll see that they're all taken from this one specific down and to the side angle.
As far as I can find (it’s difficult to search), this blog is the only place that has discussed the transparency percentage of Orion’s glasses (see: Light Transmission (Dimming?)). Also, as discussed in the introduction, this blog is the only one discussing eye glow (see Eye Glow) in the same article. Then, consider how asking about the percentage of light blockage caused Bosworth to discuss blue [eye] glow — a big coincidence?
But what caused me to write this article is the factually incorrect statement that the only place [the glow] is visible is from "15 degrees down and to the side." He doth protest too much, methinks.
Orion's glow is in pictures taken from more than "this one specific down and to the side angle"
To begin with, the image I show in Meta Orion AR Glasses (Pt. 1 Waveguide) is a more or less straight-on shot from a video by The Verge (right). It is definitely not shot from a "down and to the side angle."
In fact, I was able to find images with Bosworth in which the camera was roughly straight on, from down and to the side, and even looking down on the Orion glasses, in Bosworth's Sept. 25, 2024, Instagram video and in Adam Savage's Tested video (far right below).
In the same video from The Verge, there is eye glow with Mark Zuckerberg looking almost straight into the camera and from about eye level to the side.
The eye glow was even captured by the camera of another Orion headset worn while playing a pong-like game. The images below are composites of the Orion camera and what was shown in the glasses; thus, they are simulated views (and NOT through the Orion's waveguide). The stills are from The Verge (left) and CNBC (right).
Below are more views of the eye glow (mostly blue in this case) from the same video by The Verge.
The eye glow still frames below were captured from a CNBC video.
Here are a few more examples of eye glow that were taken while playing the pong-like game from roughly the same location as the CNBC frames above right. They were taken from about even with the glasses but off to the side.
In summary, there is plenty of evidence that the eye glow from Meta's Orion can be seen from many different angles, and not just from the one down-and-to-the-side angle that Bosworth claims.
Meta Orion’s Transparency and Electrochromic Dimming
Bosworth’s deflection on the question of Orion’s light transmission
Bosworth started by correctly saying that nothing manmade is completely transparent. Typical (uncoated) glass reflects about 8% of the light, while eyeglasses with good antireflective coatings reflect about 0.5%. The ANSI/ISEA Z87.1 safety glasses standard specifies "clear" as >85% transmission. Bosworth appears to catch himself, knowing that there is a definition for clear, and says that Orion is "closer to clear" than sunglasses at about 17%.
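As a sanity check on those numbers, here is a minimal back-of-the-envelope sketch. It assumes normal incidence, an index of 1.5 for typical ophthalmic glass, and ~0.25% residual reflection per AR-coated surface; these are textbook values, not anything from Bosworth or Meta.

```python
# Fresnel reflection at one air/glass surface, normal incidence,
# ignoring absorption (reasonable for thin eyeglass lenses).
def fresnel_reflectance(n: float) -> float:
    """Reflectance of a single air/glass interface at normal incidence."""
    return ((n - 1) / (n + 1)) ** 2

n_glass = 1.5                              # typical ophthalmic glass (assumed)
r = fresnel_reflectance(n_glass)           # ~0.04 per surface
t_uncoated = (1 - r) ** 2                  # two surfaces: ~0.92
print(f"per-surface reflection: {r:.1%}")          # ~4.0%
print(f"uncoated lens transmission: {t_uncoated:.0%}")  # ~92%, i.e. ~8% lost

# With a good anti-reflective coating (~0.25% per surface, assumed):
t_coated = (1 - 0.0025) ** 2               # ~99.5%
print(f"AR-coated lens transmission: {t_coated:.1%}")
```

The two-surface arithmetic reproduces the ~8% loss for plain glass and ~0.5% with AR coatings cited above.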
Bosworth then says there is "no tint" in Orion, but respectfully, that was NOT the question. He then says, "I won't get into the electrochromic and that kind of stuff," which is likely a major factor in Orion's light transmission. Any dimming technology I know of is going to block much more light than a typical waveguide. The transparency of Orion is a function of the waveguide, the dimming layer, other optics layers, and the inner and outer protective covers.
Since Bosworth evaded answering the question, I will work through it and try to get an answer. The process will include trying to figure out what kind of dimming I think Orion uses.
What type of electrochromic dimming is Orion using?
First, I want to put in context what my first article was discussing regarding Orion's Light Transmission (Dimming?). I was well aware that diffractive waveguides, even glass ones, alone are typically about 85-90% transmissive. From various photographs, I'm pretty sure Orion has some form of electrochromic dimming, as I stated in the first article. I could see the dimming change in one video, and in a view of the exploded parts, there appeared to be a dimming device. In that figure, the dimming device seems fairly transparent, on the order of the waveguides and other flat optics. What I was trying to figure out was whether they were using the more common polarization-based dimming or a non-polarization-based technology. The picture is inconclusive as to the type of dimming used, as the dimmer identified (by me) might be only the liquid crystal part of the shutter, with the polarizers, if there are any, in the cover glass or not shown.
The Magic Leap 2 uses polarization-based dimming (see: Magic Leap 2 (Pt. 3): Soft Edge Occlusion, a Solution for Investors and Not Users). Polarization-based dimming is fast and gives a very wide range of dimming (from 10:1 to >100:1), but it requires the real-world light first to be polarized, and when everything is considered, it blocks more than 70% of the light. It's possible to get somewhat better transmission by using semi-polarizing polarizers, but that gives up a lot of dimming range to gain some transmission. Polarization also causes issues when looking at LCDs, such as computer monitors and some cell phones.
Non-polarization dimming (see, for example, CES & AR/VR/MR Pt. 4 – FlexEnable's Dimming, Electronic Lenses, & Curved LCDs) blocks less light in its most transmissive state but has less of a dimming range. For example, FlexEnable has a dimming cell that ranges from ~87% to 35% transmissive, or less than a 3:1 dimming range. Snap Spectacles 5 uses (based on a LinkedIn post that has since been removed) non-polarization-based electrochromic dimming by AlphaMicron, which they call e-Tint. Both AlphaMicron's e-Tint and FlexEnable's dimming use what is known as Guest-Host LC, which absorbs light rather than changing polarization.
Assuming Orion uses non-polarization dimming, I would assume that the waveguide and related optical surfaces have about 85-90% transmissivity and the non-polarization dimming about 70% to 80%. Since the two effects are multiplicative, that would put Orion in the range of 90% × 80% = 72% down to 85% × 70% ≈ 60%.
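To make the arithmetic explicit, below is a small sketch of that stack-up estimate. All the component transmissivities are this article's assumptions, not Meta specifications.

```python
# Combined transparency is the product of each layer's transmissivity.
waveguide_lo, waveguide_hi = 0.85, 0.90   # waveguide + related optical surfaces (assumed)
dimmer_lo, dimmer_hi = 0.70, 0.80         # non-polarization (Guest-Host LC) dimmer, clear state (assumed)

print(f"low estimate:  {waveguide_lo * dimmer_lo:.1%}")   # 0.85 * 0.70 = 0.595 (~60%)
print(f"high estimate: {waveguide_hi * dimmer_hi:.1%}")   # 0.90 * 0.80 = 0.720 (72%)

# For comparison, a polarization-based dimmer passes roughly 30% or less
# of unpolarized real-world light even when "clear," so a similar stack
# built that way would land around:
print(f"polarization-based stack: ~{0.875 * 0.30:.1%} or less")   # ~26%
```

The ~26% figure is why the comparison photos of polarizing glasses later in this article look so much darker than the CNET Orion images.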
Orion’s Dimming
Below are a series of images from videos by CNET, The Verge, and Bloomberg. Notice that CNET's image appears to be much more transmissive. For both CNET and The Verge, I included eye glow pictures from a few frames later in the videos to prove both glasses were turned on. CNET's Orion glasses are significantly more transparent than in any other Orion video I have seen (from over 10 I have looked at to date), even when looking at the same demos as in the videos. I missed this big difference when preparing my original article and only discovered it when preparing this article.
Below are some more frame captures on the top row. On the bottom row, there are pictures of the Lumus Maximus (the most transparent waveguide I have seen), the WaveOptics Titan, the Magic Leap One (with no tint), and circular polarizing glasses for comparison. The circular polarizing glasses are approximately what I would expect if the Orion glasses were using polarizing dimming.
Snap Spectacles 5, which uses non-polarization dimming, is shown on the left. It compares reasonably well to the CNET image. Based on the available evidence, it appears that Orion must also be using a non-polarization-based electrochromic dimming technology. Per my prior estimate, this would put Orion's best-case (CNET) transparency in the range of 60-70%.
What I don’t know is why CNET was so much more transparent than the others, even when they appear to be in similar lighting. My best guess is that the dimming feature was adjusted differently or disabled for the CNET video.
Why is Orion Using Electronic Dimming Indoors?
All the Orion videos I have seen indicate that Orion is adding electrochromic dimming when indoors. Even bright indoor lighting is much less bright than sunlight. Unlike the Snap Spectacles 5 (also with electronic dimming) demos, Meta didn't demo the unit outdoors. There can be several reasons, including:
The most obvious reason is the lack of display brightness.
For colors to “pop,” they need to be at least 8x brighter than the surroundings. Bright white objects in a well-lit room could be more than 50 nits. Maybe they couldn’t or didn’t want to go that bright for power/heat reasons.
Reduced heat of the MicroLEDs
Saves on battery life
Thinking about this issue made me notice that the walls in the demo room are painted a fairly dark color. Maybe it was a designer's decision, but it also goes to my saying that "demos are a magic show," and darker walls would make the AR display look better.
When this is added up, it suggests that the displays in the demos were likely outputting about 200 nits (just an educated guess). While ~200 nits would be a bright computer monitor, colors would be washed out in a well-lit room when viewed against a non-black background (monitors "bring their own black background"). Simply based on how they demoed it, I suspect that Snap Spectacles 5 is four to eight times brighter than Orion, with its dimming used so it can work outdoors (rather than indoors).
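For the sake of illustration, here is the rough arithmetic behind that guess, using only assumptions already stated in this article (50-nit white objects, an ~8x "pop" factor, and the 60-70% transparency estimate); none of these are measured Orion numbers.

```python
# Rough "how bright must the display be?" arithmetic.
# All inputs are this article's assumptions, not Orion measurements.
ambient_white_nits = 50   # bright white object in a well-lit room (assumed)
pop_factor = 8            # colors "pop" at roughly 8x the background (assumed)

# The background the wearer sees is scaled by the glasses' transparency,
# so electronic dimming directly reduces the display brightness needed.
for transparency in (1.00, 0.70, 0.60):
    needed = pop_factor * ambient_white_nits * transparency
    print(f"transparency {transparency:.0%}: display needs ~{needed:.0f} nits to pop")

# 100% -> ~400 nits, 70% -> ~280 nits, 60% -> ~240 nits.
# A ~200-nit display only gets close once the view is dimmed, which is
# one plausible reason Orion dims indoors.
```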
Conclusion and Comments
When I first watched Bosworth’s video, his argument that the eye glow could only be seen from one angle seemed persuasive. But then I went back to check and could easily see that what he stated was provably false. I’m left to speculate as to why he brought up the eye glow issue (as it was not the original question) and proceeded to give erroneous information. It did motivate me to understand Orion better😁.
Based on what I saw in the CNET picture and what is a reasonable assumption for the waveguide, non-polarizing dimmer, and other optics (with transparency being multiplicative and not additive), it pegs Orion in the 60% transparency range plus or minus about 5%.
Bosworth's answer on transparency was evasive; saying there was no "tint" was a non-answer. He mentioned electrochromic dimming but didn't say for sure that Orion was using it. In the end, he said Orion was closer to prescription glasses (which are about 90% transmissive uncoated and 99.5% with anti-reflective coatings) than sunglasses at 17%. If we take uncoated glasses at 90% and sunglasses at 17%, the midpoint between them would be about 53%, so Orion may be, at best, only slightly closer to uncoated eyeglasses than sunglasses. There are waveguide-based AR glasses that are more transparent (but without dimming) than Orion.
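Putting the "closer to prescription glasses than sunglasses" claim in numbers (a trivial sketch using the transmissivities above):

```python
# Is ~60% transparency "closer to regular prescription glasses than sunglasses"?
uncoated_glasses = 0.90   # ~90% transmissive, no AR coating
sunglasses = 0.17         # ~17% transmissive
midpoint = (uncoated_glasses + sunglasses) / 2
print(f"midpoint: {midpoint:.1%}")   # 53.5% (rounded to ~53% in the text)

orion_estimate = 0.60                # this article's estimate, +/- ~5%
print(orion_estimate > midpoint)     # True, but only by about 7 points
```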
Bosworth gave more of an off-the-cuff AMA than a formal presentation for a broad audience, and some level of generalization and goofs is to be expected. While he danced around the transparency issue a bit, it was the "glow" statement and its specificity that I have more of an issue with.
Even though Bosworth is the CTO and head of Meta's Reality Labs, his background is in software, not optics, so he may have been ill-informed rather than deliberately misleading. I generally find him likable in the videos, and he shares a lot of information (while I have met many people from Meta's Reality Labs, I have not met Bosworth). At the same time, it sounds to my ear that when he discusses optics, he is parroting things he has been told, sometimes without fully understanding what he is saying. This is in sharp contrast to, say, HoloLens' former leader, Alex Kipman, who I believe out-and-out lied repeatedly.
Working on this article caused me to reexamine what Snap Spectacles was using for dimming. In my earlier look at AlphaMicron, I missed that AlphaMicron’s “e-Tint®” was a Guest Host dimming technology rather than a polarization-based one.
From the start, I was pretty sure Orion was using electrochromic dimming, but I was not sure whether it was polarization or non-polarization-based. In working through this article, I’m now reasonably certain it is a non-polarization-based dimming technology.
Working through this article, I realized that the available evidence also suggests that Orion’s display is not very bright. I would guess less than 200 nits, or at least they didn’t want to drive it brighter than that for very long.
Appendix: Determining the light blocking from videos is tricky
Human vision has a huge dynamic range and automatically adjusts as light varies. As Bosworth stated, typical sunglasses are only about 17% transmissive. Human perception of brightness is roughly binary logarithmic (log base 2). If there is plenty of available light, most people will barely notice a 50% dimming.
When wearing AR glasses, a large percentage (for some AR headsets, nearly all) of the light needed to view the eye will pass through the AR lens optics twice (in and back out). Because light blocking in series is multiplicative, this can cause the eyes to look much darker than what the person perceives when looking through them.
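Below is a small sketch of that squared-attenuation point, using the same lens transmissivities as the test described next:

```python
# Light illuminating the eye and then reaching the camera crosses the
# lens stack twice, so attenuations in series multiply (here, square).
lenses = [("WaveOptics waveguide", 0.85),
          ("circular polarizing glasses", 0.33),
          ("Magic Leap One waveguide", 0.70)]

for name, t in lenses:
    print(f"{name}: view through is {t:.0%}, but the eye appears at ~{t * t:.0%}")

# 85% -> ~72%, 33% -> ~11%, 70% -> ~49%.
# With open glasses like Orion, some light reaches the eye around the
# lens, so the real effect lands somewhere between t and t squared.
```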
I set up a simple test using a WaveOptics waveguide, which is ~85% transmissive, circular polarizing glasses (for 3-D movies) that were 33% transmissive, and a Magic Leap One waveguide (out of the frame) that was 70% transmissive. In the upper right, I have shown a few examples where I held a piece of white paper far enough away from the lens that the lens did not affect the illumination of the paper. On the lower right, I moved the paper up against the lens so the paper was primarily illuminated via the lens to demonstrate the light-blocking squared effect.
Orion's Silicon Carbide (SiC) is not significantly more transparent than glass. Most of the light blocking in a diffractive waveguide comes from the diffraction gratings, optical coatings, and number of layers. Considering that Orion is a "hero prototype," with $5B in R&D expenses for only 1,000 units, it is probably more transparent by about 5%.
When looking at open glasses like Orion (unlike, say, the Magic Leap or HoloLens), the lenses block only part of the eye's illumination, so you get something less than the square-law effect. So, in judging the amount of light blocking, you also have to estimate how much light is getting around the lenses and frames.
On Monday, Microsoft came out guns blazing, posting a blog accusing Google of "dishonestly" funding groups conducting allegedly biased studies to discredit Microsoft and mislead antitrust enforcers and the public.
In the blog, Microsoft lawyer Rima Alaily alleged that an astroturf group called the Open Cloud Coalition will launch this week and will appear to be led by "a handful of European cloud providers." In actuality, however, those smaller companies were secretly recruited by Google, which allegedly pays them "to serve as the public face" and "obfuscate" Google's involvement, Microsoft's blog said. In return, Google likely offered the cloud providers cash or discounts to join, Alaily alleged.
The Open Cloud Coalition is just one part of a "pattern of shadowy campaigns" that Google has funded, both "directly and indirectly," to muddy the antitrust waters, Alaily alleged. The only other named example that Alaily gives while documenting this supposed pattern is the US-based Coalition for Fair Software Licensing (CFSL), which Alaily said has attacked Microsoft's cloud computing business in the US, the United Kingdom, and the European Union.
There is a significant shortage of full-time teachers around the country these days. So, to keep classrooms staffed, schools use substitute teachers to fill in when teachers are sick or otherwise can't come to school. However, substitute teaching jobs are tough to recruit for, manage, and fill. One complication is that, quite often, a teacher absence isn't discovered until that morning, when schools scramble to post an opening. This uncontrollable and unavoidable reality, coupled with the dramatic shortage of substitutes who are qualified and ready to go, is a significant problem in our nation's schools.
The process of filling short-term or even long-term teacher absences with substitute teachers has not changed in 50+ years. Even modern communication modes, such as emails, text messages, and robo-calls, are no more effective at easing the angst-ridden and uncertain process of finding substitute teachers.
Enter Swing. Swing brings a 21st-century solution to a 20th-century problem. Swing leverages social communities and technology to not only improve communication but also put power back into the hands of substitutes and the schools that need them. Instead of schools scrambling to find anyone who will answer the phone, or subs being completely in the dark about whether there is one opening today or ten, both stakeholders now have power: schools have the power to match the best person to a particular need, and subs have the power to pick the best job for their day or week ahead.
For these reasons and more, Swing Education earned a Cool Tool Award (finalist) for “Best Administrative Solution” as part of The EdTech Awards 2024 from EdTech Digest. Learn more.
SAN FRANCISCO—On Tuesday, TED AI 2024 kicked off its first day at San Francisco's Herbst Theater with a lineup of speakers that tackled AI's impact on science, art, and society. The two-day event brought a mix of researchers, entrepreneurs, lawyers, and other experts who painted a complex picture of AI with fairly minimal hype.
The second annual conference, organized by Walter and Sam De Brouwer, marked a notable shift from last year's broad existential debates and proclamations of AI as being "the new electricity." Rather than sweeping predictions about, say, looming artificial general intelligence (although there was still some of that, too), speakers mostly focused on immediate challenges: battles over training data rights, proposals for hardware-based regulation, debates about human-AI relationships, and the complex dynamics of workplace adoption.
The day's sessions covered a wide breadth of AI topics: physicist Carlo Rovelli explored consciousness and time, Project CETI researcher Patricia Sharma demonstrated attempts to use AI to decode whale communication, Recording Academy CEO Harvey Mason Jr. outlined music industry adaptation strategies, and even a few robots made appearances.
A longitudinal study conducted by researchers at Erasmus University sheds new light on the factors that contribute to academic success in higher education. The results show that positive relationships between students and teachers, and among students themselves, have a beneficial effect on academic performance.
This kind of research is rarely done in higher education
The role of study effort and study engagement as factors explaining academic performance has so far received little attention. While a great deal of research has examined learning outcomes in primary and secondary education, hardly any research in higher education has looked at the influence of the student-teacher relationship on academic success, the researchers write.
To gain more insight, they conducted a longitudinal study among 613 bachelor's students from all thirteen Dutch public universities. The students completed online questionnaires at two points in time, three and a half months apart. These asked about their relationships with teachers and fellow students, as well as their study effort, study engagement, and academic results. Relationships with teachers were measured by looking at both formal interactions, such as discussing course material, and informal contacts.
Relationships with fellow students covered, among other things, discussing assignments together and the degree of connectedness. Study effort was measured with questions about the willingness to work hard and to devote attention to one's studies. Study engagement was measured with questions about inspiration, dedication, and absorption in study work.
Feeling more engaged
Students who experience their contact with teachers and fellow students as more positive work harder on their studies and feel more engaged with their program, the results show. This in turn leads to better academic results: the findings show significant indirect effects of relationship quality on academic performance, via study effort and study engagement.
Interestingly, the effects proved consistent across different groups of students. Whether first-years or third-years, men or women, students with or without a migration background, the pattern was consistently similar. This suggests that the mechanism found is broadly applicable within higher education, according to the researchers.
A boost from positive experiences with teachers
A striking finding is that early experiences with fellow students, in particular, have a long-lasting positive effect on study behavior and performance. Positive relationships at the start of a program were found to carry through to later academic performance via increased effort and engagement. A direct effect was also visible in relationships with teachers: consistently positive experiences with teachers directly stimulated students to put in more effort and stay engaged with their studies.
The study has practical implications for higher education, according to the authors. The results underscore the importance of investing in meaningful relationships between students and teachers and among fellow students in order to promote effective learning behavior and performance. Educational institutions could implement policies to strengthen these relationships, which may ultimately lead to improved academic performance.
The researchers point out that their findings are also relevant to the design of courses and educational activities. Teachers could, for example, pay more attention to collaborative assignments and differentiation in order to engage a diverse student population. This can be done by creating challenging, complex learning tasks that require the talents of different students. According to the Rotterdam researchers, this could stimulate effort and engagement.
Recognizing and valuing teachers pays off
At the organizational level, it is important to recognize and reward teachers who put the pedagogical and affective aspects of education first, the authors write. After all, the research results suggest that students earn better grades when they collaborate, discuss study materials, and feel connected to one another.
The research team emphasizes that their study is a starting point for further research. In primary and secondary education, positive, supportive contact between teachers and pupils is seen as an essential aspect of the learning environment. In higher education, however, the importance of this aspect is underestimated. This study underscores the idea that supportive relationships with teachers can motivate students to perform better.
The authors therefore argue for more attention to the pedagogical and affective aspects of university education, alongside the traditional focus on knowledge transfer.
While this article's general premise, that Meta Orion uses waveguide technology similar to that of Snap (Wave Optics) and the Magic Leap 2, is correct, it turns out that a number of assumptions about the specifics of what the various companies actually used in their products were incorrect. One of my readers (who wishes to remain anonymous), with deep knowledge of waveguides, responded to my request for more information on the various waveguides. This person has both theoretical knowledge of waveguides and specific knowledge of what Meta Orion, Wave Optics (now Snap), Magic Leap Two, and HoloLens 2 used.
My main error about the nature of waveguide “grating” structures was a bias toward linear gratings, with which I was more familiar. I overlooked the possibility that Wave Optics was using a set of “pillar” gratings that act like a 2D set of linear gratings.
A summary of the corrections:
HoloLens 2 had a two-sided waveguide. The left and right expansion gratings are on opposite sides of the waveguide.
Prior Wave Optics (Snap) waveguides use a pillar-type 2-D diffraction grating on one side, with a single waveguide for full color. The new Snap Spectacles 5 is likely (not 100% sure) using linear diffraction gratings on both sides of a single full-color waveguide, as shown in this article.
Magic Leap Two uses linear diffraction gratings on both sides of the waveguide. It does use three waveguides.
The above corrections indicate that Meta Orion, Snap Spectacles 5 (Wave Optics), and Magic Leap all have overlapping linear gratings on both sides. Meta Orion and Snap likely use a single waveguide for full color, whereas the Magic Leap 2 has separate waveguides for the three primary colors.
I’m working on an article that will go into more detail and should appear soon, but I wanted to get this update out quickly.
I then wondered what Magic Leap Two (ML2) did to achieve its 70-degree FOV and uncovered some more interesting information about Meta’s Orion. The more I researched ML2, the more similarities I found with Meta’s Orion. What started as a short observation that Meta Orion’s waveguide appears to share commonality with Snap (Wave Optics) waveguides ballooned up when I discovered/rediscovered the ML2 information.
Included in this article is some "background" information from prior articles to help compare and contrast what has been done before with what Meta's Orion, Snap/Wave Optics, and Magic Leap Two are doing.
Diffractive Waveguide Background
I hadn't previously looked in any detail at how Wave Optics' diffraction gratings worked differently. All other diffraction (I don't know about holographic) grating waveguides I had seen before used three (or four) separate gratings on the same surface of the glass. There was an entrance grating, a first expansion and turning grating, and then a second expansion and exit grating. The location of the first expansion grating, and whether it was horizontal or vertical, varied with different waveguides.
HoloLens 2 had a variation with left and right horizontal expansion and turning gratings and a single exit grating to increase the field of view. Still, all the gratings were on the same side of the waveguide.
Diffraction gratings bend light based on wavelength, similar to a prism. But unlike a prism, a grating bends the light in a series of "orders." With a diffractive waveguide, only the light from one of these orders is used; the rest of the light is not only wasted but can cause problems, including "eye glow," and reduces the contrast of the overall system.
Because diffraction is wavelength-based, it bends different colors/wavelengths by different amounts. This causes issues when sending more than one color through a single waveguide/diffraction grating, and these problems are compounded as the size of the exit grating and the FOV increase. Several diffractive waveguide companies use one (full color) or two (e.g., red+green and green+blue) waveguides for smaller FOVs and then use three waveguides for wider FOVs.
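To illustrate the wavelength problem, the sketch below applies the first-order grating equation to see how far apart the colors end up inside the substrate. The grating pitch and substrate index are made-up but plausible values, not Orion, Snap, or Magic Leap specifications.

```python
# Wavelength-dependent diffraction at a waveguide in-coupler.
# Pitch and index below are illustrative assumptions only.
import math

pitch_nm = 380        # assumed grating pitch
n_substrate = 1.8     # assumed high-index glass

for color, wavelength_nm in [("blue", 460), ("green", 530), ("red", 620)]:
    # First-order grating equation at normal incidence:
    #   n_substrate * sin(theta) = wavelength / pitch
    s = wavelength_nm / (n_substrate * pitch_nm)
    theta = math.degrees(math.asin(s))
    print(f"{color}: diffracted to {theta:.1f} deg inside the substrate")

# blue ~42 deg, green ~51 deg, red ~65 deg: the three primaries travel
# at very different angles, which is why single-waveguide full color
# gets harder as the FOV (the range of input angles) grows.
```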
I want to start with a quick summary of Orion’s waveguide, as the information and figures will be helpful in comparing it to that of Wave Optics (owned by Snap and in Snap’s Spectacles AR Glasses) and the ML2.
Summary of Orion’s waveguide from the last article
Orion's waveguide appears to use a substrate with one entrance grating per primary color and then two expansion and exit/output gratings. The two (crossed) output gratings are on opposite sides of the Silicon Carbide (SiC) substrate, whereas most diffractive waveguides use glass with all the gratings on one side.
Another interesting feature shown in the patents, and discussed by Meta CTO Bosworth in some of his video interviews about Orion, is "Disparity Correction," which uses an extra grating, along with other optics and circuitry, to detect whether the waveguides are misaligned. This feature is not supported in Orion, but Bosworth says it will be included in future iterations, which will also move the input grating to the "eye side" of the waveguide. As shown in the figure below, and apparently in Orion, light enters the waveguide from the side opposite the eyes. Since the projectors are on the eye side (in the temples), they require some extra optics, which, according to Bosworth, make the Orion frames thicker.
Wave Optics US patent application 2018/0210205 is based on the first Wave Optics patent from the international application WO/2016/020643, first filed in 2014. FIG 3 (below) shows a 3-D representation of a diffraction grating with an input grating (H0) and crossed gratings (H1 and H2) on opposite sides of a single waveguide substrate.
The patent also shows the crossed gratings (H1 and H2) on opposite sides of a single waveguide (FIG. 15B above) or on one side each of two waveguides (FIG. 15A above). I don't know if Wave Optics (Snap) uses single- or double-sided waveguides in its current designs, but I would suspect they are double-sided.
While on the subject of Wave Optics waveguide design, I happen to have a picture of a Wave Optics 300mm glass wafer with 24 waveguides (right). I took the picture in the Schott booth at AR/VR/MR 2020. In the inset, I added Meta’s picture of the Orion 100mm SiC wafer, roughly to scale, with just four waveguides.
In my hurry putting together information and digging for connections, it looked to me like WaveOptics would be using an LCOS microdisplay. As I pointed out, WaveOptics had been moving from DLP to LCOS with their newer designs. Subsequent information suggests that WaveOptics is still using their much older DLP design. It is still likely that future versions will use LCOS, but the current version apparently does not.
Magic Leap
Magic Leap One (ML1) “Typical” Three Grating Waveguide
This blog's first significant article about Magic Leap was in November 2016 (Magic Leap: "A Riddle Wrapped in an Enigma"). Since then, Magic Leap has been discussed in about 90 articles. Most other waveguide companies coaxially input all colors from a single projector. However, even though the ML1 had a single field-sequential-color LCOS device and projector, the LED illumination sources are spatially arranged so that the image for each color is sent to a separate input grating. The ML1 had six waveguides, three for each of the two focus planes, resulting in six LEDs (two sets of R, G, & B) and six entrance gratings (see: Magic Leap House of Cards – FSD, Waveguides, and Focus Planes).
Below is a diagram that iFixit developed jointly with this blog. It shows a side view of the ML1 optical path. The inset picture in the lower right shows the six entrance gratings of the six stacked waveguides.
Below left is a picture of the (stack of six) ML1 waveguides showing the six entrance gratings, the large expansion and turning gratings, and the exit gratings. Other than having spatially separate entrance gratings, the general design of the waveguides is the same as most other diffractive waveguides, including the HoloLens 1 shown in the introduction. The expansion gratings are mostly hidden in the ML1's upper body (below right). The large expansion and turning gratings are a major problem in fitting a "typical" diffractive waveguide into an eyeglass form factor, which is what drove Meta to find an alternative that goes beyond the ML1's 50-degree FOV.
Figure 18 from US application 2018/0052276 diagrams the ML1’s construction. This diagram is very close to the ML1’s construction down to the shape of the waveguide and even the various diffraction grating shapes.
Magic Leap Two (ML2)
The ML1 failed so badly in the market that there was far less interest in the ML2. There is much less public information about the second-generation device, and I didn't buy an ML2 for testing. I have covered many of the technical aspects of the ML2, but I hadn't studied its waveguide before. With the ML2 having a 70-degree FOV compared to the ML1's 50-degree FOV, I became curious about how they got it to fit.
To start with, the ML2 eliminated the ML1’s support for two focus planes. This cut the waveguides in half and meant that the exit grating of the waveguide didn’t need to change the focus of the virtual image (for more on this subject, see: Single Waveguide Set with Front and Back “Lens Assemblies”).
Looking through the Magic Leap patent applications, I turned up US 2018/0052276 to Magic Leap, which shows a 2-D combined exit grating. US 2018/0052276 is what is commonly referred to in the patent field as an “omnibus patent application,” which combines a massive number of concepts (the application has 272 pages) in a single application. The application starts with concepts in the ML1 (including the just prior FIG 18) and goes on to concepts in the ML2.
This application, loosely speaking, shows how to take the Wave Optics concept of two crossed diffraction gratings on different sides of a waveguide and integrate them onto the same side of the waveguide.
Magic Leap patent application 2020/0158942 describes in detail how the two crossed output gratings are made. It shows the "prior art" (Wave Optics and Meta Orion-like) method of two gratings on opposite sides of a waveguide in FIG. 1 (below). The application then shows how the two crossed gratings can be integrated into a single grating structure. The patent even includes scanning electron microscope photos of the structures Magic Leap had made (e.g., FIG 5), which demonstrates that Magic Leap had gone far beyond the concept stage by the time of the application's filing in Nov. 2018.
I then went back to pictures I took of Magic Leap’s 2022 AR/VR/MR conference presentation (see also Magic Leap 2 at SPIE AR/VR/MR 2022) on the ML2. I realized that the concept of a 2D OPE+EPE (crossed diffraction gratings) was hiding in plain sight as part of another figure, thus confirming that ML2 was using the concept. The main topic of this figure is “Online display calibration,” which appears to be the same concept as Orion’s “disparity correction” shown earlier.
The next issue is whether the ML2 used a single input grating for all colors and whether it used more than one waveguide. It turns out that both questions are answered in another figure from Magic Leap's 2022 AR/VR/MR presentation, shown below. Magic Leap developed a very compact projector engine that illuminates an LCOS panel through the (clear) part of the waveguides. Like the ML1, the red, green, and blue illumination LEDs are spatially separated, which, in turn, causes the light out of the projector lens to be spatially separated. There are then three spatially separate input gratings on three waveguides, as shown.
Based on the ML2’s three waveguides, I assumed it was too difficult or impossible to support the “crossed” diffraction grating effect while supporting full color in a single wide FOV waveguide.
Orion, the ML2, and Wave Optics all have some form of two-dimensional pupil expansion using overlapping diffraction gratings. By overlapping gratings, they reduce the size of the waveguide considerably compared to the more conventional approach with three spatially separate diffraction gratings on a single surface.
To summarize:
Meta Orion – “Crossed” diffraction gratings on both sides of a single SiC waveguide for full color.
Snap/Wave Optics – "Crossed" diffraction gratings on both sides of a single glass waveguide for full color. Alternatively, "crossed" diffraction gratings on two glass waveguides for full color (I just put in a request to Snap to try and clarify).
Magic Leap Two – A single diffraction grating that acts like a crossed diffraction grating on high index (~2.0) glass with three waveguides (one per primary color).
The above is based on the currently available public information. If you have additional information or analysis, please share it in the comments, or if you don’t want to share it publicly, you can send a private email to newsinfo@kgontech.com. To be clear, I don’t want stolen information or any violation of NDAs, but I am sure there are waveguide experts who know more about this subject.
What about Meta Orion’s Image Quality?
I have not had the opportunity to look through Meta's Orion or Snap Spectacles 5 and have only seen the ML2 in a canned demo. Unfortunately, I was not invited to demo Meta's Orion, much less given access to one for evaluation (if you can help me gain (legal) access, contact me at newsinfo@kgontech.com).
I have tried the ML2 a few times. However, I have never had the opportunity to take pictures through the optics or use my test patterns. From my limited experience, while the ML2 is much better in terms of image quality than the ML1 (which was abysmal – see Magic Leap Review Part 1 – The Terrible View Through Diffraction Gratings), it still has significant issues with color uniformity, like other wide (>40-degree) FOV diffractive waveguides. If someone has an ML2 that I can borrow for evaluation, please get in touch with me at newsinfo@kgontech.com.
I have been following Wave Optics (now Snap) for many years and have a 2020-era, Titan DLP-based, 40-degree FOV Wave Optics evaluation unit (through-the-optics picture below). I would consider the Wave Optics Titan a "middle of the pack" diffractive waveguide for its time (I had seen better and worse). I have seen what seem to be better diffractive waveguides before and since, but it is hard to compare them objectively, as they have different FOVs and I was not able to use my own content, but rather curated demo content. Wave Optics seemed to be showing better waveguides at shows before being acquired by Snap in 2021, but once again, that was with their demo content in short views at shows. I am working on getting a Spectacles 5 to do a more in-depth evaluation and see how it has improved.
Without the ability to test, compare, and contrast, I can only speculate about Meta Orion's image quality based on my experience with diffractive waveguides. The higher index of refraction of SiC helps, as there are fewer TIR bounces, which degrade image quality, but it is far from a volume-production-ready technology. I'm concerned about image uniformity with a large FOV, and even more so with a single set of diffraction gratings, as diffraction is based on wavelength (color).
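Two basic relationships underlie the bounce-count point. The sketch below illustrates them with assumed thickness, distances, and indices (not Orion measurements): a higher-index substrate has a smaller TIR critical angle, giving more angular headroom for a wide FOV, and more grazing rays take longer hops between bounces, so they interact with the gratings fewer times.

```python
# Illustrative TIR arithmetic; all numbers are assumptions, not specs.
import math

# (1) Critical angle shrinks as the substrate index rises,
#     widening the band of angles the waveguide can trap.
for material, n in [("glass n=1.5", 1.5),
                    ("high-index glass n=2.0", 2.0),
                    ("SiC n~2.6", 2.6)]:
    critical = math.degrees(math.asin(1.0 / n))
    print(f"{material}: TIR critical angle {critical:.1f} deg")

# (2) The more grazing the guided ray, the longer each TIR "hop"
#     and the fewer grating interactions along a given path.
thickness_mm = 0.5   # assumed waveguide thickness
travel_mm = 20.0     # assumed in-coupler-to-exit distance
for theta_deg in (35, 50, 65):   # guided angle from the surface normal
    hop = 2 * thickness_mm * math.tan(math.radians(theta_deg))
    print(f"ray at {theta_deg} deg: {hop:.2f} mm/bounce, "
          f"~{travel_mm / hop:.0f} bounces over {travel_mm:.0f} mm")
```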
There were earlier rumors that Meta would launch new glasses with a 2D reflective (array) waveguide optical solution and an LCoS optical engine in 2024-2025. Even with the announcement of Orion, I personally think this possibility still exists.
The “reflective waveguide” would most likely be a reference to Lumus’s reflective waveguides. I have seen a few “Lumus clone” reflective waveguides from Chinese companies, but their image quality is very poor compared to Lumus. In the comment section of my last article, Ding, on October 8, 2024, wrote:
There’s indeed rumor that Meta is planning an actual product in 2025 based on LCOS and Lumus waveguide.
Lumus has demonstrated impressive image quality in a glasses-like form factor (see my 2021 article: Exclusive: Lumus Maximus 2K x 2K Per Eye, >3000 Nits, 50° FOV with Through-the-Optics Pictures). Since the 2021 Maximus, they have been shrinking the form factor and improving support for prescription lens integration with their new "Z-Lens" technology. Lumus claims its Z-Lens technology should be able to support greater than a 70-degree FOV in glass. Lumus also says that because their waveguides support a larger input pupil, they should have a 5x to 10x efficiency advantage.
The market question about Lumus is whether they can make their waveguides cost-effectively in mass production. In the past, I have asked their manufacturing partner, Schott, who says they can make them, but I have yet to see a consumer product built around the Z-Lens. It would be interesting to see what would happen if a company like Meta put the kind of money it invested in complex Silicon Carbide waveguides into reflective waveguides instead.
While diffractive waveguides are not inexpensive, they are considered less expensive at present (except, of course, for Meta Orion's SiC waveguides). Perhaps an attractive proposition to researchers and companies is that diffractive waveguides can be customized more easily (at least on glass).
Not Addressing Who Invented What First
I want to be clear: this article does not in any way make assertions about who invented what first or whether anyone is infringing on anyone else’s invention. Making that determination would require a massive amount of work, lawyers, and the courts. The reason I cite patents and patent applications is that they are public records that are easily searched and often document technical details that are missing from published presentations and articles.
Conclusions
There seems to be a surprising amount of commonality between the Meta Orion, Snap/Wave Optics, and Magic Leap Two waveguides. They all avoided the "conventional" three diffraction gratings on one side of a waveguide in order to support a wider FOV in an eyeglass form factor. Rediscovering that the ML2 supports "disparity correction," as Meta refers to it, was a bit of a bonus.
As I wrote last time, Meta's Orion seems like a strange mix of technology to make a big deal about at Meta Connect. They combined a ridiculously expensive waveguide with a very low-resolution display. The two-sided diffraction-grating Silicon Carbide waveguides seem to be more than a decade away from practical volume production. It's not clear to me that, even if they could be made cost-effectively, they would match the view out and the image quality of reflective waveguides, particularly at wider FOVs.
Meta could have put together a headset with technology that was within three years of being ready for production. As it is, it seemed like more of a stunt in response to the Apple Vision Pro. In that regard, the stunt seems to have worked, in the sense that some reviewers were reminded that seeing the real world directly with optical AR/MR beats looking at it through cameras and a display.
Considerations on a conduit for creating dynamic, engaging learning environments.
GUEST COLUMN | by Justin Louder
The ongoing debate about the role of mobile phones in educational settings is multifaceted. Critics often highlight the potential for distraction, citing the fact that an overwhelming 92% of college students admit to texting during classes. This concern is real, as smartphones can certainly pull attention away from the lesson. But focusing only on distractions misses the bigger picture. Mobile technology holds real promise for education, especially outside the classroom. In today’s fast-paced world, where students frequently balance academic pursuits with work and family responsibilities, integrating mobile phones as learning tools isn’t just helpful—it’s necessary.
‘Mobile technology holds real promise for education, especially outside the classroom.’
Mobile Phones as Classroom Tools
Properly harnessed, mobile phones are more than just a potential distraction—they are a conduit for creating dynamic, engaging learning environments. In an era where digital literacy is just as important as traditional literacy, having mobile technology in the classroom is invaluable.
Educators continually seek strategies to capture and maintain student attention in lectures that can last upwards of two hours. Mobile phones, used strategically, offer a solution to this challenge. They facilitate the incorporation of instant quizzes, interactive activities, and other engagement tools that break the monotony of traditional lectures. This not only enlivens the learning experience but also fosters better retention and comprehension of course material, transforming students from passive listeners to active participants.
Furthermore, instant access to course materials via mobile phones supports anytime, anyplace learning, allowing students to draw connections within their curriculum and focus on core aspects of their studies with clarity and depth. This accessibility is a step towards accommodating diverse learning preferences and leveraging technology to fortify educational outcomes. Consequently, educational institutions should prioritize optimizing course materials for mobile platforms to support varied learning needs and amplify the effectiveness of education through technology.
Beyond the Classroom
Beyond their utility as in-class tools, mobile phones are indispensable for supporting students’ learning needs outside the walls of a traditional educational environment. For students juggling jobs, caregiving, family responsibilities, and coursework, having access to educational content on their phones turns downtime into productive learning time.
The importance of mobile accessibility is underscored by a 2023 Anthology survey, which collected insights from over 2,700 students around the world on educational access challenges. The survey revealed that 55% of respondents relied on mobile devices for their studies, highlighting the significance of smartphones as educational tools. Surprisingly, it also found that 29% of students did not own a laptop, and 57% were without a desktop computer, pointing to mobile devices as a critical bridge to learning resources for many.
Students lacking access to traditional computing devices face heightened risks of educational discontinuation. Challenges in accessing course materials or completing assignments can have severe implications for mental well-being and increase dropout risks. What might seem like minor inconveniences are, for many students, daily hurdles magnified by broader challenges, emphasizing the need for educational institutions to ensure their systems and materials are universally accessible.
In response to these realities, it’s incumbent upon educational institutions to meet students where they are—on their mobile devices. Rather than viewing smartphones merely as potential distractions, it’s crucial to recognize them as indispensable educational tools. By doing so, institutions can ensure they provide a more inclusive, engaging, and effective learning environment for all students.
—
Justin Louder is an experienced higher education and K-12 administrator and innovator focusing on online learning, student success, and pedagogy. He serves as Associate VP for Academic Innovation at Anthology. Justin earned a Doctorate in Education from Texas Tech University. Connect with Justin on LinkedIn.
A new study that examined filings with the U.S. Patent and Trademark Office by the 50 top-patenting companies found that they cited IEEE nearly three times more than any other technical-literature publisher, including ACM, Elsevier, and Springer.
“Not only do IEEE publications frequently provide the science base for new inventions, inventions that build upon IEEE publications are more likely to be valuable in the future than inventions that do not build upon IEEE.”
Patenting AI and machine learning technologies has increased tenfold in the past 10 years, but IEEE has been able to keep pace, according to the study. More than 30 percent of AI-related patents reference IEEE publications.
The report notes that in emerging markets such as blockchain, cybersecurity, and virtual and augmented reality, IEEE receives the most references.
In the robotics and intelligent manufacturing category, more than 35 percent of patent references are to IEEE literature.
This chart shows that IEEE is cited nearly three times more than any other technical-literature publisher. 1790 Analytics LLC
At 30 percent, the organization also leads in citations for patents on broadcasting technologies. IEEE registered more than twice the broadcasting citations of the nearest competitor.
For autonomous vehicles, IEEE is cited 10 times more than the next publisher.
Other areas where IEEE leads in citations include measuring, testing, and control as well as transmission.
The study also found that patents referencing IEEE papers are cited more often.
“This was shown to be true for each of the 20 technology categories we examined,” the report concludes. “This suggests that not only do IEEE publications frequently provide the science base for new inventions but that inventions that build upon IEEE publications are more likely to be valuable in the future than inventions that do not build upon IEEE.”
To download the full report or for more information, visit this website.
A recent study using mathematical models suggests that breakfast choices may affect men’s and women’s metabolisms differently. Results indicate that men’s metabolisms respond better to carb-heavy breakfasts, while women’s bodies benefit more from higher-fat options like eggs or avocados, especially after fasting.
While Meta’s announcement of the Orion prototype AR glasses at Meta Connect made big news, there were few technical details beyond its having a 70-degree field of view (FOV) and using Silicon Carbide waveguides. While they demoed to the more general technical press and “influencers,” they didn’t seem to invite the more AR- and VR-centric people who might be more analytical. Via some Meta patents, a Reddit post, and a study of videos and articles, I was able to tease out some information.
This first article will concentrate on Orion’s Silicon Carbide diffractive waveguide. I have a lot of other thoughts on the mismatch of features and human factors that I will discuss in upcoming articles.
Wild Enthusiasm Stage and Lack of Technical Reviews
In the words of Yogi Berra, “It’s like deja vu all over again.” We went through this with the Apple Vision Pro, which went from being the second coming of the smartphone to almost disappearing earlier this year. This time, a more limited group of media people has been given access. There is virtually no critical analysis of the display’s image quality or the effect on the real world. I may be skeptical, but I have seen dozens of different diffractive waveguide designs, and there must be some issues, yet nothing has been reported. I expect there are problems with color uniformity and diffraction artifacts, but nothing was mentioned in any article or video. Heck, I have yet to see anyone mention the obvious eye glow problem (more on this in a bit).
The Vergecast podcast discusses some of the utility issues, and The Verge’s related video, Exclusive: We tried Meta’s AR glasses with Mark Zuckerberg, gives some more information about the experience. Thankfully, unlike Meta’s and others’ simulated through-the-optics videos, The Verge clearly marked its videos as “Simulated” (screen capture on the right).
As far as I can tell, there are no true “through-the-optics” videos or pictures (likely at Meta’s request). All the images and videos I found that may look like they could have been taken through the optics have been “simulated.”
I’m not against companies making technology demos in general. However, making a big deal about a “prototype” and not a “product” at Meta Connect rather than at a technical conference like Siggraph indicates AR’s importance to Meta. It invites comparisons to the Apple Vision Pro, which Meta probably intended.
It is a little disappointing that they also only share the demos with selected “invited media” that, for the most part, lack deep expertise in display technology and are easily manipulated by a “good” demo (see Appendix: “Escape from a Lab” and “Demos Are a Magic Show”). They will naturally tend to pull punches to keep access to new product announcements from Meta and other major companies. As a result, there is no information about the image quality of the virtual display or any reported issues looking through the waveguides (which there must be).
I’ve watched hours of videos and read multiple articles, and I have yet to hear anyone mention the obvious issue of “eye glow” (front projection). They will talk about the social acceptance of the glasses looking like ordinary glasses and letting you see the wearer’s eyes, but then they won’t mention the glaring problem of the person’s eyes glowing. It stuck out to me that they didn’t mention the eye glow issue, which is evident in all the videos and many photos.
Eye glow is an issue that diffractive waveguide designers have been trying to reduce or eliminate for years. Then there are Lumus reflective waveguides, with inherently little eye glow. Vuzix, Digilens, and Dispelix make big points about how they have reduced the problem with diffractive waveguides (see Front Projection (“Eye Glow”) and Pantoscopic Tilt to Eliminate “Eye Glow”). However, these diffractive waveguide designs with greatly reduced eye glow have relatively small (25-35 degree) FOVs. The Orion design supports a very wide 70-degree FOV while trying to fit the size of a “typical” (if bulky) glasses frame; I suspect that the design methods used to meet the size and FOV requirements meant that the issue of eye glow could not be addressed.
The transmissivity seems to vary in the many images and videos of people wearing Orions. It’s hard to tell, but it seems to change. On the right, two frames switch back and forth, and the glasses darken as the person puts them on (from the video Orion AR Glasses: Apple’s Last Days).
Because I’m judging from videos and pictures with uncontrolled lighting, it’s impossible to know the transmissivity, but I can compare it to other AR glasses. Below are the highly transmissive Lumus Maximus glasses with greater than 80% transmissivity and the Hololens 2 with ~40% compared to the two dimming levels of the Orion glasses.
Below is a still frame from a Meta video showing some of the individual parts of the Orion glasses. They appear to show an unusually dark cover glass, a dimming shutter (possibly liquid crystal) with a drive circuit attached, and a stack of flat optics including the waveguide, with electronics connected to it. In his video, Norman Chan stated, “My understanding is the frontmost layer can be like a polarized layer.” This seems consistent with what appears to be the cover “glass” (which could be plastic) looking so dark compared to the dimming shutter (LC is nearly transparent, as it only changes the polarization of light).
If it does use a polarization-based dimming structure, this will cause problems when viewing polarization-based displays (such as LCD-based computer monitors and smartphones).
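The concern follows from ordinary polarizer behavior (standard optics, not anything Meta has confirmed): a linear polarizer transmits light according to Malus’s law,

```latex
I = I_0 \cos^2\theta
```

where θ is the angle between the light’s polarization and the polarizer’s axis. Since LCD monitors and many phone screens emit polarized light, a polarizing layer in the glasses could dim them dramatically, approaching black at θ = 90°, depending on how the wearer tilts their head.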
Orion’s Unusual Diffractive Waveguides
Axel Wong’s analysis of Meta Orion’s Waveguide, which was translated and published on Reddit as Meta Orion AR Glasses: The first DEEP DIVE into the optical architecture, served as a starting point for my study of the Meta Orion’s optics, and I largely agree with his findings. Based on the figures he showed, his analysis was based on Meta Platforms’ (a patent-holding company of Meta) US patent application 2024/0179284. Three figures from that application are shown below.
[10-08-2024 – Corrected the order of the Red, Green, and Blue inputs in Fig 10 below]
Overlapping Diffraction Gratings
It appears that Orion uses waveguides with diffraction gratings on both sides of the substrate (see FIG. 12A above). In Figure 10, the first and second “output gratings” overlap, which suggests that these gratings are on different surfaces. Based on FIGs 12A and 7C above, the gratings are on opposite sides of the same substrate. I have not seen this before with other waveguides and suspect it is a complicated/expensive process.
As Axel Wong pointed out in his analysis, supporting such a wide FOV in a glasses form factor necessitated that the two large gratings overlap. Below (upper left) is the Hololens 1 waveguide, typical of most other diffractive waveguides. It consists of a small input grating, an (often trapezoidal) expansion grating, and a more rectangular second expansion and output/exit grating. In the Orion (upper right), the two larger gratings effectively overlap so that the waveguide fits in the eyeglasses form factor. I have roughly positioned the Hololens 1 and Orion waveguides at the same vertical location relative to the eye.
Also shown in the figure above (lower left) is Orion’s waveguide wafer, which I used to generate the outlines of the gratings, and a picture (lower right) showing the two diffraction gratings in the eye glow from Orion.
It should be noted that while the Hololens 1 has only about half the FOV of the Orion, the size of the exit gratings is similar. The size of the Hololens 1 exit grating is due to the Hololens 1 having enough eye relief to support most people wearing glasses. The farther away the eye is from the grating, the bigger the grating needs to be for a given FOV.
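The geometry is simple to sketch (the numbers below are mine, purely for illustration): for an eyebox of width W_eb, eye relief R, and horizontal FOV 2α, the exit grating needs a width of roughly

```latex
W \approx W_{eb} + 2R\tan\alpha
\quad\Longrightarrow\quad
W_{eb} = 10\,\mathrm{mm},\; R = 20\,\mathrm{mm},\; 2\alpha = 40^{\circ}
\;\Rightarrow\; W \approx 10 + 2(20)(0.364) \approx 24.6\,\mathrm{mm}
```

Halving the eye relief to 10 mm cuts the required width to about 17.3 mm, which is how a wider-FOV design with short eye relief can end up with a grating of similar size to the Hololens 1’s.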
Light Entering From the “wrong side” of the waveguide
The patent application figures 12A and 7C are curious because the projector is on the opposite side of the waveguide from the eye/output. This would suggest that the projectors are outside the glasses rather than hidden in the temples on the same side of the waveguide as the eye.
Meta’s Bosworth in The WILDEST Tech I’ve Ever Tried – Meta Orion at 9:55 stated, “And so, this stack right here [pointing to the corner of the glasses of the clear plastic prototype] gets much thinner, actually, about half as thick. ‘Cause the projector comes in from the back at that point.”
Based on Bosworth’s statement, some optics route the light from the projectors in the temples to the front of the waveguides, necessitating thicker frames. Bosworth said that the next generation’s waveguides will accept light from the rear side of the waveguide. I assume that making the waveguides work this way is more difficult, or they would have already done it rather than having thicker frames on Orion.
However, Bosworth said, “There’s no bubbles. Like you throw this thing in a fish tank, you’re not gonna see anything.” This implies that everything is densely packed into the glasses, so other than saving the volume of the extra optics, there may not be a major size reduction possible. (Bosworth referenced the story of Steve Jobs dropping an iPod prototype in water, where the escaping air bubbles proved it could be made smaller.)
Disparity Correction (Shown in Patent Application but not in Orion)
Meta’s application 2024/0179284, while showing many other details of the waveguide, is directed to “disparity correction.” Bosworth discusses in several interviews (including here) that Orion does not have disparity correction but that they intend to put it in future designs. As Bosworth describes it, the disparity correction is intended to correct for any flexing of the frames (or other alignment issues) that would cause the waveguides (and their images relative to the eyes) to move. He seems to suggest that this would allow Meta to use frames that would be thinner and that might have some flex to them.
Half Circular Entrance Gratings
Wong, in the Reddit article, also noticed that small input/entrance gratings visible on the wafer looked to be cut-off circles and commented:
However, if the coupling grating is indeed half-moon shaped, the light spot output by the light engine is also likely to be this shape. I personally guess that this design is mainly to reduce a common problem with SRG at the coupling point, that is, the secondary diffraction of the coupled light by the coupling grating.
Before the light spot of the light engine embarks on the great journey of total reflection and then entering the human eye after entering the coupling grating, a considerable part of the light will unfortunately be diffracted directly out by hitting the coupling grating again. This part of the light will cause a great energy loss, and it is also possible to hit the glass surface of the screen and then return to the grating to form ghost images.
Single Waveguide for all three colors?
The patent application seems to suggest that there is a single (double-sided) waveguide for all three colors (red, green, and blue). Most larger-FOV, full-color diffractive AR glasses stack three waveguides (red, green, and blue—examples: Hololens 1 and Magic Leap 1 & 2) or two (red+green and green+blue—example: Hololens 2). Dispelix has single-layer, full-color diffractive waveguides that go up to 50 degrees FOV.
Diffraction gratings have a line spacing based on the wavelengths of light they are meant to diffract. Supporting full color with such a wide FOV in a single waveguide would typically cause issues with image quality, including light fall-off in some colors and contrast losses. Unfortunately, there are no “through the optics” pictures or even subjective evaluations by an independent expert as to the image quality of Orion.
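The relation at work is the standard grating equation, which ties the diffraction angle directly to wavelength:

```latex
d\,(\sin\theta_m - \sin\theta_i) = m\,\lambda
```

With the pitch d fixed, red (~630 nm), green (~520 nm), and blue (~460 nm) diffract to different angles θ_m, so a grating set tuned for one wavelength inevitably handles the other colors sub-optimally across a wide FOV, which is the usual source of the fall-off and contrast losses mentioned above.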
Silicon Carbide Waveguide Substrate
The idea of using silicon carbide for waveguides is not unique to Meta. Below is an image from GETTING THE BIG PICTURE IN AR/VR, which discusses the advantages of using high-index materials like Lithium Niobate and Silicon Carbide to make waveguides. It is well known that going to a higher-index-of-refraction substrate supports wider FOVs, as shown in the figure below. The problem, as Bosworth points out, is that growing silicon carbide wafers is very expensive. The wafers are also much smaller, yielding fewer waveguides per wafer. From the pictures of Meta’s wafers, they only get four waveguides per wafer, whereas a dozen or more diffractive waveguides can be made on larger and much less expensive glass wafers.
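To make the “higher index supports wider FOVs” point concrete, here is a quick sketch comparing the usable internal angle range for a few substrate indices (illustrative values of mine; the grazing-angle limit is an assumption, not a Meta specification):

```python
# Rough illustration of why substrate index matters (my numbers; the
# 75-degree grazing limit is an assumption, not a Meta specification).
import math

GRAZING_LIMIT_DEG = 75  # practical upper bound on the guided-ray angle

for name, n in [("glass", 1.5), ("high-index glass", 1.8),
                ("lithium niobate", 2.2), ("silicon carbide", 2.6)]:
    theta_c = math.degrees(math.asin(1 / n))  # TIR critical angle
    band = GRAZING_LIMIT_DEG - theta_c        # usable internal angle range
    print(f"{name:17s} n={n:.1f}  critical angle {theta_c:4.1f} deg  "
          f"usable band ~{band:4.1f} deg")
```

The wider the band of angles the substrate can guide, the wider the field of view the waveguide can carry for a given design.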
Bosworth Says “Nearly Artifact-Free” with Low “Rainbow” Capture
A common issue with diffractive waveguides is that the diffraction gratings will capture light in the real world and then spread it out by wavelength like a prism, which creates a rainbow-like effect.
In Adam Savage’s Tested interview (@~5:10), Bosworth said, “The waveguide itself is nano-etched into silicon carbide, which is a novel material with a super high index of refraction, which allows us to minimize the lost photons and minimize the number of photons we capture from the world, so it minimizes things like ghosting and haze and rainbow, all these artifacts, while giving you that field of view that you want. Well, it’s not artifact-free, it’s very close to artifact-free.” I appreciate that while Bosworth tried to give the advantages of their waveguide technology, he immediately corrected himself when he had overstated his case (unlike Hololens’ Kipman, as cited in the Introduction). I would feel even better if they let some independent experts study it and give their opinions.
What Bosworth says about rainbows and other diffractive artifacts may be true, but I would like to see it evaluated by independent experts. Norm said in the same video, “It was a very on-rails demo with many guard rails. They walked me through this very evenly diffused lit room, so no bright lights.” I appreciate that Norm recognized he was getting at least a bit of a “magic show” demo (see appendix).
Strange Mix of a Wide FOV and Low Resolution
There was also little to no discussion in the reviews of Orion’s very low angular resolution of only 13 pixels per degree (PPD) spread over a 70-degree FOV (a topic for my next article on Orion). This works out to about a 720-by-540-pixel display resolution.
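A quick sanity check of those figures (the 4:3 aspect ratio is my assumption):

```python
# Verify that ~720x540 pixels over a 70-degree diagonal FOV is ~13 PPD.
import math

w_px, h_px = 720, 540
diag_px = math.hypot(w_px, h_px)  # = 900 pixels along the diagonal
ppd = diag_px / 70                # divide by the 70-degree diagonal FOV
print(f"{diag_px:.0f} px diagonal -> {ppd:.1f} pixels per degree")  # ~12.9
```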
Several people reported seeing a 26 PPD demo, but it was unclear whether this was in the glasses form factor or a lab-bench demo. Even 26 PPD is a fairly low angular resolution.
Optical versus Passthrough AR – Orion vs. Vision Pro
Meta’s Orion demonstration is a declaration that optical AR (e.g., Orion), rather than camera-passthrough AR such as the Apple Vision Pro, is the long-term prize. It makes the point that no passthrough camera-and-display combination can come close to competing with the real-world view in terms of dynamic range, resolution, binocular stereo, and an infinite number of focus depths.
As I have repeatedly pointed out in writing and presentations, optical AR prioritizes the view of the real world, while camera-passthrough AR prioritizes the virtual image. I think there is very little overlap in their applications. I can’t imagine anyone allowing someone out on a factory floor or onto the streets of a city in a future Apple Vision Pro-type device, but one could imagine it with something like the Meta Orion. And I think this is the point that Meta wanted to make.
Conclusions
I understand that Meta was demonstrating, in a way, “If money was not an obstacle, what could we do?” I think they were too fixated on the very wide FOV issue. I am concerned that the diffractive Silicon Carbide waveguides are not the right solution in the near or long term. They certainly can’t have a volume/consumer product with a significant “eye glow” problem.
This is a subject I have discussed many times, including in Small FOV Optical AR Discussion with Thad Starner and FOV Obsession. In some ways, Orion has the worst of all worlds: with a very large FOV and a relatively low-resolution display, it blocks most of the real world for a given amount of content. With the same money, I think they could have made a more impressive demo, without exotic waveguide materials, that didn’t seem so far off in the future. I intend to get more into the human factors and display utility in this series on Meta Orion.
Appendix: “Demos Are a Magic Show”
Seeing the way Meta introduced Orion and hearing of the crafted demos they gave reminded me of one of my earliest blog articles, from 2012, called Cynics Guide to CES – Glossary of Terms, which gave warnings about seeing demos.
Escaped From the Lab
Orion seems to fit the definition of an “escape from the lab.” Quoting from the 2012 article:
“Escaped from the lab” – This is the demonstration of a product concept that is highly impractical for any of a number of reasons including cost, lifetime/reliability, size, unrealistic setting (for example requires a special room that few could afford), and dangerous without skilled supervision. Sometimes demos “escape from the lab” because a company’s management has sunk a lot of money into a project and a public demo is an attempt to prove to management that the concepts will at least one day appeal to consumers.
Why make such a big deal about Orion, a prototype with a strange mix of features and impractically expensive components? Someone(s) is trying to prove that the product concept was worth continued investment.
Magic Show
I also warned that demos are “a magic show.”
A Wizard of Oz (visual) – Carefully controlling the lighting, image size, viewing location and/or visual content in order to hide what would be obvious defects. Sometimes you are seeing a “magic show” that has little relationship to real world use.
I constantly try to remind people that “demos are a magic show.” Most people get wowed by the show or by being one of the special people to try a new device. Many in the media may be great at writing, but they are not experts at evaluating displays. The imperfections and problems go unnoticed in a well-crafted demo by someone who is not trained to “look behind the curtain.”
The demo content is often picked to best show off a device and avoid content that might show flaws. For example, content that is busy with lots of visual “noise” will hide problems like image uniformity and dead pixels. Usually, the toughest test patterns are the simplest, as one will immediately be able to tell if something is wrong. I typically like patterns with a mostly white screen to check for uniformity and a mostly black screen to check for contrast, with some details in the patterns to show resolution and some large spots to check for unwanted reflections. For example, see my test patterns, which are free to download. When trying on a headset that supports a web browser, I will navigate to my test pattern page and select one of the test patterns.
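For readers who want to try this themselves, here is a minimal sketch of the kind of patterns described above (my own illustrative code using Pillow, not the author’s actual test files):

```python
# A minimal sketch (mine, for illustration) of simple display test patterns:
# a mostly-uniform field with fine details and large spots, per the text.
from PIL import Image, ImageDraw

W, H = 1920, 1080  # arbitrary display resolution for illustration

def test_pattern(background: str, detail: str) -> Image.Image:
    """Mostly-uniform field with small details and large spots."""
    img = Image.new("RGB", (W, H), background)
    draw = ImageDraw.Draw(img)
    # Fine 1-pixel line pairs in the center to judge resolution.
    for x in range(W // 2 - 40, W // 2 + 40, 4):
        draw.line([(x, H // 2 - 60), (x, H // 2 + 60)], fill=detail, width=1)
    # Large spots near opposite corners to reveal reflections/ghosting.
    for cx, cy in [(150, 150), (W - 150, H - 150)]:
        draw.ellipse([cx - 60, cy - 60, cx + 60, cy + 60], fill=detail)
    return img

test_pattern("white", "black").save("mostly_white.png")  # uniformity check
test_pattern("black", "white").save("mostly_black.png")  # contrast check
```

The mostly-white image exposes uniformity problems; the mostly-black one exposes contrast and reflection problems, matching the approach described above.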
Most of the companies that are getting early devices will have a special relationship with the manufacturer. They have a vested interest in seeing that the product succeeds, either for their internal program or because they hope to develop software for the device. They certainly won’t want to be seen as causing Meta problems. They tend to direct their negative opinions to the manufacturer, not public forums.
Only with independent testing by people with display experience using their own test content will we understand the image quality of Orion.
Worldwide celebrations demonstrate the ways thousands of IEEE members in local communities join together to collaborate on ideas that leverage technology for a better tomorrow.
Celebrate IEEE Day with colleagues from IEEE Sections, Student Branches, Affinity groups, and Society Chapters. Events happen both virtually and in person all around the world.
Join the celebration around the world!
Every year, IEEE members from IEEE Sections, Student Branches, Affinity groups, and Society Chapters join hands to celebrate IEEE Day, which commemorates the first time in history that engineers worldwide gathered to share their technical ideas, in 1884.
Have some fun and compete in the photo and video contests. Get your phone and camera ready when you attend one of the events. This year we will have both Photo and Video Contests. You can submit your entries in STEM, technical, humanitarian and social categories.
Imagine if your boss called a meeting in May to announce that he’s committing 10 percent of the company’s revenue to the development of a brand-new mass-market consumer product, made with a not-yet-ready-for-mass-production component. Oh, and he wants it on store shelves in less than six months, in time for the holiday shopping season. Ambitious, yes. Kind of nuts, also yes.
But that’s pretty much what Pat Haggerty, vice president of Texas Instruments, did in 1954. The result was the Regency TR-1, the world’s first commercial transistor radio, which debuted 70 years ago this month. The engineers delivered on Haggerty’s audacious goal, and I certainly hope they received a substantial year-end bonus.
Why did Texas Instruments make the Regency TR-1 transistor radio?
But how did Texas Instruments come to make a transistor radio in the first place? TI traces its roots to a company called Geophysical Service Inc. (GSI), which made seismic instrumentation for the oil industry as well as electronics for the military. In 1945, GSI hired Patrick E. Haggerty as the general manager of its laboratory and manufacturing division and its electronics work. By 1951, Haggerty’s division was significantly outpacing GSI’s geophysical division, and so the Dallas-based company reorganized as Texas Instruments to focus on electronics.
Meanwhile, on 30 June 1948, Bell Labs announced John Bardeen and Walter Brattain’s game-changing invention of the transistor. No longer would electronics be dependent on large, hot vacuum tubes. The U.S. government chose not to classify the technology because of its potentially broad applications. In 1951, Bell Labs began licensing the transistor for US $25,000 through the Western Electric Co.; Haggerty bought a license for TI the following year.
“The engineers delivered on Haggerty’s audacious goal, and I certainly hope they received a substantial year-end bonus.”
TI was still a small company, with not much in the way of R&D capacity. But Haggerty and the other founders wanted it to become a big and profitable company. And so they established research labs to focus on semiconductor materials and a project-engineering group to develop marketable products.
The TR-1 was the first transistor radio, and it ignited a desire for portable gadgets that continues to this day. Bettmann/Getty Images
Haggerty made a good investment when he hired Gordon Teal, a 22-year veteran of Bell Labs. Although Teal wasn’t part of the team that invented the germanium transistor, he recognized that it could be improved by using a single grown crystal, such as silicon. Haggerty was familiar with Teal’s work from a 1951 Bell Labs symposium on transistor technology. Teal happened to be homesick for his native Texas, so when TI advertised for a research director in the New York Times, he applied, and Haggerty offered him the job of assistant vice president instead. Teal started at TI on 1 January 1953.
Fifteen months later, Teal gave Haggerty a demonstration of the first silicon transistor, and he presented his findings three and a half weeks later at the Institute of Radio Engineers’ National Conference on Airborne Electronics, in Dayton, Ohio. His innocuously titled paper, “Some Recent Developments in Silicon and Germanium Materials and Devices,” completely understated the magnitude of the announcement. The audience was astounded to hear that TI had not just one but three types of silicon transistors already in production, as Michael Riordan recounts in his excellent article “The Lost History of the Transistor” (IEEE Spectrum, October 2004).
And fun fact: The TR-1 shown at top once belonged to Willis Adcock, a physical chemist hired by Teal to perfect TI’s silicon transistors as well as transistors for the TR-1. (The radio is now in the collections of the Smithsonian’s National Museum of American History.)
The TR-1 became a product in less than six months
This advancement in silicon put TI on the map as a major player in the transistor industry, but Haggerty was impatient. He wanted a transistorized commercial product now, even if that meant using germanium transistors. On 21 May 1954, Haggerty challenged a research group at TI to have a working prototype of a transistor radio by the following week; four days later, the team came through, with a breadboard containing eight transistors. Haggerty decided that was good enough to commit $2 million—just under 10 percent of TI’s revenue—to commercializing the radio.
Of course, a working prototype is not the same as a mass-production product, and Haggerty knew TI needed a partner to help manufacture the radio. That partner turned out to be Industrial Development Engineering Associates (IDEA), a small company out of Indianapolis that specialized in antenna boosters and other electronic goods. They signed an agreement in June 1954 with the goal of announcing the new radio in October. TI would provide the components, and IDEA would manufacture the radio under its Regency brand.
Germanium transistors at the time cost $10 to $15 apiece. With eight transistors, the radio was too expensive to be marketed at the desired price point of $50 (more than $580 today, which is coincidentally about what it’ll cost you to buy one in good condition on eBay). Vacuum-tube radios were selling for less, but TI and IDEA figured early adopters would pay that much to try out a new technology. Part of Haggerty’s strategy was to increase the volume of transistor production to eventually lower the per-transistor cost, which he managed to push down to about $2.50.
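Spelling out the arithmetic (using the midpoint of the quoted transistor prices for illustration):

```latex
8 \times \$12.50 = \$100 \;\text{in transistors alone, double the}\; \$50 \;\text{target price;}
\qquad 4 \times \$2.50 = \$10 \;\text{after the cost push.}
```

The redesign down to four transistors, described below, mattered as much as the per-transistor price cut.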
By the time TI met with IDEA, the breadboard was down to six transistors. It was IDEA’s challenge to figure out how to make the transistorized radio at a profit. According to an oral history with Richard Koch, IDEA’s chief engineer on the project, TI’s real goal was to make transistors, and the radio was simply the gimmick to get there. In fact, part of the TI–IDEA agreement was that any patents that came out of the project would be in the public domain so that TI was free to sell more transistors to other buyers.
At the initial meeting, Koch, who had never seen a transistor before in real life, suggested substituting a germanium diode for the detector (which extracted the audio signal from the desired radio frequency), bringing the transistor count down to five. After thinking about the configuration a bit more, Koch eliminated another transistor by using a single transistor for the oscillator/mixer circuit.
TI’s original prototype used eight germanium transistors, which engineers reduced to six and, ultimately, four for the production model. Division of Work and Industry/National Museum of American History/Smithsonian Institution
The final design used four transistors in a superheterodyne configuration, a type of receiver that combines two frequencies to produce an intermediate frequency that can be easily amplified, thereby boosting a weak signal and decreasing the required antenna size. The TR-1 had two transistors as intermediate-frequency amplifiers and one as an audio amplifier, plus the oscillator/mixer. Koch applied for a patent for the circuitry the following year.
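In a superheterodyne receiver, the incoming station frequency is mixed with a local oscillator to produce a fixed intermediate frequency that the following stages are tuned to amplify. (The 455 kHz figure below is the common AM broadcast IF, used purely for illustration rather than as the TR-1’s documented value.)

```latex
f_{IF} = \left| f_{LO} - f_{RF} \right|,
\qquad
f_{RF} = 1000\ \mathrm{kHz},\; f_{LO} = 1455\ \mathrm{kHz}
\;\Rightarrow\; f_{IF} = 455\ \mathrm{kHz}
```

Because the IF stays fixed no matter which station is tuned, the IF amplifier stages can be optimized for one frequency, which is what let the TR-1 get away with just two IF-amplifier transistors.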
The radio ran on a 22.5-volt battery, which offered a playing life of 20 to 30 hours and cost about $1.25. (Such batteries were also used in the external power and electronics pack for hearing aids, the only other consumer product to use transistors up until this point.)
While IDEA’s team was working on the circuitry, they outsourced the design of the TR-1’s packaging to the Chicago firm of Painter, Teague, and Petertil. Their first design didn’t work because the components didn’t fit. Would their second design be better? As Koch later recalled, IDEA’s purchasing agent, Floyd Hayhurst, picked up the molding dies for the radio cases in Chicago and rushed them back to Indianapolis. He arrived at 2:00 in the morning, and the team got to work. Fortunately, everything fit this time. The plastic case was a little warped, but that was simple to fix: They slapped a wooden piece on each case as it came off the line so it wouldn’t twist as it cooled.
This video shows how each radio was assembled by hand:
On 18 October 1954, Texas Instruments announced the first commercial transistorized radio. It would be available in select outlets in New York and Los Angeles beginning 1 November, with wider distribution once production ramped up. The Regency TR-1 Transistor Pocket Radio initially came in black, gray, red, and ivory. They later added green and mahogany, as well as a run of pearlescents and translucents: lavender, pearl white, meridian blue, powder pink, and lime.
The TR-1 got so-so reviews, faced competition
Consumer Reports was not enthusiastic about the Regency TR-1. In its April 1955 review, it found that transmission of speech was “adequate” under good conditions, but music transmission was unsatisfactory under any conditions, especially on a noisy street or crowded beach. The magazine used words such as whistle, squeal, thin, tinny, and high-pitched to describe various sounds—not exactly high praise for a radio. It also found fault with the on/off switch. Its recommendation: Wait for further refinement before buying one.
More than 100,000 TR-1s were sold in its first year, but the radio was never very profitable. Archive PL/Alamy
The engineers at TI and IDEA didn’t necessarily disagree. They knew they were making a sound-quality trade-off by going with just four transistors. They also had quality-control problems with the transistors and other components, with initial failure rates up to 50 percent. Eventually, IDEA got the failure rate down to 12 to 15 percent.
Unbeknownst to TI or IDEA, Raytheon was also working on a transistorized radio—a tabletop model rather than a pocket-size one. That gave them the space to use six transistors, which significantly upped the sound quality. Raytheon’s radio came out in February 1955. Priced at $79.95, it weighed 2 kilograms and ran on four D-cell batteries. That August, a small Japanese company called Tokyo Telecommunications Engineering Corp. released its first transistor radio, the TR-55. A few years later, the company changed its name to Sony and went on to dominate the world’s consumer radio market.
The legacy of the Regency TR-1
The Regency TR-1 was a success by many measures: It sold 100,000 in its first year, and it helped jump-start the transistor market. But the radio was never very profitable. Within a few years, both Texas Instruments and IDEA left the commercial AM radio business, TI to focus on semiconductors, and IDEA to concentrate on citizens band radios. Yet Pat Haggerty estimated that this little pocket radio pushed the market in transistorized consumer goods ahead by two years. It was a leap of faith that worked out, thanks to some hardworking engineers with a vision.
Part of a continuing series looking at historical artifacts that embrace the boundless potential of technology.
An abridged version of this article appears in the October 2024 print issue as “The First Transistor Radio.”
References
In 1984, Michael Wolff conducted oral histories with IDEA’s lead engineer Richard Koch and purchasing agent Floyd Hayhurst. Wolff used them the following year in his IEEE Spectrum article “The Secret Six-Month Project,” which includes some great references at the end.
Robert J. Simcoe wrote “The Revolution in Your Pocket” for the fall 2004 issue of Invention and Technology to commemorate the 50th anniversary of the Regency TR-1.
As with many collectibles, the Regency TR-1 has its champions who have gathered together many primary sources. For example, Steve Reyer, a professor of electrical engineering at the Milwaukee School of Engineering before he passed away in 2018, organized his efforts in a webpage that’s now hosted by https://www.collectornet.net.
Four decades after the first IEEE International Conference on Robotics and Automation (ICRA) in Atlanta, robotics is bigger than ever. Next week in Rotterdam is the IEEE ICRA@40 conference, “a celebration of 40 years of pioneering research and technological advancements in robotics and automation.” There’s an ICRA every year, of course. Arguably the largest robotics research conference in the world, the 2024 edition was held in Yokohama, Japan back in May.
ICRA@40 is not just a second ICRA conference in 2024. Next week’s conference is a single track that promises “a journey through the evolution of robotics and automation,” through four days of short keynotes from prominent roboticists from across the entire field. You can see for yourself, the speaker list is nuts. There are also debates and panels tackling big ideas, like: “What progress has been made in different areas of robotics and automation over the past decades, and what key challenges remain?” Personally, I’d say “lots” and “most of them,” but that’s probably why I’m not going to be up on stage.
Forty years ago is a long time, but it’s not that long, so just for fun, I had a look at the proceedings of ICRA 1984, which are available on IEEE Xplore if you’re curious. Here’s an excerpt of the foreword from the organizers, which included folks from International Business Machines and Bell Labs:
The proceedings of the first IEEE Computer Society International Conference on Robotics contains papers covering practically all aspects of robotics. The response to our call for papers has been overwhelming, and the number of papers submitted by authors outside the United States indicates the strong international interest in robotics. The Conference program includes papers on: computer vision; touch and other local sensing; manipulator kinematics, dynamics, control and simulation; robot programming languages, operating systems, representation, planning, man-machine interfaces; multiple and mobile robot systems. The technical level of the Conference is high with papers being presented by leading researchers in robotics. We believe that this conference, the first of a series to be sponsored by the IEEE, will provide a forum for the dissemination of fundamental research results in this fast developing field.
Technically, this was “ICR,” not “ICRA,” and it was put on by the IEEE Computer Society’s Technical Committee on Robotics, since there was no IEEE Robotics and Automation Society at that time; RAS didn’t get off the ground until 1987.
1984 ICR(A) had two tracks, and featured about 75 papers presented over three days. Looking through the proceedings, you’ll find lots of familiar names: Harry Asada, Ruzena Bajcsy, Ken Salisbury, Paolo Dario, Matt Mason, Toshio Fukuda, Ron Fearing, and Marc Raibert. Many of these folks will be at ICRA@40, so if you see them, make sure and thank them for helping to start it all, because 40 years of robotics is definitely something to celebrate.