Preparing for the unknown: A guide to future-proofing imaging IT

In an era of unprecedented technological advancement, the health-care industry stands at a crossroads. As health expenditure continues to outpace GDP growth in many countries, health-care executives grapple with crucial decisions on investment prioritization for digitization, innovation, and digital transformation. The imperative to provide high-quality, patient-centric care in an increasingly digital world has never been more pressing. At the forefront of this transformation is imaging IT—a critical component that’s evolving to meet the challenges of modern health care.

The future of imaging IT is characterized by interconnected systems, advanced analytics, robust data security, AI-driven enhancements, and agile infrastructure. Organizations that embrace these trends will be well-positioned to thrive in the changing health-care landscape. But what exactly does this future look like, and how can health-care providers prepare for it?

Networked care models: The new paradigm

The adoption of networked care models is set to revolutionize health-care delivery. These models foster collaboration among stakeholders, making patient information readily available and leading to more personalized and efficient care. As we move forward, expect to see health-care organizations increasingly investing in technologies that enable seamless data sharing and interoperability.

Imagine a scenario where a patient’s entire medical history, including imaging data from various specialists, is instantly accessible to any authorized health-care provider. This level of connectivity not only improves diagnosis and treatment but also enhances the overall patient experience.

Data integration and analytics: Unlocking insights

True data integration is becoming the norm in health care. Robust integrated image and data management (IDM) solutions are consolidating patient data from diverse sources. But the real game-changer lies in the application of advanced analytics and AI to this treasure trove of information.

By leveraging these technologies, medical professionals can extract meaningful insights from complex data sets, leading to quicker and more accurate diagnoses and treatment decisions. The potential for improving patient outcomes through data-driven decision-making is immense.

A case in point is the implementation of Syngo Carbon Image and Data Management (IDM) at Tirol Kliniken GmbH in Innsbruck, Austria. This solution consolidates all patient-centric data points in one place, including different image and photo formats, DICOM CDs, and digitized video sources from endoscopy or microscopy. The system digitizes all documents in their raw formats, enabling the distribution of native, actionable data throughout the enterprise.
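
Consolidation of this kind typically starts with indexing heterogeneous sources by patient. As a loose illustration of the idea (not the Syngo Carbon implementation), the sketch below groups DICOM studies by PatientID using the open-source pydicom library; the directory layout and the shape of the index are assumptions.

```python
# Illustrative only: indexing DICOM studies by patient so downstream
# systems can query one consolidated record. The file layout and the
# in-memory index are hypothetical; pydicom is a real library.
from collections import defaultdict
from pathlib import Path

import pydicom

def build_patient_index(dicom_root: str) -> dict[str, list[dict]]:
    """Walk a directory of DICOM files and group studies by PatientID."""
    index = defaultdict(list)
    for path in Path(dicom_root).rglob("*.dcm"):
        ds = pydicom.dcmread(path, stop_before_pixels=True)  # metadata only
        index[str(ds.PatientID)].append({
            "study_uid": str(ds.StudyInstanceUID),
            "modality": str(ds.get("Modality", "UNKNOWN")),
            "study_date": str(ds.get("StudyDate", "")),
            "path": str(path),
        })
    return index

if __name__ == "__main__":
    index = build_patient_index("/data/dicom")  # hypothetical mount point
    for patient_id, studies in index.items():
        print(patient_id, len(studies), "studies")
```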

Data privacy and edge computing: Balancing innovation and security

As health care becomes increasingly data-driven, concerns about data privacy remain paramount. Enter edge computing—a solution that enables the processing of sensitive patient data locally, reducing the risk of data breaches during processing and transmission.

This approach is crucial for health-care facilities aiming to maintain patient trust while adopting advanced technologies. By keeping data processing close to the source, health-care providers can leverage cutting-edge analytics without compromising on security.
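
The pattern is simple to state: inference runs where the data lives, and only derived, de-identified results travel. A minimal sketch, with a placeholder model and an assumed payload format, might look like this:

```python
# Minimal sketch of the edge-computing pattern described above: raw
# patient data stays on the local machine, and only a de-identified,
# derived result leaves it. The model, scoring logic, and payload
# format are all hypothetical placeholders.
import hashlib
import json

def run_model_locally(pixel_data: bytes) -> float:
    """Placeholder for on-premises inference (e.g., a risk score)."""
    return len(pixel_data) % 100 / 100.0  # stand-in for a real model

def process_on_edge(patient_id: str, pixel_data: bytes) -> str:
    score = run_model_locally(pixel_data)          # raw data never transmitted
    pseudonym = hashlib.sha256(patient_id.encode()).hexdigest()[:12]
    payload = {"subject": pseudonym, "risk_score": round(score, 3)}
    return json.dumps(payload)                     # only this leaves the site

print(process_on_edge("PAT-0042", b"\x00" * 512))
```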

Workflow integration and AI: Enhancing efficiency and accuracy

The integration of AI into medical imaging workflows is set to dramatically improve efficiency, accuracy, and the overall quality of patient care. AI-powered solutions are becoming increasingly common, reducing the burden of repetitive tasks and speeding up diagnosis.

From automated image analysis to predictive modeling, AI is transforming every aspect of the imaging workflow. This not only improves operational efficiency but also allows health-care professionals to focus more on patient care and complex cases that require human expertise.

A quantitative analysis at the Medical University of South Carolina demonstrates the impact of AI integration. With the support of deep learning algorithms fully embedded in the clinical workflow, cardiothoracic radiologists reduced chest CT interpretation times by 22.1% compared with workflows without AI support.

Virtualization: The key to agility

To future-proof their IT infrastructure, health-care organizations are turning to virtualization. This approach allows for modularization and flexibility, making it easier to adapt to rapidly evolving technologies such as AI-driven diagnostics.

Container technology is playing a pivotal role in optimizing resource utilization and scalability. By embracing virtualization, health-care providers can ensure their IT systems remain agile and responsive to changing needs.
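
To make the point concrete, here is a hedged sketch of container-based deployment using the Docker Python SDK: the same packaged service can be started, scaled, or swapped for a new version without touching the host. The image name, registry, and port scheme are hypothetical.

```python
# Requires the real `docker` Python SDK and a running Docker daemon.
# The image name and naming/port conventions below are assumptions.
import docker

def deploy_inference_service(image: str, replicas: int = 2) -> list:
    client = docker.from_env()
    containers = []
    for i in range(replicas):
        containers.append(client.containers.run(
            image,
            name=f"imaging-ai-{i}",          # hypothetical naming scheme
            ports={"8080/tcp": 8080 + i},    # one host port per replica
            detach=True,
        ))
    return containers

# Rolling out a new model version is then just: stop the old containers
# and call the same function with a new image tag.
if __name__ == "__main__":
    deploy_inference_service("registry.example.com/imaging-ai:1.2")
```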

Standardization and compliance: Ensuring long-term compatibility

As imaging IT systems evolve, adherence to industry standards and compliance requirements remains crucial. These systems need to seamlessly interact with Electronic Health Records (EHRs), medical devices, and other critical systems.

This adherence ensures long-term compatibility and the ability to accommodate emerging technologies. It also facilitates smoother integration of new solutions into existing IT ecosystems, reducing implementation challenges and costs.

Real-world success stories

The benefits of these technologies are not theoretical—they are being realized in health-care organizations around the world. For instance, the virtualization strategy implemented at University Hospital Essen (UME), one of Germany’s largest university hospitals, has dramatically improved the hospital’s ability to manage increasing data volumes and applications. UME’s critical clinical information systems now run on modular and virtualized systems, allowing experts to design and use innovative solutions, including AI tools that automate tasks previously done manually by IT and medical staff.

Similarly, the PANCAIM project leverages edge computing for pancreatic cancer detection. This EU-funded initiative uses Siemens Healthineers’ edge computing approach to develop and validate AI algorithms. At Karolinska Institutet, Sweden, an algorithm was implemented for a real pancreatic cancer case, ensuring sensitive patient data remains within the hospital while advancing AI validation in clinical settings.

Another innovative approach is the concept of a Common Patient Data Model (CPDM). This standardized framework defines how patient data is organized, stored, and exchanged across different health-care systems and platforms, addressing interoperability challenges in the current health-care landscape.
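
No CPDM schema is published in this piece, so the following is purely a hypothetical sketch of the concept in Python dataclasses: one shared structure that every system reads and writes, regardless of where a study originated.

```python
# Hypothetical sketch of a Common Patient Data Model. Field names are
# illustrative, not a published standard.
from dataclasses import dataclass, field

@dataclass
class ImagingStudy:
    study_uid: str
    modality: str        # e.g., "CT", "MR"
    performed_at: str    # ISO 8601 timestamp
    source_system: str   # PACS, endoscopy archive, etc.

@dataclass
class PatientRecord:
    patient_id: str      # enterprise-wide identifier
    name: str
    birth_date: str
    studies: list[ImagingStudy] = field(default_factory=list)

    def add_study(self, study: ImagingStudy) -> None:
        self.studies.append(study)

record = PatientRecord("MRN-1001", "Jane Doe", "1970-05-14")
record.add_study(ImagingStudy("1.2.840.0001", "CT", "2024-03-01T09:30:00", "PACS"))
print(len(record.studies), "study in common model")
```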

The road ahead: Continuous innovation

As we look to the future, it’s clear that technological advancements in radiology will continue at a rapid pace. To stay competitive and provide the best patient care, health-care organizations must prioritize ongoing innovation and the adoption of new technologies.

This includes not only IT systems but also medical devices and treatment methodologies. The health-care providers who embrace this ethos of continuous improvement will be best positioned to navigate the challenges and opportunities that lie ahead.

In conclusion, the future of imaging IT is bright, promising unprecedented levels of efficiency, accuracy, and patient-centricity. By embracing networked care models, leveraging advanced analytics and AI, prioritizing data security, and maintaining agile IT infrastructure, health-care organizations can ensure they’re prepared for whatever the future may hold.

The journey towards future-proof imaging IT may seem daunting, but it’s a necessary evolution in our quest to provide the best possible health care. As we stand on the brink of this new era, one thing is clear: the future of health care is digital, data-driven, and more connected than ever before.

To learn more, see the additional information available from Siemens Healthineers.

Syngo Carbon consists of several products which are (medical) devices in their own right. Some products are under development and not commercially available. Future availability cannot be ensured.

The results by Siemens Healthineers customers described herein are based on results that were achieved in the customer’s unique setting. Since there is no “typical” hospital and many variables exist (e.g., hospital size, case mix, level of IT adoption), it cannot be guaranteed that other customers will achieve the same results.

This content was produced by Siemens Healthineers. It was not written by MIT Technology Review’s editorial staff.

Readying business for the age of AI

Rapid advancements in AI technology offer unprecedented opportunities to enhance business operations, customer and employee engagement, and decision-making. Executives are eager to see the potential of AI realized. Among 100 C-suite respondents polled in WNS Analytics’ “The Future of Enterprise Data & AI” report, 76% say they are already implementing or planning to implement generative AI solutions. Among those same leaders, however, 67% report struggling with data migration, and others cite data quality issues, talent shortages, and data democratization challenges.

MIT Technology Review Insights recently had a conversation with Alex Sidgreaves, chief data officer at Zurich Insurance; Bogdan Szostek, chief data officer at Animal Friends Insurance; Shan Lodh, director of data platforms at Shawbrook Bank; and Gautam Singh, head of data, analytics, and AI at WNS Analytics, to discuss how enterprises can navigate the burgeoning era of AI.

AI across industries

There is no shortage of AI use cases across sectors. Retailers are tailoring shopping experiences to individual preferences by leveraging customer behavior data and advanced machine learning models. Traditional AI models can deliver personalized offerings; with generative AI, those offerings are elevated by tailored communication that considers the customer’s persona, behavior, and past interactions. In insurance, generative AI can identify subrogation recovery opportunities that a manual handler might overlook, enhancing efficiency and maximizing recovery potential. Banking and financial services institutions are using AI to bolster customer due diligence and enhance anti-money laundering efforts through AI-driven credit risk management practices. In health care, AI technologies are enhancing diagnostic accuracy through sophisticated image recognition in radiology, allowing for earlier and more precise detection of diseases, while predictive analytics enable personalized treatment plans.

The core of successful AI implementation lies in understanding its business value, building a robust data foundation, aligning with the strategic goals of the organization, and infusing skilled expertise across every level of an enterprise.

  • “I think we should also be asking ourselves, if we do succeed, what are we going to stop doing? Because when we empower colleagues through AI, we are giving them new capabilities [and] faster, quicker, leaner ways of doing things. So we need to be true to even thinking about the org design. Oftentimes, an AI program doesn’t work, not because the technology doesn’t work, but the downstream business processes or the organizational structures are still kept as before.” — Shan Lodh, director of data platforms, Shawbrook Bank

Whether automating routine tasks, enhancing customer experiences, or providing deeper insights through data analysis, it’s essential to define what AI can do for an enterprise in specific terms. AI’s popularity and broad promises are not good enough reasons to jump headfirst into enterprise-wide adoption. 

“AI projects should come from a value-led position rather than being led by technology,” says Sidgreaves. “The key is to always ensure you know what value you’re bringing to the business or to the customer with the AI. And actually always ask yourself the question, do we even need AI to solve that problem?”

Having a good technology partner is crucial to ensure that value is realized. Gautam Singh, head of data, analytics, and AI at WNS, says, “At WNS Analytics, we keep clients’ organizational goals at the center. We have focused and strengthened around core productized services that go deep in generating value for our clients.” Singh explains the approach: “We do this by leveraging our unique AI and human interaction approach to develop custom services and deliver differentiated outcomes.”

The foundation of any advanced technology adoption is data, and AI is no exception. Singh explains, “Advanced technologies like AI and generative AI may not always be the right choice, and hence we work with our clients to understand the need, to develop the right solution for each situation.” With increasingly large and complex data volumes, effectively managing and modernizing data infrastructure is essential to provide the basis for AI tools.

Breaking down silos and maximizing AI’s impact requires regular communication and collaboration across departments, from marketing teams working with data scientists to understand customer behavior patterns to IT teams ensuring their infrastructure supports AI initiatives.

  • “I would emphasize customers’ growing expectations in terms of what they expect our businesses to offer them and the quality and speed of service they provide. At Animal Friends, we see the generative AI potential to be the biggest with sophisticated chatbots and voice bots that can serve our customers 24/7, deliver the right level of service, and be cost effective for our customers.” — Bogdan Szostek, chief data officer, Animal Friends

Investing in domain experts with insight into regulations, operations, and industry practices is just as necessary to the success of deploying AI systems as the right data foundations and strategy. Continuous training and upskilling are essential to keep pace with evolving AI technologies.

Ensuring AI trust and transparency

Creating trust in generative AI implementation requires the same mechanisms employed for all emerging technologies: accountability, security, and ethical standards. Being transparent about how AI systems are used, the data they rely on, and the decision-making processes they employ can go a long way in forging trust among stakeholders. In fact, “The Future of Enterprise Data & AI” report finds that 55% of organizations identify “building trust in AI systems among stakeholders” as the biggest challenge when scaling AI initiatives.

“We need talent, we need communication, we need the ethical framework, we need very good data, and so on,” says Lodh. “Those things don’t really go away. In fact, they become even more necessary for generative AI, but of course the usages are more varied.” 

AI should augment human decision-making and business workflows. Guardrails with human oversight ensure that enterprise teams have access to AI tools but are in control of high-risk and high-value decisions.

“Bias in AI can creep in from almost anywhere and will do so unless you’re extremely careful. Challenges come into three buckets. You’ve got privacy challenges, data quality, completeness challenges, and then really training AI systems on data that’s biased, which is easily done,” says Sidgreaves. She emphasizes it is vital to ensure that data is up-to-date, accurate, and clean. High-quality data enhances the reliability and performance of AI models. Regular audits and data quality checks can help maintain the integrity of data.
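
As a toy illustration of such audits (not a production pipeline), the sketch below computes three of the signals Sidgreaves alludes to: completeness, freshness, and a crude class-balance check. The column names and the one-year freshness window are assumptions.

```python
# Toy data-quality audit: completeness, freshness, and class balance.
# Thresholds, column names, and record layout are all assumptions.
from datetime import datetime, timedelta

def audit(rows: list[dict], label_key: str = "label") -> dict:
    total = len(rows)
    complete = sum(1 for r in rows if all(v is not None for v in r.values()))
    fresh_cutoff = datetime.now() - timedelta(days=365)
    fresh = sum(1 for r in rows if r["updated_at"] >= fresh_cutoff)
    positives = sum(1 for r in rows if r[label_key] == 1)
    return {
        "completeness": complete / total,
        "freshness": fresh / total,
        "positive_rate": positives / total,  # a crude bias signal
    }

rows = [
    {"age": 40, "label": 1, "updated_at": datetime.now()},
    {"age": None, "label": 0, "updated_at": datetime.now() - timedelta(days=800)},
]
report = audit(rows)
assert report["completeness"] == 0.5
print(report)
```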

An agile approach to AI implementation

ROI is always top of mind for business leaders looking to cash in on the promised potential of AI systems. As technology continues to evolve rapidly and the potential use cases of AI grow, starting small, creating measurable benchmarks, and adopting an agile approach can ensure success in scaling solutions. By starting with pilot projects and scaling successful initiatives, companies can manage risks and optimize resources. Sidgreaves, Szostek, and Lodh stress that while it may be tempting to throw everything at the wall and see what sticks, accessing the greatest returns from expanding AI tools means remaining flexible, strategic, and iterative. 

In insurance, two areas where AI has a significant ROI impact are risk and operational efficiency. Sidgreaves underscores that reducing manual processes is essential for large, heritage organizations, and generative AI and large language models (LLMs) are revolutionizing this aspect by significantly diminishing the need for manual activities.

To illustrate her point, she cites a specific example: “Consider the task of reviewing and drafting policy wording. Traditionally, this process would take an individual up to four weeks. However, with LLMs, this same task can now be completed in a matter of seconds.”  

Lodh adds that establishing ROI at the project’s onset and implementing cross-functional metrics are crucial for capturing a comprehensive view of a project’s impact. For instance, using LLMs for writing code is a great example of how IT and information security teams can collaborate. By assessing the quality of static code analysis generated by LLMs, these teams can ensure that the code meets security and performance standards.
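
One hedged sketch of how that collaboration could be automated: gate LLM-generated code on a static analyzer before human review. Bandit is a real Python security linter, but its use here, along with the severity threshold and pass/fail policy, is an assumption for illustration (and it must be installed for the sketch to run).

```python
# Gate generated code on static analysis before a human reviews it.
# Bandit exits non-zero when it finds issues at or above the chosen
# severity; that policy choice is an assumption, not Lodh's process.
import subprocess
import tempfile

def passes_static_analysis(generated_code: str) -> bool:
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(generated_code)
        path = f.name
    result = subprocess.run(
        ["bandit", "-q", "--severity-level", "medium", path],
        capture_output=True, text=True,
    )
    return result.returncode == 0

# shell=True is a classic security finding, so this snippet should fail.
snippet = "import subprocess\nsubprocess.call('ls', shell=True)\n"
print("accept" if passes_static_analysis(snippet) else "send back to the model")
```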

“It’s very hard because technology is changing so quickly,” says Szostek. “We need to truly apply an agile approach, do not try to prescribe all the elements of the future deliveries in 12, 18, 24 months. We have to test and learn and iterate, and also fail fast if that’s needed.” 

Navigating the future of the AI era 

The rapid evolution of the digital age continues to bring immense opportunities for enterprises globally, from the C-suite to the factory floor. With no shortage of use cases and promises to boost efficiencies, drive innovation, and improve customer and employee experiences, few business leaders dismiss the proliferation of AI as mere hype. However, the successful and responsible implementation of AI requires a careful balance of strategy, transparency, and robust data privacy and security measures.

  • “It’s really easy as technology people to be driven by the next core thing, but we would have to be solving a business problem. So the key is to always ensure you know what value you’re bringing to the business or to the customer with the AI. And actually always ask yourself the question, do we even need AI to solve that problem?” — Alex Sidgreaves, chief data officer, Zurich Insurance

Fully harnessing the power of AI while maintaining trust means defining clear business values, ensuring accountability, managing data privacy, balancing innovation with ethical use, and staying ahead of future trends. Enterprises must remain vigilant and adaptable, committed to ethical practices and an agile approach to thrive in this rapidly changing business landscape.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Sydney’s Tech Super-Cluster Propels Australia’s AI Industry Forward



This is a sponsored article brought to you by BESydney.

Australia has experienced a remarkable surge in AI enterprise during the past decade. Significant AI research and commercialization concentrated in Sydney drives the sector’s development nationwide and influences AI trends globally. The city’s cutting-edge AI sector sees academia, business and government converge to foster groundbreaking advancements, positioning Australia as a key player on the international stage.

Sydney – home to half of Australia’s AI companies

Sydney has been pinpointed as one of four urban super-clusters in Australia, featuring the highest number of tech firms and the most substantial research in the country.

The Geography of Australia’s Digital Industries report, commissioned by Australia’s national science agency, the Commonwealth Scientific and Industrial Research Organisation (CSIRO), and the Tech Council of Australia, found Sydney is home to 119,636 digital professionals and 81 digital technology companies listed on the Australian Securities Exchange with a combined worth of A$52 billion.

AI is infusing all areas of this tech landscape. According to CSIRO, more than 200 active AI companies operate across Greater Sydney, representing almost half of the country’s 544 AI companies.

“Sydney is the capital of AI startups for Australia and this part of Australasia”
—Toby Walsh, UNSW Sydney

With this extensive AI commercialization and collaboration in progress across Sydney, AI startups are flourishing.

“Sydney is the capital of AI startups for Australia and this part of Australasia,” according to Professor Toby Walsh, Scientia Professor of Artificial Intelligence at the Department of Computer Science and Engineering at the University of New South Wales (UNSW Sydney).

He cites robotics, AI in medicine and fintech as three areas where Sydney leads the world in AI innovation.

“As a whole, Australia punches well above its weight in the AI sector,” Professor Walsh says. “We’re easily in the top 10, and by some metrics, we’re in the top five in the world. For a country of just 25 million people, that is quite remarkable.”

Sydney’s universities at the forefront of AI research

A key to Sydney’s success in the sector is the strength of its universities, which are producing outstanding research.

In 2021, the University of Sydney (USYD), the University of New South Wales (UNSW Sydney), and the University of Technology Sydney (UTS) collectively produced more than 1,000 peer-reviewed publications in artificial intelligence, contributing significantly to the field’s development.

According to CSIRO, Australia’s research and development sector has higher rates of AI adoption than global averages, with Sydney presenting the highest AI publishing intensity among Australian universities and research institutes.

Professor Aaron Quigley, Science Director and Deputy Director of CSIRO’s Data61 and Head of School in Computer Science and Engineering at UNSW Sydney, says Sydney’s AI prowess is supported by a robust educational pipeline that supplies skilled graduates to a wide range of industries that are rapidly adopting AI technologies.

“Sydney’s AI sector is backed up by the fact that you have such a large educational environment with universities like UTS, USYD and UNSW Sydney,” he says. “They rank in the top five of AI locations in Australia.”

UNSW Sydney is a heavy hitter, with more than 300 researchers applying AI across various critical fields such as hydrogen fuel catalysis, coastal monitoring, safe mining, medical diagnostics, epidemiology and stress management.


UNSW Sydney’s AI Institute also has the largest concentration of academics working in AI in the country, adds Professor Walsh.

“One of the main reasons the AI Institute exists at UNSW Sydney is to be a front door to industry and government, to help translate the technology out of the laboratory and into practice,” he says.

Likewise, the Sydney Artificial Intelligence Centre at the University of Sydney, the Australian Artificial Intelligence Institute at UTS, and Macquarie University’s Centre for Applied Artificial Intelligence are producing world-leading research in collaboration with industry.

Alongside the universities, the Australian Government’s National AI Centre in Sydney aims to support and accelerate Australia’s AI industry.

Synergies in Sydney: where tech titans converge

Sydney’s vortex of tech talent has meant exciting connections and collaborations are happening at lightning speed, allowing simultaneous growth of several high-value industries.

The intersection between quantum computing and AI came into sharper focus with the April 2024 announcement of a new Australian Centre for Quantum Growth at the University of Sydney. The centre aims to build strategic and lasting relationships that drive innovation and increase the nation’s competitiveness within the field. Funded under the Australian Government’s National Quantum Strategy, it also aims to promote the industry and enhance Australia’s global standing.

“There’s nowhere else in the world that you’re going to get a quantum company, a games company, and a cybersecurity company in such close proximity across this super-cluster arc located in Sydney”
—Aaron Quigley, UNSW Sydney

“There’s a huge amount of experience in the quantum space in Sydney,” says Professor Quigley. “Then you have a large number of companies and researchers working in cybersecurity, so you have the cybersecurity-AI nexus as well. Then you’ve got a large number of media companies and gaming companies in Sydney, so you’ve got the interconnection between gaming and creative technologies and AI.”

“So it’s a confluence of different industry spaces, and if you come here, you can tap into these different specialisms,” he adds. “There’s nowhere else in the world that you’re going to get a quantum company, a games company, and a cybersecurity company in such close proximity across this super-cluster arc located in Sydney.”

A global hub for AI innovation and collaboration

In addition to its research and industry achievements in the AI sector, Sydney is also a leading destination for AI conferences and events. The Women in AI Asia Pacific Conference is held in Sydney each year, adding much-needed diversity to the mix.

Additionally, the prestigious International Joint Conference on Artificial Intelligence was held in Sydney in 1991.

Overall, Sydney’s integrated approach to AI development, characterized by strong academic output, supportive government policies, and vibrant commercial activity, firmly establishes it as a leader in the global AI landscape.

To discover more about how Sydney is shaping the future of AI, download the latest eBook on Sydney’s Science & Engineering industry at besydney.com.au

The rise of the data platform for hybrid cloud

Whether pursuing digital transformation, exploring the potential of AI, or simply looking to simplify and optimize existing IT infrastructure, today’s organizations must do this in the context of increasingly complex multi-cloud environments. These complicated architectures are here to stay—2023 research by Enterprise Strategy Group, for example, found that 87% of organizations expect their applications to be distributed across still more locations in the next two years.

Scott Sinclair, practice director at Enterprise Strategy Group, outlines the problem: “Data is becoming more distributed. Apps are becoming more distributed. The typical organization has multiple data centers, multiple cloud providers, and umpteen edge locations. Data is all over the place and continues to be created at a very rapid rate.”

Finding a way to unify this disparate data is essential. In doing so, organizations must balance the explosive growth of enterprise data; the need for an on-premises, cloud-like consumption model to mitigate cyberattack risks; and continual pressure to cut costs and improve performance.

Sinclair summarizes: “What you want is something that can sit on top of this distributed data ecosystem and present something that is intuitive and consistent that I can use to leverage the data in the most impactful way, the most beneficial way to my business.”

For many, the solution is an overarching software-defined, virtualized data platform that delivers a common data plane and control plane across hybrid cloud environments. Ian Clatworthy, head of data platform product marketing at Hitachi Vantara, describes a data platform as “an integrated set of technologies that meets an organization’s data needs, enabling storage and delivery of data, the governance of data, and the security of data for a business.”

Gartner projects that these consolidated data storage platforms will constitute 70% of file and object storage by 2028, doubling from 35% in 2023. The research firm underscores that “Infrastructure and operations leaders must prioritize storage platforms to stay ahead of business demands.”

A transitional moment for enterprise data

Historically, organizations have stored their various types of data—file, block, object—in separate silos. Why change now? Because two main drivers are rendering traditional data storage schemes inadequate for today’s business needs: digital transformation and AI.

As digital transformation initiatives accelerate, organizations are discovering that having distinct storage solutions for each workload is inadequate for their escalating data volumes and changing business landscapes. The complexity of the modern data estate hinders many efforts toward change.

Clatworthy says that when organizations move to hybrid cloud environments, they may find, for example, that they have mainframe or data center data stored in one silo, block storage running on an appliance, apps running file storage, another silo for public cloud, and a separate VMware stack. The result is increased complexity and cost in their IT infrastructure, as well as reduced flexibility and efficiency.

Then, Clatworthy adds, “When we get to the world of generative AI that’s bubbling around the edges, and we’re going to have this mass explosion of data, we need to simplify how that data is managed so that applications can consume it. That’s where a platform comes in.”

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Advancing to adaptive cloud

For many years now, cloud solutions have helped organizations streamline their operations, increase their scalability, and reduce costs. Yet, enterprise cloud investment has been fragmented, often lacking a coherent organization-wide approach. In fact, it’s not uncommon for various teams across an organization to have spun up their own cloud projects, adopting a wide variety of cloud strategies and providers, from public and hybrid to multi-cloud and edge computing.

The problem with this approach is that it often leads to “a sprawling set of systems and disparate teams working on these cloud systems, making it difficult to keep up with the pace of innovation,” says Bernardo Caldas, corporate vice president of Azure Edge product management at Microsoft. In addition to being an IT headache, a fragmented cloud environment leads to technological and organizational repercussions.

A complex multi-cloud deployment can make it difficult for IT teams to perform mission-critical tasks, such as applying security patches, meeting regulatory requirements, managing costs, and accessing data for data analytics. Configuring and securing these types of environments is a challenging and time-consuming task. And ad hoc cloud deployments often culminate in systems incompatibility when one-off pilots are ready to scale or be combined with existing products.

Without a common IT operations and application development platform, teams can’t share lessons learned or pool important resources, which tends to cause them to become increasingly siloed. “People want to do more with their data, but if their data is trapped and isolated in these different systems, it can make it really hard to tap into the data for insights and to accelerate progress,” says Caldas.

As the pace of change accelerates, however, many organizations are adopting a new adaptive cloud approach—one that will enable them to respond quickly to evolving consumer demands and market fluctuations while simplifying the management of their complex cloud environments.

An adaptive strategy for success

Heralding a departure from yesteryear’s fragmented cloud environments, an adaptive cloud approach unites sprawling systems, disparate silos, and distributed sites into a single operations, development, security, application, and data model. This unified approach empowers organizations to glean value from cloud-native technologies, open source software such as Linux, and AI across hybrid, multi-cloud, edge, and IoT.

“You’ve got a lot of legacy software out there, and for the most part, you don’t want to change production environments,” says David Harmon, director of software engineering at AMD. “Nobody wants to change code. So while CTOs and developers really want to take advantage of all the hardware changes, they want to do nothing to their code base if possible, because that change is very, very expensive.”

An adaptive cloud approach answers this challenge by taking an agnostic approach to the environments it brings together on a single control plane. By seamlessly connecting disparate computing environments, including those that run outside of hyperscale data centers, the control plane creates greater visibility across thousands of assets, simplifies security enforcement, and allows for easier management.

An adaptive cloud approach enables unified management of disparate systems and resources, leading to improved oversight and control. An adaptive approach also creates scalability, as it allows organizations to meet the fluctuating demands of a business without the risk of over-provisioning or under-provisioning resources.

There are also clear business advantages to embracing an adaptive cloud approach. Consider, for example, an operational technology team that deploys an automation system to accelerate a factory’s production capabilities. In a fragmented and distributed environment, systems often struggle to communicate. But in an adaptive cloud environment, a factory’s automation system can easily be connected to the organization’s customer relationship management system, providing sales teams with real-time insights into supply-demand fluctuations.

A united platform is not only capable of bringing together disparate systems but also of connecting employees from across functions, from sales to engineering. By sharing an interconnected web of cloud-native tools, a workforce’s collective skills and knowledge can be applied to initiatives across the organization—a valuable asset in today’s resource-strapped and talent-scarce business climate.

Using cloud-native technologies like Kubernetes and microservices can also expedite the development of applications across various environments, regardless of an application’s purpose. For example, IT teams can scale applications from massive cloud platforms to on-site production without complex rewrites. Together, these capabilities “propel innovation, simplify complexity, and enhance the ability to respond to business opportunities,” says Caldas.
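
For example, with the official Kubernetes Python client, resizing a service is a one-call operation against the same Deployment object rather than a rewrite; the deployment name, namespace, and replica count below are hypothetical.

```python
# Minimal sketch of "scale without complex rewrites" using the real
# Kubernetes Python client. Names and counts are hypothetical.
from kubernetes import client, config

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    config.load_kube_config()                      # local kubeconfig
    apps = client.AppsV1Api()
    body = {"spec": {"replicas": replicas}}
    apps.patch_namespaced_deployment_scale(name, namespace, body)

if __name__ == "__main__":
    # Burst a (hypothetical) order-processing service for a sales event.
    scale_deployment("order-processor", "production", replicas=10)
```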

The AI equation

From automating mundane processes to optimizing operations, AI is revolutionizing the way businesses work. In fact, the market for AI reached $184 billion in 2024, a staggering jump of nearly $50 billion over 2023, and it is expected to surpass $826 billion in 2030.

But AI applications and models require high-quality data to generate high-quality outputs. That’s a challenging feat when data sets are trapped in silos across distributed environments. Fortunately, an adaptive cloud approach can provide a unified data platform for AI initiatives.

“An adaptive cloud approach consolidates data from various locations in a way that’s more useful for companies and creates a robust foundation for AI applications,” says Caldas. “It creates a unified data platform that ensures that companies’ AI tools have access to high-quality data to make decisions.”

Another benefit of an adaptive cloud approach is the ability to tap into the capabilities of innovative tools such as Microsoft Copilot in Azure. Copilot in Azure is an AI companion that simplifies how IT teams operate and troubleshoot apps and infrastructure. By leveraging large language models to interact with an organization’s data, Copilot allows for deeper exploration and intelligent assessment of systems within a unified management framework.

Imagine, for example, the task of troubleshooting the root cause of a system anomaly. Typically, IT teams must sift through thousands of logs, exchange a series of emails with colleagues, and read documentation for answers. Copilot in Azure, however, can cut through this complexity by easing anomaly detection of unanticipated system changes while, at the same time, providing recommendations for speedy resolution.
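
Copilot in Azure’s internals are not public, but the underlying task, flagging unanticipated changes in a stream of system metrics, can be illustrated generically. The sketch below applies a simple rolling z-score; the window size and threshold are arbitrary assumptions, not Microsoft’s method.

```python
# Generic anomaly detection on a metric series (e.g., error rate per
# minute) via rolling z-score. Window and threshold are assumptions.
from statistics import mean, stdev

def find_anomalies(series: list[float], window: int = 20, z_max: float = 3.0):
    anomalies = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(series[i] - mu) / sigma > z_max:
            anomalies.append((i, series[i]))
    return anomalies

# 200 quiet samples, then a spike such as a sudden error-rate jump.
series = [10.0 + (i % 3) for i in range(200)] + [95.0]
print(find_anomalies(series))   # -> [(200, 95.0)]
```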

“Organizations can now interact with systems using chat capabilities, ask questions about environments, and gain real insights into what’s happening across the heterogeneous environments,” says Caldas.

An adaptive approach for the technology future

Today’s technology environments are only increasing in complexity. More systems, more data, more applications—together, they form a massive sprawling infrastructure. But responding quickly to change, be it in market trends or customer needs, requires greater agility and integration across the organization. The answer: an adaptive approach. A unified platform for IT operations and management, applications, data, and security can consolidate the disparate parts of a fragmented environment in ways that not only ease IT management and application development but also deliver key business benefits, from faster time to market to AI efficiencies, at a time when organizations must move swiftly to succeed.

Microsoft Azure and AMD meet you where you are on your cloud journey. Learn more about an adaptive cloud approach with Azure.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

A playbook for crafting AI strategy

Giddy predictions about AI, from its contributions to economic growth to the onset of mass automation, are now as frequent as the release of powerful new generative AI models. The consultancy PwC, for example, predicts that AI could boost global gross domestic product (GDP) by 14% by 2030, generating US $15.7 trillion.

Forty percent of our mundane tasks could be automated by then, claim researchers at the University of Oxford, while Goldman Sachs forecasts US $200 billion in AI investment by 2025. “No job, no function will remain untouched by AI,” says SP Singh, senior vice president and global head, enterprise application integration and services, at technology company Infosys.

While these prognostications may prove true, today’s businesses are finding major hurdles when they seek to graduate from pilots and experiments to enterprise-wide AI deployment. Just 5.4% of US businesses, for example, were using AI to produce a product or service in 2024.

Moving from initial forays into AI use, such as code generation and customer service, to firm-wide integration depends on strategic and organizational transitions in infrastructure, data governance, and supplier ecosystems. In addition, organizations must weigh uncertainties about developments in AI performance and how to measure return on investment.

If organizations seek to scale AI across the business in coming years, however, now is the time to act. This report explores the current state of enterprise AI adoption and offers a playbook for crafting an AI strategy, helping business leaders bridge the chasm between ambition and execution. Key findings include the following:

AI ambitions are substantial, but few have scaled beyond pilots. Fully 95% of companies surveyed are already using AI, and 99% expect to in the future. But few organizations have graduated beyond pilot projects: 76% have deployed AI in just one to three use cases. Because half of companies expect to fully deploy AI across all business functions within two years, however, this year is key to establishing foundations for enterprise-wide AI.

AI readiness spending is slated to rise significantly. Overall, AI spending in 2022 and 2023 was modest or flat for most companies, with only one in four increasing their spending by more than a quarter. That is set to change in 2024, with nine in ten respondents expecting to increase AI spending on data readiness (including platform modernization, cloud migration, and data quality) and in adjacent areas like strategy, cultural change, and business models. Four in ten expect to increase spending by 10 to 24%, and one-third expect to increase spending by 25 to 49%.

Data liquidity is one of the most important attributes for AI deployment. The ability to seamlessly access, combine, and analyze data from various sources enables firms to extract relevant information and apply it effectively to specific business scenarios. It also eliminates the need to sift through vast data repositories, as the data is already curated and tailored to the task at hand.

Data quality is a major limitation for AI deployment. Half of respondents cite data quality as the most limiting data issue in deployment. This is especially true for larger firms with more data and substantial investments in legacy IT infrastructure. Companies with revenues of over US $10 billion are the most likely to cite both data quality and data infrastructure as limiters, suggesting that organizations presiding over larger data repositories find the problem substantially harder.

Companies are not rushing into AI. Nearly all organizations (98%) say they are willing to forgo being the first to use AI if that ensures they deliver it safely and securely. Governance, security, and privacy are the biggest brake on the speed of AI deployment, cited by 45% of respondents (and a full 65% of respondents from the largest companies).

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Reimagining cloud strategy for AI-first enterprises

The rise of generative artificial intelligence (AI), natural language processing, and computer vision has sparked lofty predictions: AI will revolutionize business operations, transform the nature of knowledge work, and boost companies’ bottom lines and the larger global economy by trillions of dollars.

Executives and technology leaders are eager to see these promises realized, and many are enjoying impressive results of early AI investments. Balakrishna D.R. (Bali), executive vice president, global services head, AI and industry verticals at Infosys, says that generative AI is already proving game-changing for tasks such as knowledge management, search and summarization, software development, and customer service across sectors such as financial services, retail, health care, and automotive.

Realizing AI’s full potential on a mass scale will require more than just executives’ enthusiasm; becoming a truly AI-first enterprise will require a significant, sustained investment in cloud infrastructure and strategy. In 2024, the cloud has evolved beyond its initial purpose as a storage tool and cost saver to become a crucial driver of innovation, transformation, and disruption. Now, with AI in the mix, enterprises are looking to the cloud to support large language models (LLMs) to maximize R&D performance and prevent cybersecurity attacks, among other high-impact use cases.

A 2023 report by Infosys looks at how prepared companies are to realize the combined potential of cloud and AI. To further assess this state of readiness, MIT Technology Review Insights and Infosys surveyed 500 business leaders across industries such as IT, manufacturing, financial services, and consumer goods about how their organizations are thinking about—and acting upon—an integrated cloud and AI strategy.

This research found that most companies are still experimenting and preparing their infrastructure landscape for AI from a cloud perspective—and many are planning additional investments to accelerate their progress.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Your Gateway to a Vibrant Career in the Expanding Semiconductor Industry



This sponsored article is brought to you by Purdue University.

The CHIPS America Act was a response to a worsening shortfall in engineers equipped to meet the growing demand for advanced electronic devices. That need persists. In its 2023 policy report, Chipping Away: Assessing and Addressing the Labor Market Gap Facing the U.S. Semiconductor Industry, the Semiconductor Industry Association forecast a demand for 69,000 microelectronic and semiconductor engineers between 2023 and 2030—including 28,900 new positions created by industry expansion and 40,100 openings to replace engineers who retire or leave the field.

This number does not include another 34,500 computer scientists (13,200 new jobs, 21,300 replacements), nor does it count jobs in other industries that require advanced or custom-designed semiconductors for controls, automation, communication, product design, and the emerging systems-of-systems technology ecosystem.

Purdue University is taking charge, leading semiconductor technology and workforce development in the U.S. As early as Spring 2022, Purdue University became the first top engineering school to offer an online Master’s Degree in Microelectronics and Semiconductors.

U.S. News & World Report has ranked the university’s graduate engineering program among America’s 10 best every year since 2012 (and among the top 4 since 2022).

“The degree was developed as part of Purdue’s overall semiconductor degrees program,” says Purdue Prof. Vijay Raghunathan, one of the architects of the semiconductor program. “It was what I would describe as the nation’s most ambitious semiconductor workforce development effort.”

Prof. Vijay Raghunathan, one of the architects of the online Master’s Degree in Microelectronics and Semiconductors at Purdue. Photo: Purdue University

Purdue built and announced its bold high-technology online program while the U.S. Congress was still debating the $53 billion “Creating Helpful Incentives to Produce Semiconductors for America Act” (CHIPS America Act), which would be passed in July 2022 and signed into law in August.

Today, the online Master’s in Microelectronics and Semiconductors is well underway. Students train on leading-edge equipment and software and prepare to meet the challenges they will face in a rejuvenated, and critical, U.S. semiconductor industry.

Is the drive for semiconductor education succeeding?

“I think we have conclusively established that the answer is a resounding ‘Yes,’” says Raghunathan. Like understanding big data, or being able to program, “the ability to understand how semiconductors and semiconductor-based systems work, even at a rudimentary level, is something that everybody should know. Virtually any product you design or make is going to have chips inside it. You need to understand how they work, what the significance is, and what the risks are.”

Earning a Master’s in Microelectronics and Semiconductors

Students pursuing the Master’s Degree in Microelectronics and Semiconductors will take courses in circuit design, devices and engineering, systems design, and supply chain management offered by several schools in the university, such as Purdue’s Mitch Daniels School of Business, the Purdue Polytechnic Institute, the Elmore Family School of Electrical and Computer Engineering, and the School of Materials Engineering, among others.

Professionals can also take one-credit-hour courses, which are intended to help students build “breadth at the edges,” a notion that grew out of feedback from employers: Tomorrow’s engineering leaders will need broad knowledge to connect with other specialties in the increasingly interdisciplinary world of artificial intelligence, robotics, and the Internet of Things.

“This was something that we embarked on as an experiment 5 or 6 years ago,” says Raghunathan of the one-credit courses. “I think, in hindsight, that it’s turned out spectacularly.”

A researcher adjusts imaging equipment in a lab in Birck Nanotechnology Center, home to Purdue’s advanced research and development on semiconductors and other technology at the atomic scale. Photo: Rebecca Robiños/Purdue University

The Semiconductor Engineering Education Leader

Purdue, which opened its first classes in 1874, is today an acknowledged leader in engineering education. U.S. News & World Report has ranked the university’s graduate engineering program among America’s 10 best every year since 2012 (and among the top 4 since 2022). And Purdue’s online graduate engineering program has ranked in the country’s top three since the publication started evaluating online grad programs in 2020. (Purdue has offered distance Master’s degrees since the 1980s. Back then, of course, course lectures were videotaped and mailed to students. With the growth of the web, “distance” became “online,” and the program has swelled.)

Thus, Microelectronics and Semiconductors Master’s Degree candidates can study online or on-campus. Both tracks take the same courses from the same instructors and earn the same degree. There are no footnotes, asterisks, or parentheses on the diploma to denote online or in-person study.

“If you look at our program, it will become clear why Purdue is increasingly considered America’s leading semiconductors university” —Prof. Vijay Raghunathan, Purdue University

Students take classes at their own pace, using an integrated suite of proven online-learning applications for attending lectures, submitting homework, taking tests, and communicating with faculty and one another. Texts may be purchased or downloaded from the school library, and there is frequent use of modeling and analytical tools like MATLAB. In addition, Purdue is home to the national design-computing resource nanoHUB.org (with hundreds of modeling, simulation, teaching, and software-development tools) and its offspring, chipshub.org (specializing in tools for chip design and fabrication).

From R&D to Workforce and Economic Development

“If you look at our program, it will become clear why Purdue is increasingly considered America’s leading semiconductors university, because this is such a strategic priority for the entire university, from our President all the way down,” Prof. Raghunathan sums up. “We have a task force that reports directly to the President, a task force focused only on semiconductors and microelectronics. On all aspects—R&D, the innovation pipeline, workforce development, economic development to bring companies to the state. We’re all in as far as chips are concerned.”

Why a Technical Master’s Degree Can Accelerate Your Engineering Career



This sponsored article is brought to you by Purdue University.

Companies large and small are seeking engineers with up-to-date, subject-specific knowledge in disciplines like computer engineering, automation, artificial intelligence, and circuit design. Mid-level engineers need to advance their skillsets to apply and integrate these technologies and be competitive.


As applications for new technologies continue to grow, demand for knowledgeable electrical and computer engineers is also on the rise. According to the Bureau of Labor Statistics, employment of electrical and electronics engineers—as well as computer hardware engineers—is projected to grow 5 percent through 2032. Electrical and computer engineers work in almost every industry. They design systems, work on power transmission and power supplies, run computers and communication systems, innovate chips for embedded systems, and much more.

To take advantage of this job growth and get more return on investment, engineers are advancing their knowledge by going back to school. The 2023 IEEE-USA Salary and Benefits Survey Report shows that engineers with focused master’s degrees (e.g., electrical and computer engineering, electrical engineering, or computer engineering) earned median salaries almost US $27,000 per year higher than their colleagues with bachelor’s degrees alone.


Purdue’s online MSECE program has been ranked in the top 3 of U.S. News and World Report’s Best Online Electrical Engineering Master’s Programs for five years running.


Universities like Purdue work with companies and professionals to provide upskilling opportunities via distance and online education. Purdue has offered a distance Master of Science in Electrical and Computer Engineering (MSECE) since the 1980s. In its early years, the program’s course lectures were videotaped and mailed to students. Now, “distance” has transformed into “online,” and the program has grown with the web, expanding its size and scope. The online MSECE has awarded master’s degrees to more than 190 students since the Fall 2021 semester.




“Purdue has a long-standing reputation of engineering excellence, and Purdue engineers work worldwide in every company, including General Motors, Northrop Grumman, Raytheon, Texas Instruments, Apple, and Sandia National Laboratories, among scores of others,” said Lynn Hegewald, the senior program manager for Purdue’s online MSECE. “Employers everywhere are very aware of Purdue graduates’ capabilities and the quality of the education they bring to the job.”


Today, the online MSECE program continues to select from among the world’s best professionals and gives them an affordable, award-winning education. The program has been ranked in the top 3 of U.S. News and World Report’s Best Online Electrical Engineering Master’s Programs for five years running (2020, 2021, 2022, 2023, and 2024).


The online MSECE develops high-quality research and technical skills, high-level analytical thinking and problem-solving, and new ideas to help innovate—all highly sought-after, according to one of the few studies to systematically inventory what engineering employers want (information corroborated on occupational guidance websites like O*NET and the Bureau of Labor Statistics).

Remote students get the same education as on-campus students and become part of the same alumni network.

“Our online MSECE program offers the same exceptional quality as our on-campus offerings to students around the country and the globe,” says Prof. Milind Kulkarni, Michael and Katherine Birck Head of the Elmore Family School of Electrical and Computer Engineering. “Online students take the same classes, with the same professors, as on-campus students; they work on the same assignments and even collaborate on group projects.


“Our online MSECE program offers the same exceptional quality as our on-campus offerings to students around the country and the globe” —Prof. Milind Kulkarni, Purdue University


“We’re very proud,” he adds, “that we’re able to make a ‘full-strength’ Purdue ECE degree available to so many people, whether they’re working full-time across the country, live abroad, or serve in the military. And the results bear this out: graduates of our program land jobs at top global companies, move on to new roles and responsibilities at their current organizations, or even continue to pursue graduate education at top PhD programs.”




Variety and Quality in Purdue’s MSECE

As they study for their MSECE degrees, online students can select from among a hundred graduate-level courses in their primary areas of interest, including innovative one-credit-hour courses that extend the students’ knowledge. New courses and new areas of interest are always in the pipeline.

Purdue MSECE Area of Interest and Course Options


  • Automatic Control
  • Communications, Networking, Signal and Image Processing
  • Computer Engineering
  • Fields and Optics
  • Microelectronics and Nanotechnology
  • Power and Energy Systems
  • VLSI and Circuit Design
  • Semiconductors
  • Data Mining
  • Quantum Computing
  • IoT
  • Big Data


Heather Woods, a process engineer at Texas Instruments, was one of the first students to enroll and chose the microelectronics and nanotechnology focus area. She offers this advice: “Take advantage of the one-credit-hour classes! They let you finish your degree faster without taking six credit hours every semester.”


Completing an online MSECE from Purdue University also teaches students professional skills that employers value, like motivation, efficient time management, high-level analysis and problem-solving, and the ability to learn quickly and write effectively.

“Having an MSECE shows I have the dedication and knowledge to be able to solve problems in engineering,” said program alumnus Benjamin Francis, now an engineering manager at AkzoNobel. “As I continue in my career, this gives me an advantage over other engineers both in terms of professional advancement opportunity and a technical base to pull information from to face new challenges.”


Finding Tuition Assistance

Working engineers contemplating graduate school should contact their human resources departments and find out what their tuition-assistance options are. Does your company offer tuition assistance? What courses of study do they cover? Do they cap reimbursements by course, semester, etc.? Does your employer pay tuition directly, or will you pay out-of-pocket and apply for reimbursement?

Prospective U.S. students who are veterans or children of veterans should also check with the U.S. Department of Veterans Affairs to see if they qualify for tuition or other assistance.


The MSECE Advantage

In sum, the online Master of Science in Electrical and Computer Engineering from Purdue University does an extraordinary job of giving students the tools they need to succeed in school and then in the workplace: the technical knowledge, the confidence, and the often-overlooked professional skills that will help them excel in their careers.

Quantum Leap: Sydney’s Leading Role in the Next Tech Wave



This is a sponsored article brought to you by BESydney.

Australia plays a crucial role in global scientific endeavours, with a significant contribution recognized and valued worldwide. Despite comprising only 0.3 percent of the world’s population, it has contributed over 4 percent of the world’s published research.

Renowned for collaboration, Australian scientists work across disciplines and with international counterparts to achieve impactful outcomes. Notably excelling in medical sciences, engineering, and biological sciences, Australia also has globally recognized expertise in astronomy, physics and computer science.

As the country’s innovation hub, Sydney is leveraging its robust scientific infrastructure, world-class universities, and vibrant ecosystem to make its mark on the burgeoning quantum industry.

The city’s commitment to quantum research and development is evidenced by its groundbreaking advancements and substantial government support, positioning it at the forefront of the quantum revolution.

Sydney’s blend of academic excellence, industry collaboration and strategic government initiatives is creating a fertile ground for cutting-edge quantum advancements.

Sydney’s quantum ecosystem

Sydney’s quantum industry is bolstered by the Sydney Quantum Academy (SQA), a collaboration between four top-tier universities: University of NSW Sydney (UNSW Sydney), the University of Sydney (USYD), University of Technology Sydney (UTS), and Macquarie University. SQA integrates over 100 experts, fostering a dynamic quantum research and development environment.

With strong government backing, Sydney is poised for significant growth in quantum technology, with a projected A$2.2 billion industry value and 8,700 jobs by 2030. The SQA’s mission is to cultivate a quantum-literate workforce, support industry partnerships, and accelerate the development of quantum technology.

Professor Hugh Durrant-Whyte, NSW Chief Scientist and Engineer, emphasizes Sydney’s unique position: “We’ve invested in quantum for 20 years, and we have some of the best people at the Quantum Academy in Sydney. This investment and talent pool make Sydney an ideal place for pioneering quantum research and attracting global talent.”

Key institutions and innovations

UNSW’s Centre of Excellence for Quantum Computation and Communication Technology is at the heart of Sydney’s quantum advancements. Led by Scientia Professor Michelle Simmons AO, the founder and CEO of Silicon Quantum Computing, this centre is pioneering efforts to develop the world’s first practical quantum computer. The team is at the vanguard of precision atomic electronics, pioneering the fabrication of silicon devices that are pivotal for both conventional and quantum computing; it has created the narrowest conducting wires and the smallest precision transistors.

“We can now not only put atoms in place but can connect complete circuitry with atomic precision.” —Michelle Simmons, Silicon Quantum Computing

Simmons was named 2018 Australian of the Year and won the 2023 Prime Minister’s Prize for Science for her work in creating the new field of atomic electronics. She is an Australian Research Council Laureate Fellow and a Fellow of the Royal Society of London, the American Academy of Arts and Sciences, the American Association for the Advancement of Science, the UK Institute of Physics, the Australian Academy of Technology and Engineering, and the Australian Academy of Science.

In response to her 2023 accolade, Simmons said: “Twenty years ago, the ability to manipulate individual atoms and put them where we want in a device architecture was unimaginable. We can now not only put atoms in place but can connect complete circuitry with atomic precision—a capability that was developed entirely in Australia.”

The Design Futures Lab at UNSW in Sydney, Australia, is a hands-on teaching and research lab that aims to inspire exploration, innovation, and research into fabrication, emerging technologies, and design theories. [Photo: UNSW]

Government and industry support

In April 2024, the Australian Centre for Quantum Growth program, part of the National Quantum Strategy, provided a substantial four-year grant to support the quantum industry’s expansion in Australia. Managed by the University of Sydney, the initiative aims to establish a central hub that fosters industry growth, collaboration, and research coordination.

This centre will serve as a primary resource for the quantum sector, enhancing Australia’s global competitiveness by promoting industry-led solutions and advancing technology adoption both domestically and internationally. Additionally, the centre will emphasise ethical practices and security in the development and application of quantum technologies.

Additionally, Sydney hosts several leading quantum startups, such as Silicon Quantum Computing, Quantum Brilliance, Diraq and Q-CTRL, which focus on improving the performance and stability of quantum systems.

Educational excellence

Sydney’s universities are globally recognized for their contributions to quantum research. They nurture future quantum leaders, and their academic prowess attracts top talent and fosters a culture of innovation and collaboration.


UNSW Sydney, one of Sydney’s universities, is ranked among the world’s top 20 universities and boasts the largest concentration of academics working in AI and quantum technologies in Australia.

Toby Walsh, Laureate Fellow and Scientia Professor of Artificial Intelligence in the Department of Computer Science and Engineering at UNSW Sydney, explains the significance of this academic strength: “Our students and researchers are at the cutting edge of quantum science. The collaborative efforts within Sydney’s academic institutions are creating a powerhouse of innovation that is driving the global quantum agenda.”

Sydney’s strategic investments and collaborative efforts in quantum technology have propelled the city to the forefront of this transformative field. With its unique and vibrant ecosystem, a blend of world-leading institutions, globally respected talent and strong government and industry support, Sydney is well-positioned to lead the global quantum revolution for the benefit of all. For more information on Sydney’s science and engineering industries visit besydney.com.au.

Building supply chain resilience with AI

If the last five years have taught businesses with complex supply chains anything, it is that resilience is crucial. In the first three months of the covid-19 pandemic, for example, supply-chain leader Amazon grew its business 44%. Its investments in supply chain resilience allowed it to deliver when its competitors could not, says Sanjeev Maddila, worldwide head of supply chain solutions at Amazon Web Services (AWS), increasing its market share and driving profits up 220%. A resilient supply chain ensures that a company can meet its customers’ needs despite inevitable disruption.

Today, businesses of all sizes must deliver to their customers against a backdrop of supply chain disruptions, with technological changes, shifting labor pools, geopolitics, and climate change adding new complexity and risk at a global scale. To succeed, they need to build resilient supply chains: fully digital operations that prioritize customers and their needs while establishing a fast, reliable, and sustainable delivery network.

The Canadian fertilizer company Nutrien, for example, operates two dozen manufacturing and processing facilities spread across the globe and nearly 2,000 retail stores in the Americas and Australia. To collect underutilized data from its industrial operations and gain greater visibility into its supply chain, the company relies on a combination of cloud technology and artificial intelligence/machine learning (AI/ML) capabilities.

“A digital supply chain connects us from grower to manufacturer, providing visibility throughout the value chain,” says Adam Lorenz, senior director for strategic fleet and indirect procurement at Nutrien. This visibility is critical when it comes to navigating the company’s supply chain challenges, which include seasonal demands, weather dependencies, manufacturing capabilities, and product availability. The company requires real-time visibility into its fleets, for example, to identify the location of assets, see where products are moving, and determine inventory requirements.

Currently, Nutrien can locate a fertilizer or nutrient tank in a grower’s field and determine what Nutrien products are in it. By achieving that “real-time visibility” into a tank’s location and a customer’s immediate needs, Lorenz says the company “can forecast where assets are from a fill-level perspective and plan accordingly.” In turn, Nutrien can respond immediately to emerging customer needs, increasing company revenue while enhancing customer satisfaction, improving inventory management, and optimizing supply chain operations.
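
As a rough illustration of the kind of fill-level forecasting Lorenz describes, the sketch below fits a simple linear drain rate to recent tank readings and projects a refill date. The readings, reorder threshold, and units are hypothetical placeholders, not Nutrien’s actual telemetry or tooling.

```python
from datetime import datetime, timedelta

# Hypothetical telemetry: (timestamp, fill fraction) readings for one tank.
readings = [
    (datetime(2024, 5, 1) + timedelta(days=d), level)
    for d, level in enumerate([0.92, 0.88, 0.85, 0.81, 0.78])
]

def days_until_refill(readings, reorder_level=0.25):
    """Fit a simple linear drain rate to the readings and project
    when the tank will hit the reorder threshold."""
    (t0, first), (t1, last) = readings[0], readings[-1]
    days = (t1 - t0).total_seconds() / 86400
    drain_per_day = (first - last) / days
    if drain_per_day <= 0:
        return None  # tank is not draining; no refill needed
    return (last - reorder_level) / drain_per_day

eta = days_until_refill(readings)
print(f"Schedule refill in ~{eta:.0f} days")
```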

“For us, it’s about starting with data creation and then adding a layer of AI on top to really drive recommendations,” says Lorenz. In addition to improving product visibility and asset utilization, Lorenz says that Nutrien plans to add AI capabilities to its collaboration platforms that will make it easier for less-tech-savvy customers to take advantage of self-service capabilities and automation that accelerates processes and improves compliance with complex policies.

To meet and exceed customer expectations with differentiated service, speed, and reliability, all companies need to similarly modernize their supply chain operations. The key to doing so—and to increasing organizational resilience and sustainability—will be applying AI/ML to their extensive operational data in the cloud.

Resilience as a business differentiator

Like Nutrien, a wide variety of organizations from across industries are discovering the competitive advantages of modernizing their supply chains. A pharmaceutical company that aggregates its supply chain data for greater end-to-end visibility, for example, can provide better product tracking for critically ill customers. A retail startup undergoing meteoric growth can host its workloads in the cloud to support sudden upticks in demand while minimizing operating costs. And a transportation company can achieve inbound supply chain savings by evaluating the total distance its fleet travels to reduce mileage costs and CO2 emissions.
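
To make the last example concrete, here is a minimal sketch of a fleet mileage and emissions roll-up. The routes and the emissions factor are illustrative assumptions; real factors vary by vehicle class, load, and fuel.

```python
# Hypothetical daily routes (km) for a small delivery fleet.
routes_km = {
    "truck_a": [120.5, 98.2],
    "truck_b": [210.0],
    "truck_c": [75.4, 66.1, 80.0],
}

# Illustrative emissions factor for a diesel truck (kg CO2 per km).
KG_CO2_PER_KM = 0.9

total_km = sum(sum(legs) for legs in routes_km.values())
total_co2 = total_km * KG_CO2_PER_KM
print(f"Fleet distance: {total_km:.1f} km, estimated CO2: {total_co2:.0f} kg")
```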

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Transforming the energy industry through disruptive innovation

In the rhythm of our fast-paced lives, most of us don’t stop to think about where electricity comes from or how it powers homes, industries, and the technologies that connect people around the world. As populations and economies grow, energy demands are set to increase by 50% by 2050, challenging century-old energy systems to adapt with innovative and agile solutions. This comes at a time when climate change is making its presence felt more than ever: 2023 marked the warmest year since records began in 1850, crossing the 1.5-degree global warming threshold.

Nadège Petit of Schneider Electric confronts this challenge head-on, saying, “We have no choice but to change the way we produce, distribute, and consume energy, and do it sustainably to tackle both the energy and climate crises.” She explains further that digital technologies are key to navigating this path, and Schneider Electric’s AI-enabled IoT solutions can empower customers to take control of their energy use, enhancing efficiency and resiliency.

Petit acknowledges the complexity of crafting and implementing robust sustainability strategies. She highlights the importance of taking an incremental, stepwise approach and adopting open standards to drive near-term impact while laying the foundation for long-term decarbonization goals.

Because the energy landscape is evolving rapidly, it’s critical not just to keep pace but to anticipate and shape the future. Much like actively managing health through food and fitness regimes, energy habits need to be monitored as well. This can transform passive consumers into energy prosumers: those who produce, consume, and manage energy. Petit’s vision is one where “buildings and homes generate their own energy from renewable sources, use what’s needed, and feed the excess back to the grid.”

To catalyze this transformation, Petit underscores the power of collaboration and innovation. For example, Schneider Electric’s SE Ventures invests in startups to provide new perspectives and capabilities to accelerate sustainable energy solutions. 

“It’s all about striking a balance to ensure that our relationships with startups are mutually beneficial, knowing when to provide guidance and resources when they need it, but also when to step back and allow them to thrive independently,” says Petit.

This episode of Business Lab is produced in partnership with Schneider Electric. 

Full transcript 

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace. 

Our topic today is disruptive innovation in the energy industry and beyond. We use energy every day. It powers our homes, buildings, economies, and lifestyles, but where it came from or how our use affects the global energy ecosystem is changing, and our energy ecosystem needs to change with it.

 My guest is Nadège Petit, the chief innovation officer at Schneider Electric. 

This podcast is produced in partnership with Schneider Electric. 

Welcome, Nadège. 

Nadège Petit: Hi, everyone. Thank you for having me today. 

Laurel: Well, we’re glad you’re here. 

Let’s start off with a simple question to build that context around our conversation. What is Schneider Electric’s mission? And as the chief innovation officer leading its Innovation at the Edge team, what are some examples of what the team is working on right now? 

Nadège: Let me set the scene a little bit here. In recent years, our world has been shaped by a series of significant disruptions. The pandemic has driven a sharp increase in the demand for digital tools and technologies, with a projected 6x growth in the number of IoT devices between 2020 and 2030, and a 140x growth in IP traffic between 2020 and 2040.

Simultaneously, there has been a parallel acceleration in energy demands. Electrical consumption has been increasing by 5,000 terawatt-hours every 10 years over the past two decades. This is set to double in the next 10 years and then quadruple by 2040. All of this is amplified by the most severe energy crisis we have faced since the 1970s. Over 80% of carbon emissions come from energy, so electrifying the world and decarbonizing the energy sector is a must. We cannot overlook the climate crisis while meeting these energy demands. In 2023, the global average temperature was the warmest on record since 1850, surpassing the 1.5-degree global warming limit. So we have no choice but to change the way we produce, distribute, and consume energy, and do it sustainably to tackle both the energy and climate crises. This gives us a rare opportunity to reimagine and create the clean energy future we want.

Schneider Electric, as an energy management and digital automation company, aims to be the digital partner for sustainability and efficiency for our customers. With end-to-end experience in the energy sector, we are uniquely positioned to help customers digitize, electrify, and deploy sustainable technologies to help them progress toward net-zero.

As for my role, we know that innovation is pivotal to drive the energy transition. The Innovation at the Edge team leads the way in discovering, developing, and delivering disruptive technologies that will define a more digital, electric, and sustainable energy landscape. We function today as an innovation engine, bridging internal and external innovation, to introduce new solutions, services and businesses to the market. Ultimately, we are crafting the future businesses for Schneider Electric in this sector. And to do this, we nourish a culture that recognizes and celebrates innovation. We welcome new ideas, consider new perspectives inside and outside the organization, and seek out unusual combinations that can kindle revolutionary ideas. We like to think of ourselves as explorers and forces of change, looking for and solving new customer problems. So curiosity and daring to disrupt are in our DNA. And this is the true spirit of Innovation at the Edge at Schneider Electric. 

Laurel: And it’s clear that urgency certainly comes out, especially for enterprises, because they’re trying to build strong sustainability strategies not just to reach those environmental, social, and governance (ESG) goals and targets, but also to improve resiliency and efficiency. What’s the role of digital technologies, when we think about this all together, in enabling a more sustainable future?

Nadège: We see a sustainable future, and our goal is to enable the shift to an all-electric and all-digital world. That kind of transition isn’t possible without digital technology. We see digital as a key enabler of sustainability and decarbonization. The technology is already available now; it’s a matter of accelerating its adoption. And all of us have a role to play here.

At Schneider Electric, we have built a suite of solutions that enables customers to accelerate their sustainability journey. Our flagship suite of IoT-enabled solutions empowers customers to monitor energy, carbon, and resource usage, enabling them to implement strategies for efficiency, optimization, and resiliency. We have seen remarkable success stories of clients leveraging our digital EcoStruxure solution in buildings, utilities, data centers, hospitality, healthcare, and more. To take one example, PG&E, a leading California utility that everybody knows, is using our EcoStruxure distributed energy resources management system, which we call DERMS, to manage grid reliability more effectively, which is crucial in the face of extreme weather events impacting the grid and consumers.

Schneider has also built an extensive ecosystem of partners because we do need to do it at scale together to accelerate digital transformation for customers. We also invest in cutting-edge technologies that make need-based collaboration and co-innovation possible. It’s all about working together towards one common goal. Ultimately the companies that embrace digital transformation will be the ones that will thrive on disruption. 

Laurel: It’s clear that building a strong sustainability strategy and then following through on the implementation does take time, but addressing climate change requires immediate action. How does your team at Schneider Electric as a whole work to balance those long-term commitments and act with urgency in the short term? It sounds like that internal and external innovation opportunity really could play a role here. 

Nadège: Absolutely. You’re absolutely right. We already have many of the technologies that will take us to net-zero. For example, 70% of CO2 emissions can be removed with existing technologies. By deploying electrification and digital solutions, we can get to our net-zero goals much faster. We know it’s a gradual process and, as we discussed previously, we need to accelerate adoption. By taking an incremental, stepwise approach, we can drive near-term impact while laying the foundation for long-term decarbonization goals.

Building on the example of PG&E, which I referenced earlier: through our collaboration, we are progressively building the backbone of a sustainable, digitized, and reliable energy future in California with the deployment of EcoStruxure DERMS. As grid reliability and flexibility become more important, DERMS enables us to keep pace with 21st-century grid demands as they evolve.

Another critical component of moving fast is embracing open systems and platforms, creating an interoperable ecosystem. By adopting open standards, you empower a wide range of experts to collaborate, including startups, large organizations, senior decision-makers, and those on the ground. This future-proof investment ensures flexible and scalable solutions that avoid expensive upgrades and obsolescence down the road. That is why at Innovation at the Edge we’re creating win-win partnerships to push market adoption of the innovative technology available today while laying the foundation of an even more innovative tomorrow. Innovation at the Edge provides the space to nurture those ideas, collaborate, iterate, learn, and grow at pace.

Laurel: What’s your strategy for investing in, and then adopting those disruptive technologies and business models, especially when you’re trying to build that kind of innovation for tomorrow? 

Nadège: I strongly believe innovation is a key driver of the energy transition. It’s very hard to create the right conditions for consistent innovation, balancing the short term and the long term. I want to quote again the famous book from Clayton Christensen, The Innovator’s Dilemma, about how big organizations can get so good at what they are already doing that they struggle to adapt as the market changes. And we are in this dilemma. So we do need to stay ahead. Leaders need to grasp disruptive technology, put customers first, foster innovation, and tackle emerging challenges head-on. The phrase “that’s no longer how we do it” really resonates with me as I look at the role of innovation in the energy space.

At Schneider, innovation is more than just a buzzword. It’s our strategy for navigating the energy transition. We are investing in truly new and disruptive ideas, technologies, and business models, taking on the risk and the challenge. We constantly complement our current offering, including the new prosumer business that we’re building, which is pivotal to accelerating the energy transition. We foster open innovation through investment in and incubation of cutting-edge technology in energy management, electric mobility, industrial automation, cybersecurity, artificial intelligence, sustainability, and other areas that will drive this transition. I can also point to joint ventures that we have created with partners, like GreenStruxure and AlphaStruxure. These offer energy-as-a-service solutions, a new business model enabling organizations to leverage existing technology to achieve decarbonization at scale. As an example, GreenStruxure is helping Bimbo Bakeries move closer to net-zero with microgrid systems at six of its locations. These will provide 20% of Bimbo Bakeries USA’s energy usage and save an estimated 1,700 tons of CO2 emissions per year.

Laurel: Yeah, that’s certainly remarkable. Following up on that, how does Schneider Electric define prosumer and how does that audience actually fit into Schneider Electric’s strategy when you’re trying to develop these new models? 

Nadège: Prosumer is my favorite word. Let’s define it again. Everybody’s speaking of prosumers, but what is a prosumer? Prosumer refers to consumers who are actively involved in energy management, producing and consuming their own energy using technologies like solar panels, EV chargers, batteries, and storage, all digitally enabled. Everybody now, including industrial customers, wants to understand their energy. Becoming a prosumer comes with perks like lower energy bills. Fantastic, right? Increased independence, clean energy use, and potential compensation from utility providers. It’s beneficial to all of us; it’s beneficial to our planet, it’s beneficial to the decarbonization of the world. Imagine a future where buildings and homes generate their own energy from renewable sources, use what’s needed, and feed the excess back to the grid. This is a fantastic opportunity, and the interest in this is massive.
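
A minimal sketch of the energy balance behind that vision: given hourly production and consumption figures (the numbers here are hypothetical), a prosumer’s grid export and import fall out of a simple comparison.

```python
# Hypothetical hourly solar production and household consumption (kWh).
production  = [0.0, 0.0, 0.5, 2.1, 3.4, 3.8, 3.1, 1.2, 0.0]
consumption = [0.8, 0.7, 0.9, 1.1, 1.0, 1.3, 2.0, 2.4, 1.5]

# Hours where production exceeds consumption feed the grid; the rest draw from it.
exported = sum(max(p - c, 0) for p, c in zip(production, consumption))
imported = sum(max(c - p, 0) for p, c in zip(production, consumption))
print(f"Fed back to grid: {exported:.1f} kWh, drawn from grid: {imported:.1f} kWh")
```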

To give you some figures: in 2019 we saw 100 gigawatts of new solar PV capacity deployed globally, and by last year this number had nearly quadrupled. So the transformation is happening now. Electric vehicle sales have been soaring too, with a projected 14 million sales in 2023, six times the 2019 number. These technologies are already making a dent in emissions and the energy crisis.

However, the journey to become a prosumer is complex. It’s all about scale and adoption, and it involves challenges with asset integration, grid modernization, and regulatory compliance. We are all part of this ecosystem, and it takes a lot of leadership to make it happen. So at Innovation at the Edge, we’re creating an ecosystem of solutions to streamline the prosumer journey, from education to purchasing, installation, management, and maintenance of these new distributed resources. We are bringing together internal innovations that we already have in-house at Schneider Electric, like microgrids, EV charging solutions, battery storage, and more, with external innovation from portfolio companies such as Qmerit, EnergySage, EV Connect, Uplight, and AutoGrid, and we deliver end-to-end solutions from grid to prosumer.

I want to insist one more time, it’s very important to accelerate and to be part of this accelerated adoption. These efforts are not just about strengthening our business, they’re about simplifying the energy ecosystem and moving the industry toward greater sustainability. It’s a collaborative journey that’s shaping the future of energy, and I’m very excited about this. 

Laurel: Focusing on that kind of urgency, innovation in large companies can be hampered by bureaucracy and move slowly. What are some best practices for innovating without all of those delays?

Nadège: Schneider Electric, we are not strangers to innovation, specifically in the energy management and industrial automation space. But to really push the envelope, we look beyond our walls for fresh ideas and expertise. And this is where SE Ventures comes in. It’s our one-billion-euro venture capital fund, from which we make bold bets and bring disruptive ideas to life by supporting and investing in startups that complement our current offering and explore future business. So based in Silicon Valley, but with a global reach, SE Ventures leverages our market knowledge and customer proximity to drive near-term value and commercial relationships with our businesses, customers, and partners. 

We also focus on partnership and incubation. So through partnerships with startups, we accelerate time to market. We accelerate the R&D roadmap and explore new products, new markets with startups. When it comes to incubation, we seek out game-changing ideas and entrepreneurs. We are providing mentorship, resources, and market insight at every stage of their journey. As an example, we also invested in funds like E14, the fund that started out at MIT Media Lab, to gain early insight into disruptive trends and technology. It’s very important to be early-stage here. 

So SE Ventures has successfully developed multiple unicorns in our portfolio, and we’re working with several other high-growth companies, targeted to become future unicorns in key strategic areas. That is totally consistent with Schneider’s mission.

It’s all about striking a balance to ensure that our relationships with startups are mutually beneficial, knowing when to provide guidance and resources when they need it, but also when to step back and allow them to thrive independently.

Laurel: With that future lens on, what kind of trends or developments in the energy industry are you seeing, and how are you preparing for them? Are you getting a lot of that kind of excitement from those startups and venture fund ideas? 

Nadège: Yeah, absolutely. There are multiple trends. You need to listen to startups, to innovators, to people coming up with bold ideas. I want to highlight a couple of those. The energy industry is set to see major shifts. We know it, and we want to be part of it. We discussed prosumers; that is very important. A lot of people now understand their bodies, doing exercises and monitoring them; tomorrow, people will all monitor their energy. Those are prosumers. We believe that prosumers, both individuals and businesses, are central to the energy transition. And this is a key focal point for us.

Another trend we discussed is digital, and in particular AI. AI has the potential to be transformative as we build the new energy landscape. One example is AI-powered virtual power plants, or VPPs, which can optimize a large portfolio of distributed energy resources to ensure greater grid resiliency. Increasingly, AI can be at the heart of the modern electrical grid. So at Schneider Electric, we are watching those trends very carefully. We are listening to the external world and to our customers, and we are positioning our solutions and global hubs to best serve their needs.
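
As a toy illustration of the dispatch problem at the heart of a VPP, the sketch below greedily draws on a hypothetical fleet of distributed batteries to cover a grid shortfall. The names and numbers are invented; a production VPP would also weigh price signals, forecasts, and battery degradation.

```python
# Hypothetical fleet of distributed batteries: kWh each can discharge right now.
batteries = {"home_a": 4.0, "depot_b": 12.0, "school_c": 6.5}

def dispatch(batteries, shortfall_kwh):
    """Greedy dispatch: draw from the largest available resources first."""
    plan = {}
    for name, avail in sorted(batteries.items(), key=lambda kv: -kv[1]):
        if shortfall_kwh <= 0:
            break
        draw = min(avail, shortfall_kwh)
        plan[name] = draw
        shortfall_kwh -= draw
    return plan, shortfall_kwh

plan, unmet = dispatch(batteries, shortfall_kwh=15.0)
print(plan, f"unmet: {unmet} kWh")
```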

Laurel: Lastly, as a woman in a leadership position, could you tell us how you’ve navigated your career so far, and how others in the industry can create a more diverse and inclusive environment within their companies and teams? 

Nadège: An inclusive environment starts with us as leaders. Establishing a culture where we value differences and different opinions, believe in equal opportunity for everyone, and foster a sense of belonging is very important. It’s also important for organizations to create commitments around diversity, equity, and inclusion and communicate them publicly, so that they drive accountability, and to report on progress as we make it happen.

I was truly fortunate to have started and grown my career at a company like Schneider Electric, where I was surrounded by people who empowered me to be my best self. This is something that should drive all women to be the best versions of themselves. It wasn’t always easy. I have learned how important it is to have a voice and to be bold, to speak up for what you are passionate about, and to use that passion to drive impact. These are values I also work to instill in my own teenage daughters, and I’m thrilled to see them finding their own passions within STEM. The next generation is the driving force in shaping a more sustainable world, and it’s crucial that we focus on leaving the planet a better and more equal place where they can thrive.

Laurel: Words to the wise. Thank you so much Nadege for joining us today on the Business Lab. 

Nadège: Thank you. 

Laurel: That was Nadège Petit, the chief innovation officer at Schneider Electric, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review. 


That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com. 

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening. 

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Unlocking secure, private AI with confidential computing

All of a sudden, it seems that AI is everywhere, from executive assistant chatbots to AI code assistants.

But despite the proliferation of AI in the zeitgeist, many organizations are proceeding with caution, owing to the security quagmires AI presents. For the emerging technology to reach its full potential, data must be secured through every stage of the AI lifecycle, including model training, fine-tuning, and inferencing.

This is where confidential computing comes into play. Vikas Bhatia, head of product for Azure Confidential Computing at Microsoft, explains the significance of this architectural innovation: “AI is being used to provide solutions for a lot of highly sensitive data, whether that’s personal data, company data, or multiparty data,” he says. “Confidential computing is an emerging technology that protects that data when it is in memory and in use. We see a future where model creators who need to protect their IP will leverage confidential computing to safeguard their models and to protect their customer data.”

Understanding confidential computing

“The tech industry has done a great job in ensuring that data stays protected at rest and in transit using encryption,” Bhatia says. “Bad actors can steal a laptop and remove its hard drive but won’t be able to get anything out of it if the data is encrypted by security features like BitLocker. Similarly, nobody can run away with data in the cloud. And data in transit is secure thanks to HTTPS and TLS, which have long been industry standards.”

But data in use, when data is in memory and being operated upon, has typically been harder to secure. Confidential computing addresses this critical gap—what Bhatia calls the “missing third leg of the three-legged data protection stool”—via a hardware-based root of trust.

Essentially, confidential computing ensures the only thing customers need to trust is the data running inside a trusted execution environment (TEE) and the underlying hardware. “The concept of a TEE is basically an enclave, or I like to use the word ‘box.’ Everything inside that box is trusted, anything outside it is not,” explains Bhatia.

Until recently, confidential computing only worked on central processing units (CPUs). However, NVIDIA has recently brought confidential computing capabilities to the H100 Tensor Core GPU and Microsoft has made this technology available in Azure. This has the potential to protect the entire confidential AI lifecycle—including model weights, training data, and inference workloads.

“Historically, devices such as GPUs were controlled by the host operating system, which, in turn, was controlled by the cloud service provider,” notes Krishnaprasad Hande, Technical Program Manager at Microsoft. “So, in order to meet confidential computing requirements, we needed technological improvements to reduce trust in the host operating system, i.e., its ability to observe or tamper with application workloads when the GPU is assigned to a confidential virtual machine, while retaining sufficient control to monitor and manage the device. NVIDIA and Microsoft have worked together to achieve this.”

Attestation mechanisms are another key component of confidential computing. Attestation allows users to verify the integrity and authenticity of the TEE, and the user code within it, ensuring the environment hasn’t been tampered with. “Customers can validate that trust by running an attestation report themselves against the CPU and the GPU to validate the state of their environment,” says Bhatia.

Additionally, secure key management systems play a critical role in confidential computing ecosystems. “We’ve extended our Azure Key Vault with Managed HSM service which runs inside a TEE,” says Bhatia. “The keys get securely released inside that TEE such that the data can be decrypted.”
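
The attest-then-release flow Bhatia describes can be sketched roughly as follows. Every name in this snippet is hypothetical; it is not the actual Azure or NVIDIA API, just the shape of the logic: verify the TEE’s measurements first, and only then let the key out of the key-management service.

```python
# Hypothetical sketch of the attest-then-release flow described above.
# None of these names are real Azure or NVIDIA APIs; a production flow
# would use the platform's attestation service and Managed HSM.

EXPECTED_MEASUREMENTS = {"cpu": "sha256:abc...", "gpu": "sha256:def..."}

def verify_attestation(report: dict) -> bool:
    """Check that the TEE's reported measurements match known-good values.
    A real verifier would also validate the hardware vendor's signature chain."""
    return all(report.get(k) == v for k, v in EXPECTED_MEASUREMENTS.items())

def release_key(report: dict) -> bytes | None:
    """Release the data-decryption key only into an attested TEE."""
    if not verify_attestation(report):
        return None  # environment not trusted; the key stays in the HSM
    return b"\x00" * 32  # placeholder for a securely released key

report = {"cpu": "sha256:abc...", "gpu": "sha256:def..."}
key = release_key(report)
print("key released" if key else "attestation failed")
```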

Confidential computing use cases and benefits

GPU-accelerated confidential computing has far-reaching implications for AI in enterprise contexts. It also addresses privacy issues that apply to any analysis of sensitive data in the public cloud. This is of particular concern to organizations trying to gain insights from multiparty data while maintaining utmost privacy.

Another key advantage of Microsoft’s confidential computing offering is that it requires no code changes on the part of the customer, facilitating seamless adoption. “The confidential computing environment we’re building does not require customers to change a single line of code,” notes Bhatia. “They can redeploy from a non-confidential environment to a confidential environment. It’s as simple as choosing a particular VM size that supports confidential computing capabilities.”

Some industries and use cases that stand to benefit from confidential computing advancements include:

  • Governments and sovereign entities dealing with sensitive data and intellectual property.
  • Healthcare organizations using AI for drug discovery and doctor-patient confidentiality.
  • Banks and financial firms using AI to detect fraud and money laundering through shared analysis without revealing sensitive customer information.
  • Manufacturers optimizing supply chains by securely sharing data with partners.

Further, Bhatia says confidential computing helps facilitate data “clean rooms” for secure analysis in contexts like advertising. “We see a lot of sensitivity around use cases such as advertising and the way customers’ data is being handled and shared with third parties,” he says. “So, in these multiparty computation scenarios, or ‘data clean rooms,’ multiple parties can merge in their data sets, and no single party gets access to the combined data set. Only the code that is authorized will get access.”
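
A toy illustration of the clean-room property Bhatia describes: each party contributes records, no raw rows leave the room, and only an authorized aggregate query returns a result. The data and query here are invented, and in a real deployment the merge would run inside a TEE rather than in ordinary memory.

```python
# Toy "data clean room": parties contribute records; only an authorized
# aggregate query can leave the room, never the combined raw data.

party_a = [{"user": "u1", "spend": 120}, {"user": "u2", "spend": 80}]
party_b = [{"user": "u1", "clicks": 14}, {"user": "u3", "clicks": 9}]

AUTHORIZED_QUERIES = {"avg_spend_of_clickers"}

def clean_room_query(name: str) -> float:
    if name not in AUTHORIZED_QUERIES:
        raise PermissionError("query not authorized for this clean room")
    clickers = {r["user"] for r in party_b}
    spends = [r["spend"] for r in party_a if r["user"] in clickers]
    return sum(spends) / len(spends) if spends else 0.0

print(clean_room_query("avg_spend_of_clickers"))  # aggregate only, no raw rows
```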

The current state—and expected future—of confidential computing

Although large language models (LLMs) have captured attention in recent months, enterprises have found early success with a scaled-down approach: small language models (SLMs), which are more efficient and less resource-intensive for many use cases. “We can see some targeted SLM models that can run in early confidential GPUs,” notes Bhatia.

This is just the start. Microsoft envisions a future that will support larger models and expanded AI scenarios—a progression that could see AI in the enterprise become less of a boardroom buzzword and more of an everyday reality driving business outcomes. “We’re starting with SLMs and adding in capabilities that allow larger models to run using multiple GPUs and multi-node communication. Over time, [the goal is] for the largest models that the world might come up with to run in a confidential environment,” says Bhatia.

Bringing this to fruition will be a collaborative effort. Partnerships among major players like Microsoft and NVIDIA have already propelled significant advancements, and more are on the horizon. Organizations like the Confidential Computing Consortium will also be instrumental in advancing the underpinning technologies needed to make widespread and secure use of enterprise AI a reality.

“We’re seeing a lot of the critical pieces fall into place right now,” says Bhatia. “We don’t question today why something is HTTPS. That’s the world we’re moving toward [with confidential computing], but it’s not going to happen overnight. It’s certainly a journey, and one that NVIDIA and Microsoft are committed to.”

Microsoft Azure customers can start on this journey today with Azure confidential VMs with NVIDIA H100 GPUs. Learn more here.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Housetraining robot dogs: How generative AI might change consumer IoT

As technology goes, the internet of things (IoT) is old: internet-connected devices outnumbered people on Earth around 2008 or 2009, according to a contemporary Cisco report. Since then, IoT has grown rapidly; by the early 2020s, researchers’ estimates of the number of devices ranged anywhere from the low tens of billions to over 50 billion.

Currently, though, IoT is seeing unusually intense new interest for a long-established technology, even one still experiencing market growth. A sure sign of this buzz is the appearance of acronyms, such as AIoT and GenAIoT, or “artificial intelligence of things” and “generative artificial intelligence of things.”

What is going on? Why now? Examining potential changes to consumer IoT could provide some answers. Consumer IoT spans a vast range of home and personal uses, from smart-home controls through smartwatches and other wearables to VR gaming, to name just a handful. The underlying technological changes sparking interest in this specific area mirror those in IoT as a whole.

Rapid advances converging at the edge

IoT is much more than a huge collection of “things,” such as automated sensing devices and attached actuators that take limited actions. These devices, of course, play a key role. A recent IDC report estimated that all edge devices—many of them IoT devices—account for 20% of the world’s current data generation.

IoT, however, is much more. It is a huge technological ecosystem that encompasses and empowers these devices. This ecosystem is multi-layered, although no single agreed taxonomy exists.

Most analyses include among the strata:

  • the physical devices themselves (sensors, actuators, and other machines with which these immediately interact);
  • the data generated by these devices;
  • the networking and communication technology used to gather the generated data, send it to other devices or central data stores, and receive information back;
  • the software applications that draw on such information and other possible inputs, often to suggest or make decisions.

The inherent value of IoT is not the data itself but the capacity to use it to understand what is happening in and around the devices and, in turn, to apply those insights, where necessary, to recommend that humans take action or to direct connected devices to act.
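
That sense-decide-act loop can be sketched in a few lines. The device names, readings, and threshold below are hypothetical stand-ins for the layers described above.

```python
# Minimal sense-decide-act loop over the layers described above:
# device data -> application logic -> action back to a connected device.

def read_temperature(sensor_id: str) -> float:
    """Stand-in for a reading gathered over the network layer."""
    return {"living_room": 18.2, "bedroom": 21.5}[sensor_id]

def set_heater(room: str, on: bool) -> None:
    """Stand-in for an actuator command sent back down to the device."""
    print(f"heater[{room}] -> {'on' if on else 'off'}")

TARGET_C = 20.0  # hypothetical comfort setpoint

for room in ("living_room", "bedroom"):
    set_heater(room, on=read_temperature(room) < TARGET_C)
```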

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.
