
Readying business for the age of AI

Rapid advancements in AI technology offer unprecedented opportunities to enhance business operations, customer and employee engagement, and decision-making. Executives are eager to see the potential of AI realized. Among 100 C-suite respondents polled in WNS Analytics’ “The Future of Enterprise Data & AI” report, 76% say they are already implementing or planning to implement generative AI solutions. Among those same leaders, however, 67% report struggling with data migration, and others cite grappling with data quality, talent shortages, and data democratization issues.

MIT Technology Review Insights recently had a conversation with Alex Sidgreaves, chief data officer at Zurich Insurance; Bogdan Szostek, chief data officer at Animal Friends Insurance; Shan Lodh, director of data platforms at Shawbrook Bank; and Gautam Singh, head of data, analytics, and AI at WNS Analytics, to discuss how enterprises can navigate the burgeoning era of AI.

AI across industries

There is no shortage of AI use cases across sectors. Retailers are tailoring shopping experiences to individual preferences by leveraging customer behavior data and advanced machine learning models. Traditional AI models can deliver personalized offerings; generative AI elevates those offerings by adding tailored communication that reflects the customer’s persona, behavior, and past interactions. In insurance, generative AI can surface subrogation recovery opportunities that a manual handler might overlook, enhancing efficiency and maximizing recovery potential. Banking and financial services institutions are using AI-driven credit risk management practices to bolster customer due diligence and strengthen anti-money laundering efforts. In health care, sophisticated image recognition is improving diagnostic accuracy in radiology, allowing earlier and more precise detection of disease, while predictive analytics enable personalized treatment plans.

The core of successful AI implementation lies in understanding its business value, building a robust data foundation, aligning with the strategic goals of the organization, and infusing skilled expertise across every level of an enterprise.

  • “I think we should also be asking ourselves, if we do succeed, what are we going to stop doing? Because when we empower colleagues through AI, we are giving them new capabilities [and] faster, quicker, leaner ways of doing things. So we need to be true to even thinking about the org design. Oftentimes, an AI program doesn’t work, not because the technology doesn’t work, but the downstream business processes or the organizational structures are still kept as before.” Shan Lodh, director of data platforms, Shawbrook Bank

Whether automating routine tasks, enhancing customer experiences, or providing deeper insights through data analysis, it’s essential to define what AI can do for an enterprise in specific terms. AI’s popularity and broad promises are not good enough reasons to jump headfirst into enterprise-wide adoption. 

“AI projects should come from a value-led position rather than being led by technology,” says Sidgreaves. “The key is to always ensure you know what value you’re bringing to the business or to the customer with the AI. And actually always ask yourself the question, do we even need AI to solve that problem?”

Having a good technology partner is crucial to ensure that value is realized. Gautam Singh, head of data, analytics, and AI at WNS, says, “At WNS Analytics, we keep clients’ organizational goals at the center. We have focused and strengthened around core productized services that go deep in generating value for our clients.” Singh explains their approach: “We do this by leveraging our unique AI and human interaction approach to develop custom services and deliver differentiated outcomes.”

The foundation of any advanced technology adoption is data, and AI is no exception. Singh explains, “Advanced technologies like AI and generative AI may not always be the right choice, and hence we work with our clients to understand the need, to develop the right solution for each situation.” With increasingly large and complex data volumes, effectively managing and modernizing data infrastructure is essential to provide the basis for AI tools.

This means breaking down silos. Maximizing AI’s impact requires regular communication and collaboration across departments, from marketing teams working with data scientists to understand customer behavior patterns to IT teams ensuring their infrastructure supports AI initiatives.

  • “I would emphasize the growing customers’ expectations in terms of what they expect our businesses to offer them and to provide us a quality and speed of service. At Animal Friends, we see the generative AI potential to be the biggest with sophisticated chatbots and voice bots that can serve our customers 24/7 and deliver the right level of service, and being cost effective for our customers.” Bogdan Szostek, chief data officer, Animal Friends

Investing in domain experts with insight into the regulations, operations, and industry practices is just as necessary to the successful deployment of AI systems as the right data foundations and strategy. Continuous training and upskilling are essential to keep pace with evolving AI technologies.

Ensuring AI trust and transparency

Creating trust in generative AI implementation requires the same mechanisms employed for all emerging technologies: accountability, security, and ethical standards. Being transparent about how AI systems are used, the data they rely on, and the decision-making processes they employ can go a long way in forging trust among stakeholders. In fact, “The Future of Enterprise Data & AI” report finds that 55% of organizations identify “building trust in AI systems among stakeholders” as the biggest challenge when scaling AI initiatives.

“We need talent, we need communication, we need the ethical framework, we need very good data, and so on,” says Lodh. “Those things don’t really go away. In fact, they become even more necessary for generative AI, but of course the usages are more varied.” 

AI should augment human decision-making and business workflows. Guardrails with human oversight ensure that enterprise teams have access to AI tools but are in control of high-risk and high-value decisions.

“Bias in AI can creep in from almost anywhere and will do so unless you’re extremely careful. Challenges come into three buckets. You’ve got privacy challenges, data quality, completeness challenges, and then really training AI systems on data that’s biased, which is easily done,” says Sidgreaves. She emphasizes it is vital to ensure that data is up-to-date, accurate, and clean. High-quality data enhances the reliability and performance of AI models. Regular audits and data quality checks can help maintain the integrity of data.
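
As a minimal illustration of the kind of automated data quality checks Sidgreaves describes, the hypothetical sketch below (not drawn from Zurich’s actual tooling) uses pandas to flag missing values, stale records, and skewed label representation before data reaches a model:

```python
# Hypothetical data-quality audit sketch using pandas (not any vendor's actual tooling).
import pandas as pd

def audit_training_data(df: pd.DataFrame, date_col: str, label_col: str) -> dict:
    """Return simple quality signals: completeness, freshness, and label balance."""
    report = {}

    # Completeness: share of missing values per column.
    report["missing_rate"] = df.isna().mean().to_dict()

    # Freshness: how old is the most recent record?
    latest = pd.to_datetime(df[date_col]).max()
    report["days_since_latest_record"] = (pd.Timestamp.now() - latest).days

    # Representation: label distribution, a crude proxy for sampling bias.
    report["label_share"] = df[label_col].value_counts(normalize=True).to_dict()

    return report

# Example usage with toy claims data.
claims = pd.DataFrame({
    "claim_date": ["2024-05-01", "2024-05-03", None],
    "region": ["north", "south", "south"],
    "subrogation_flag": [1, 0, 0],
})
print(audit_training_data(claims, date_col="claim_date", label_col="subrogation_flag"))
```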

An agile approach to AI implementation

ROI is always top of mind for business leaders looking to cash in on the promised potential of AI systems. As technology continues to evolve rapidly and the potential use cases of AI grow, starting small, creating measurable benchmarks, and adopting an agile approach can ensure success in scaling solutions. By starting with pilot projects and scaling successful initiatives, companies can manage risks and optimize resources. Sidgreaves, Szostek, and Lodh stress that while it may be tempting to throw everything at the wall and see what sticks, accessing the greatest returns from expanding AI tools means remaining flexible, strategic, and iterative. 

In insurance, two areas where AI has a significant ROI impact are risk and operational efficiency. Sidgreaves underscores that reducing manual processes is essential for large, heritage organizations, and generative AI and large language models (LLMs) are revolutionizing this aspect by significantly diminishing the need for manual activities.

To illustrate her point, she cites a specific example: “Consider the task of reviewing and drafting policy wording. Traditionally, this process would take an individual up to four weeks. However, with LLMs, this same task can now be completed in a matter of seconds.”  

Lodh adds that establishing ROI at the project’s onset and implementing cross-functional metrics are crucial for capturing a comprehensive view of a project’s impact. For instance, using LLMs for writing code is a great example of how IT and information security teams can collaborate. By running static code analysis on LLM-generated code, these teams can ensure that the code meets security and performance standards.
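
A minimal sketch of that idea, assuming the open-source Bandit security linter is installed (standing in for whatever tooling a given IT or security team actually uses), might score an LLM-generated file by counting the issues static analysis finds:

```python
# Hypothetical sketch: score LLM-generated code with a static security linter.
# Assumes the open-source Bandit tool is installed (pip install bandit).
import json
import subprocess
import sys

def security_findings(path: str) -> int:
    """Run Bandit on a file and return the number of reported issues."""
    result = subprocess.run(
        ["bandit", "-q", "-f", "json", path],
        capture_output=True,
        text=True,
    )
    report = json.loads(result.stdout or "{}")
    return len(report.get("results", []))

if __name__ == "__main__":
    generated_file = sys.argv[1]  # e.g., a file produced by a code-generation model
    issues = security_findings(generated_file)
    print(f"{generated_file}: {issues} static-analysis findings")
    # A team might gate merges on this count as one cross-functional metric.
```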

“It’s very hard because technology is changing so quickly,” says Szostek. “We need to truly apply an agile approach, do not try to prescribe all the elements of the future deliveries in 12, 18, 24 months. We have to test and learn and iterate, and also fail fast if that’s needed.” 

Navigating the future of the AI era 

The rapid evolution of the digital age continues to bring immense opportunities for enterprises globally, from the C-suite to the factory floor. With no shortage of use cases and promises to boost efficiencies, drive innovation, and improve customer and employee experiences, few business leaders dismiss the proliferation of AI as mere hype. However, the successful and responsible implementation of AI requires a careful balance of strategy, transparency, and robust data privacy and security measures.

  • “It’s really easy as technology people to be driven by the next core thing, but we would have to be solving a business problem. So the key is to always ensure you know what value you’re bringing to the business or to the customer with the AI. And actually always ask yourself the question, do we even need AI to solve that problem?” — Alex Sidgreaves, chief data officer, Zurich Insurance

Fully harnessing the power of AI while maintaining trust means defining clear business values, ensuring accountability, managing data privacy, balancing innovation with ethical use, and staying ahead of future trends. Enterprises must remain vigilant and adaptable, committed to ethical practices and an agile approach to thrive in this rapidly changing business landscape.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

The rise of the data platform for hybrid cloud

Whether pursuing digital transformation, exploring the potential of AI, or simply looking to simplify and optimize existing IT infrastructure, today’s organizations must do this in the context of increasingly complex multi-cloud environments. These complicated architectures are here to stay—2023 research by Enterprise Strategy Group, for example, found that 87% of organizations expect their applications to be distributed across still more locations in the next two years.

Scott Sinclair, practice director at Enterprise Strategy Group, outlines the problem: “Data is becoming more distributed. Apps are becoming more distributed. The typical organization has multiple data centers, multiple cloud providers, and umpteen edge locations. Data is all over the place and continues to be created at a very rapid rate.”

Finding a way to unify this disparate data is essential. In doing so, organizations must balance the explosive growth of enterprise data; the need for an on-premises, cloud-like consumption model to mitigate cyberattack risks; and continual pressure to cut costs and improve performance.

Sinclair summarizes: “What you want is something that can sit on top of this distributed data ecosystem and present something that is intuitive and consistent that I can use to leverage the data in the most impactful way, the most beneficial way to my business.”

For many, the solution is an overarching software-defined, virtualized data platform that delivers a common data plane and control plane across hybrid cloud environments. Ian Clatworthy, head of data platform product marketing at Hitachi Vantara, describes a data platform as “an integrated set of technologies that meets an organization’s data needs, enabling storage and delivery of data, the governance of data, and the security of data for a business.”

Gartner projects that these consolidated data storage platforms will constitute 70% of file and object storage by 2028, doubling from 35% in 2023. The research firm underscores that “Infrastructure and operations leaders must prioritize storage platforms to stay ahead of business demands.”

A transitional moment for enterprise data

Historically, organizations have stored their various types of data—file, block, object—in separate silos. Why change now? Because two main drivers are rendering traditional data storage schemes inadequate for today’s business needs: digital transformation and AI.

As digital transformation initiatives accelerate, organizations are discovering that having distinct storage solutions for each workload is inadequate for their escalating data volumes and changing business landscapes. The complexity of the modern data estate hinders many efforts toward change.

Clatworthy says that when organizations move to hybrid cloud environments, they may find, for example, that they have mainframe or data center data stored in one silo, block storage running on an appliance, apps running file storage, another silo for public cloud, and a separate VMware stack. The result is increased complexity and cost in their IT infrastructure, as well as reduced flexibility and efficiency.

Then, Clatworthy adds, “When we get to the world of generative AI that’s bubbling around the edges, and we’re going to have this mass explosion of data, we need to simplify how that data is managed so that applications can consume it. That’s where a platform comes in.”

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Advancing to adaptive cloud

For many years now, cloud solutions have helped organizations streamline their operations, increase their scalability, and reduce costs. Yet, enterprise cloud investment has been fragmented, often lacking a coherent organization-wide approach. In fact, it’s not uncommon for various teams across an organization to have spun up their own cloud projects, adopting a wide variety of cloud strategies and providers, from public and hybrid to multi-cloud and edge computing.

The problem with this approach is that it often leads to “a sprawling set of systems and disparate teams working on these cloud systems, making it difficult to keep up with the pace of innovation,” says Bernardo Caldas, corporate vice president of Azure Edge product management at Microsoft. In addition to being an IT headache, a fragmented cloud environment leads to technological and organizational repercussions.

A complex multi-cloud deployment can make it difficult for IT teams to perform mission-critical tasks, such as applying security patches, meeting regulatory requirements, managing costs, and accessing data for data analytics. Configuring and securing these types of environments is a challenging and time-consuming task. And ad hoc cloud deployments often culminate in systems incompatibility when one-off pilots are ready to scale or be combined with existing products.

Without a common IT operations and application development platform, teams can’t share lessons learned or pool important resources, which tends to cause them to become increasingly siloed. “People want to do more with their data, but if their data is trapped and isolated in these different systems, it can make it really hard to tap into the data for insights and to accelerate progress,” says Caldas.

As the pace of change accelerates, however, many organizations are adopting a new adaptive cloud approach—one that will enable them to respond quickly to evolving consumer demands and market fluctuations while simplifying the management of their complex cloud environments.

An adaptive strategy for success

Heralding a departure from yesteryear’s fragmented cloud environments, an adaptive cloud approach unites sprawling systems, disparate silos, and distributed sites into a single operations, development, security, application, and data model. This unified approach empowers organizations to glean value from cloud-native technologies, open source software such as Linux, and AI across hybrid, multi-cloud, edge, and IoT.

“You’ve got a lot of legacy software out there, and for the most part, you don’t want to change production environments,” says David Harmon, director of software engineering at AMD. “Nobody wants to change code. So while CTOs and developers really want to take advantage of all the hardware changes, they want to do nothing to their code base if possible, because that change is very, very expensive.”

An adaptive cloud approach answers this challenge by taking an agnostic approach to the environments it brings together on a single control plane. By seamlessly collecting disparate computing environments, including those that run outside of hyperscale data centers, the control plane creates greater visibility across thousands of assets, simplifies security enforcement, and allows for easier management.

An adaptive cloud approach enables unified management of disparate systems and resources, leading to improved oversight and control. An adaptive approach also creates scalability, as it allows organizations to meet the fluctuating demands of a business without the risk of over-provisioning or under-provisioning resources.

There are also clear business advantages to embracing an adaptive cloud approach. Consider, for example, an operational technology team that deploys an automation system to accelerate a factory’s production capabilities. In a fragmented and distributed environment, systems often struggle to communicate. But in an adaptive cloud environment, a factory’s automation system can easily be connected to the organization’s customer relationship management system, providing sales teams with real-time insights into supply-demand fluctuations.

A united platform is not only capable of bringing together disparate systems but also of connecting employees from across functions, from sales to engineering. By sharing an interconnected web of cloud-native tools, a workforce’s collective skills and knowledge can be applied to initiatives across the organization—a valuable asset in today’s resource-strapped and talent-scarce business climate.

Using cloud-native technologies like Kubernetes and microservices can also expedite the development of applications across various environments, regardless of an application’s purpose. For example, IT teams can scale applications from massive cloud platforms to on-site production without complex rewrites. Together, these capabilities “propel innovation, simplify complexity, and enhance the ability to respond to business opportunities,” says Caldas.
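
As a rough illustration of that portability, the sketch below uses the official Kubernetes Python client to scale a deployment; the call looks the same whether the cluster runs in a hyperscale cloud or on an on-site edge server. The deployment name and namespace here are placeholders.

```python
# Hypothetical sketch: scale a workload with the Kubernetes Python client.
# The same API call works against a managed cloud cluster or an on-premises one;
# only the kubeconfig context changes. Names below are placeholders.
from kubernetes import client, config

def scale_deployment(name: str, namespace: str, replicas: int) -> None:
    """Set the replica count of a deployment in the current kubeconfig context."""
    config.load_kube_config()  # or config.load_incluster_config() inside a pod
    apps = client.AppsV1Api()
    apps.patch_namespaced_deployment_scale(
        name=name,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},
    )

if __name__ == "__main__":
    # e.g., handle a demand spike on the factory-floor cluster without a rewrite
    scale_deployment(name="order-service", namespace="production", replicas=5)
```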

The AI equation

From automating mundane processes to optimizing operations, AI is revolutionizing the way businesses work. In fact, the market for AI reached $184 billion in 2024, an increase of nearly $50 billion over 2023, and it is expected to surpass $826 billion in 2030.

But AI applications and models require high-quality data to generate high-quality outputs. That’s a challenging feat when data sets are trapped in silos across distributed environments. Fortunately, an adaptive cloud approach can provide a unified data platform for AI initiatives.

“An adaptive cloud approach consolidates data from various locations in a way that’s more useful for companies and creates a robust foundation for AI applications,” says Caldas. “It creates a unified data platform that ensures that companies’ AI tools have access to high-quality data to make decisions.”

Another benefit of an adaptive cloud approach is the ability to tap into the capabilities of innovative tools such as Microsoft Copilot in Azure. Copilot in Azure is an AI companion that simplifies how IT teams operate and troubleshoot apps and infrastructure. By leveraging large language models to interact with an organization’s data, Copilot allows for deeper exploration and intelligent assessment of systems within a unified management framework.

Imagine, for example, the task of troubleshooting the root cause of a system anomaly. Typically, IT teams must sift through thousands of logs, exchange a series of emails with colleagues, and read documentation for answers. Copilot in Azure, however, can cut through this complexity by easing anomaly detection of unanticipated system changes while, at the same time, providing recommendations for speedy resolution.
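
The underlying task (spotting an unexpected change in a sea of telemetry) can be pictured with a simple statistical sketch. The example below is purely illustrative and does not call Copilot in Azure or any Azure API; it just flags hourly error counts that sit far from the recent norm.

```python
# Illustrative only: flag anomalous spikes in hourly error counts using a z-score.
# This is a toy stand-in for the kind of anomaly an assistant might surface;
# it does not call Copilot in Azure or any Azure API.
from statistics import mean, stdev

def anomalous_hours(error_counts: list[int], threshold: float = 3.0) -> list[int]:
    """Return indices of hours whose error count deviates strongly from the average."""
    mu = mean(error_counts)
    sigma = stdev(error_counts)
    if sigma == 0:
        return []
    return [i for i, count in enumerate(error_counts)
            if abs(count - mu) / sigma > threshold]

# Example: a quiet day with one unexplained spike at hour 7.
hourly_errors = [12, 9, 11, 10, 8, 13, 11, 97, 10, 12, 9, 11]
print(anomalous_hours(hourly_errors))  # -> [7]
```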

“Organizations can now interact with systems using chat capabilities, ask questions about environments, and gain real insights into what’s happening across the heterogeneous environments,” says Caldas.

An adaptive approach for the technology future

Today’s technology environments are only increasing in complexity. More systems, more data, more applications—together, they form a massive sprawling infrastructure. But responding proactively to change, be it in market trends or customer needs, requires greater agility and integration across the organization. The answer: an adaptive approach. A unified platform for IT operations and management, applications, data, and security can consolidate the disparate parts of a fragmented environment in ways that not only ease IT management and application development but also deliver key business benefits, from faster time to market to AI efficiencies, at a time when organizations must move swiftly to succeed.

Microsoft Azure and AMD meet you where you are on your cloud journey. Learn more about an adaptive cloud approach with Azure.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

A playbook for crafting AI strategy

Giddy predictions about AI, from its contributions to economic growth to the onset of mass automation, are now as frequent as the release of powerful new generative AI models. The consultancy PwC, for example, predicts that AI could boost global gross domestic product (GDP) by 14% by 2030, generating US $15.7 trillion.

Forty percent of our mundane tasks could be automated by then, claim researchers at the University of Oxford, while Goldman Sachs forecasts US $200 billion in AI investment by 2025. “No job, no function will remain untouched by AI,” says SP Singh, senior vice president and global head, enterprise application integration and services, at technology company Infosys.

While these prognostications may prove true, today’s businesses are finding major hurdles when they seek to graduate from pilots and experiments to enterprise-wide AI deployment. Just 5.4% of US businesses, for example, were using AI to produce a product or service in 2024.

Moving from initial forays into AI use, such as code generation and customer service, to firm-wide integration depends on strategic and organizational transitions in infrastructure, data governance, and supplier ecosystems. As well, organizations must weigh uncertainties about developments in AI performance and how to measure return on investment.

If organizations seek to scale AI across the business in coming years, however, now is the time to act. This report explores the current state of enterprise AI adoption and offers a playbook for crafting an AI strategy, helping business leaders bridge the chasm between ambition and execution. Key findings include the following:

AI ambitions are substantial, but few have scaled beyond pilots. Fully 95% of companies surveyed are already using AI, and 99% expect to in the future. But few organizations have graduated beyond pilot projects: 76% have deployed AI in just one to three use cases. Because half of companies expect to fully deploy AI across all business functions within two years, however, this year is key to establishing foundations for enterprise-wide AI.

AI readiness spending is slated to rise significantly. Overall, AI spending in 2022 and 2023 was modest or flat for most companies, with only one in four increasing their spending by more than a quarter. That is set to change in 2024, with nine in ten respondents expecting to increase AI spending on data readiness (including platform modernization, cloud migration, and data quality) and in adjacent areas like strategy, cultural change, and business models. Four in ten expect to increase spending by 10 to 24%, and one-third expect to increase spending by 25 to 49%.

Data liquidity is one of the most important attributes for AI deployment. The ability to seamlessly access, combine, and analyze data from various sources enables firms to extract relevant information and apply it effectively to specific business scenarios. It also eliminates the need to sift through vast data repositories, as the data is already curated and tailored to the task at hand.

Data quality is a major limitation for AI deployment. Half of respondents cite data quality as the most limiting data issue in deployment. This is especially true for larger firms with more data and substantial investments in legacy IT infrastructure. Companies with revenues of over US $10 billion are the most likely to cite both data quality and data infrastructure as limiters, suggesting that organizations presiding over larger data repositories find the problem substantially harder.

Companies are not rushing into AI. Nearly all organizations (98%) say they are willing to forgo being the first to use AI if that ensures they deliver it safely and securely. Governance, security, and privacy are the biggest brake on the speed of AI deployment, cited by 45% of respondents (and a full 65% of respondents from the largest companies).

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Reimagining cloud strategy for AI-first enterprises

The rise of generative artificial intelligence (AI), natural language processing, and computer vision has sparked lofty predictions: AI will revolutionize business operations, transform the nature of knowledge work, and boost companies’ bottom lines and the larger global economy by trillions of dollars.

Executives and technology leaders are eager to see these promises realized, and many are enjoying impressive results of early AI investments. Balakrishna D.R. (Bali), executive vice president, global services head, AI and industry verticals at Infosys, says that generative AI is already proving game-changing for tasks such as knowledge management, search and summarization, software development, and customer service across sectors such as financial services, retail, health care, and automotive.

Realizing AI’s full potential on a mass scale will require more than just executives’ enthusiasm; becoming a truly AI-first enterprise will require a significant, sustained investment in cloud infrastructure and strategy. In 2024, the cloud has evolved beyond its initial purpose as a storage tool and cost saver to become a crucial driver of innovation, transformation, and disruption. Now, with AI in the mix, enterprises are looking to the cloud to support large language models (LLMs) to maximize R&D performance and prevent cybersecurity attacks, among other high-impact use cases.

A 2023 report by Infosys looks at how prepared companies are to realize the combined potential of cloud and AI. To further assess this state of readiness, MIT Technology Review Insights and Infosys surveyed 500 business leaders across industries such as IT, manufacturing, financial services, and consumer goods about how their organizations are thinking about—and acting upon—an integrated cloud and AI strategy.

This research found that most companies are still experimenting and preparing their infrastructure landscape for AI from a cloud perspective—and many are planning additional investments to accelerate their progress.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Building supply chain resilience with AI

If the last five years have taught businesses with complex supply chains anything, it is that resilience is crucial. In the first three months of the covid-19 pandemic, for example, supply-chain leader Amazon grew its business 44%. Its investments in supply chain resilience allowed it to deliver when its competitors could not, says Sanjeev Maddila, worldwide head of supply chain solutions at Amazon Web Services (AWS), increasing its market share and driving profits up 220%. A resilient supply chain ensures that a company can meet its customers’ needs despite inevitable disruption.

Today, businesses of all sizes must deliver to their customers against a backdrop of supply chain disruptions, with technological changes, shifting labor pools, geopolitics, and climate change adding new complexity and risk at a global scale. To succeed, they need to build resilient supply chains: fully digital operations that prioritize customers and their needs while establishing a fast, reliable, and sustainable delivery network.

The Canadian fertilizer company Nutrien, for example, operates two dozen manufacturing and processing facilities spread across the globe and nearly 2,000 retail stores in the Americas and Australia. To collect underutilized data from its industrial operations, and gain greater visibility into its supply chain, the company relies on a combination of cloud technology and artificial intelligence/machine learning (AI/ML) capabilities.

“A digital supply chain connects us from grower to manufacturer, providing visibility throughout the value chain,” says Adam Lorenz, senior director for strategic fleet and indirect procurement at Nutrien. This visibility is critical when it comes to navigating the company’s supply chain challenges, which include seasonal demands, weather dependencies, manufacturing capabilities, and product availability. The company requires real-time visibility into its fleets, for example, to identify the location of assets, see where products are moving, and determine inventory requirements.

Currently, Nutrien can locate a fertilizer or nutrient tank in a grower’s field and determine what Nutrien products are in it. By achieving that “real-time visibility” into a tank’s location and a customer’s immediate needs, Lorenz says the company “can forecast where assets are from a fill-level perspective and plan accordingly.” In turn, Nutrien can respond immediately to emerging customer needs, increasing company revenue while enhancing customer satisfaction, improving inventory management, and optimizing supply chain operations.

“For us, it’s about starting with data creation and then adding a layer of AI on top to really drive recommendations,” says Lorenz. In addition to improving product visibility and asset utilization, Lorenz says that Nutrien plans to add AI capabilities to its collaboration platforms that will make it easier for less-tech-savvy customers to take advantage of self-service capabilities and automation that accelerates processes and improves compliance with complex policies.
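
A toy version of that pattern (telemetry first, then a simple predictive layer to drive recommendations) might look like the sketch below. The tank data and thresholds are invented for illustration and do not reflect Nutrien’s actual systems.

```python
# Illustrative only: forecast days until a tank runs low from fill-level telemetry
# and recommend which tanks to refill first. Data and thresholds are invented.
from dataclasses import dataclass

@dataclass
class TankReading:
    tank_id: str
    fill_pct: float        # current fill level, 0-100
    daily_draw_pct: float  # average consumption per day, estimated from telemetry

def days_until_low(reading: TankReading, low_threshold: float = 20.0) -> float:
    """Estimate days until the tank drops below the refill threshold."""
    if reading.daily_draw_pct <= 0:
        return float("inf")
    return max(0.0, (reading.fill_pct - low_threshold) / reading.daily_draw_pct)

def refill_plan(readings: list[TankReading], horizon_days: float = 7.0) -> list[str]:
    """Recommend tanks to service within the planning horizon, most urgent first."""
    urgent = [(days_until_low(r), r.tank_id) for r in readings
              if days_until_low(r) <= horizon_days]
    return [tank_id for _, tank_id in sorted(urgent)]

readings = [
    TankReading("field-A-01", fill_pct=64.0, daily_draw_pct=3.1),
    TankReading("field-B-07", fill_pct=31.0, daily_draw_pct=4.5),
    TankReading("field-C-02", fill_pct=88.0, daily_draw_pct=1.2),
]
print(refill_plan(readings))  # -> ['field-B-07'] with these invented numbers
```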

To meet and exceed customer expectations with differentiated service, speed, and reliability, all companies need to similarly modernize their supply chain operations. The key to doing so—and to increasing organizational resilience and sustainability—will be applying AI/ML to their extensive operational data in the cloud.

Resilience as a business differentiator

Like Nutrien, a wide variety of organizations from across industries are discovering the competitive advantages of modernizing their supply chains. A pharmaceutical company that aggregates its supply chain data for greater end-to-end visibility, for example, can provide better product tracking for critically ill customers. A retail startup undergoing meteoric growth can host its workloads in the cloud to support sudden upticks in demand while minimizing operating costs. And a transportation company can achieve inbound supply chain savings by evaluating the total distance its fleet travels to reduce mileage costs and CO2 emissions.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Transforming the energy industry through disruptive innovation

In the rhythm of our fast-paced lives, most of us don’t stop to think about where electricity comes from or how it powers homes, industries, and the technologies that connect people around the world. As populations and economies grow, energy demands are set to increase by 50% by 2050, challenging century-old energy systems to adapt with innovative and agile solutions. This comes at a time when climate change is making its presence felt more than ever; 2023 marked the warmest year since records began in 1850, crossing the 1.5 degrees global warming threshold.

Nadège Petit of Schneider Electric confronts this challenge head-on, saying, “We have no choice but to change the way we produce, distribute, and consume energy, and do it sustainably to tackle both the energy and climate crises.” She explains further that digital technologies are key to navigating this path, and Schneider Electric’s AI-enabled IoT solutions can empower customers to take control of their energy use, enhancing efficiency and resiliency.

Petit acknowledges the complexity of crafting and implementing robust sustainability strategies. She highlights the importance of taking an incremental stepwise approach, and adopting open standards, to drive near-term impact while laying the foundation for long-term decarbonization goals. 

Because the energy landscape is evolving rapidly, it’s critical to not just keep pace but to anticipate and shape the future. Much like actively managing health through food and fitness regimes, energy habits need to be monitored as well. This can transform passive consumers into energy prosumers: those who produce, consume, and manage energy. Petit’s vision is one where “buildings and homes generate their own energy from renewable sources, use what’s needed, and feed the excess back to the grid.”

To catalyze this transformation, Petit underscores the power of collaboration and innovation. For example, Schneider Electric’s SE Ventures invests in startups to provide new perspectives and capabilities to accelerate sustainable energy solutions. 

“It’s all about striking a balance to ensure that our relationship with startups are mutually beneficial, knowing when to provide guidance and resources when they need it, but also when to step back and allow them to thrive independently,” says Petit. 

This episode of Business Lab is produced in partnership with Schneider Electric. 

Full transcript 

Laurel Ruma: From MIT Technology Review, I’m Laurel Ruma, and this is Business Lab, the show that helps business leaders make sense of new technologies coming out of the lab and into the marketplace. 

Our topic today is disruptive innovation in the energy industry and beyond. We use energy every day. It powers our homes, buildings, economies, and lifestyles, but where it came from or how our use affects the global energy ecosystem is changing, and our energy ecosystem needs to change with it.

 My guest is Nadège Petit, the chief innovation officer at Schneider Electric. 

This podcast is produced in partnership with Schneider Electric. 

Welcome, Nadège. 

Nadège Petit: Hi, everyone. Thank you for having me today. 

Laurel: Well, we’re glad you’re here. 

Let’s start off with a simple question to build that context around our conversation. What is Schneider Electric’s mission? And as the chief innovation officer leading its Innovation at the Edge team, what are some examples of what the team is working on right now? 

Nadège: Let me set up this scene a little bit here. In recent years, our world has been shaped by a series of significant disruptions. The pandemic has driven a sharp increase in the demand of digital tools and technologies, with a projected 6x growth in the number of IoT devices between 2020 and 2030, and a 140x growth in IP traffic between 2020 and 2040. 

Simultaneously, there has been a parallel acceleration in energy demands. Electrical consumption has been increasing by 5,000 terawatt hours every 10 years over the past two decades. This is set to double in the next 10 years and then quadruple by 2040. This is amplified by the most severe energy crisis that we are facing now since the 1970s. Over 80% of carbon emissions are coming from energy, so electrifying the world and decarbonizing [the] energy sector is a must. We cannot overlook the climate crisis while meeting these energy demands. In 2023, the global average temperature was the warmest on record since 1850, surpassing the 1.5 degrees global warming limit. So, we have no choice but to change the way we produce, distribute, and consume energy, and do it sustainably to tackle both the energy and climate crises. This gives us a rare opportunity to reimagine and create a clean energy future we want.

Schneider Electric as an energy management and digital automation company, aims to be the digital partner for sustainability and efficiency for our customers. With end-to-end experience in the energy sector, we are uniquely positioned to help customers digitize, electrify, and deploy sustainable technologies to help them progress toward net-zero. 

As for my role, we know that innovation is pivotal to drive the energy transition. The Innovation at the Edge team leads the way in discovering, developing, and delivering disruptive technologies that will define a more digital, electric, and sustainable energy landscape. We function today as an innovation engine, bridging internal and external innovation, to introduce new solutions, services and businesses to the market. Ultimately, we are crafting the future businesses for Schneider Electric in this sector. And to do this, we nourish a culture that recognizes and celebrates innovation. We welcome new ideas, consider new perspectives inside and outside the organization, and seek out unusual combinations that can kindle revolutionary ideas. We like to think of ourselves as explorers and forces of change, looking for and solving new customer problems. So curiosity and daring to disrupt are in our DNA. And this is the true spirit of Innovation at the Edge at Schneider Electric. 

Laurel: And it’s clear that urgency certainly comes out, especially for enterprises. Because they’re trying to build strong sustainability strategies to not just reach those environmental, social, and governance, or ESG, goals and targets; but also to improve resiliency and efficiency. What’s the role of digital technologies when we think about this all together in enabling a more sustainable future? 

Nadège: We see a sustainable future, and our goal is to enable the shift to an all-electric and all-digital world. That kind of transition isn’t possible without digital technology. We see digital as a key enabler of sustainability and decarbonization. The technology is already available now, it’s a matter of acceleration and adoption of it. And all of us, we have a role to play here. 

At Schneider Electric, we have built a suite of solutions that enable customers to accelerate their sustainability journey. Our flagship suite of IoT-enabled solution infrastructure empowers customers to monitor energy, carbon, and resource usage, enabling them to implement strategies for efficiency, optimization, and resiliency. We have seen remarkable success stories of clients leveraging our digital EcoStruxure solution in buildings, utilities, data centers, hospitality, healthcare, and more, all over the place. If I were to take one example, I can take the example of our customer PG&E, a leading California utility that everybody knows; they are using our EcoStruxure distributed energy resources management system, we call it DERMS, to manage grid reliability more effectively, which is crucial in the face of extreme weather events impacting the grid and consumers.

Schneider has also built an extensive ecosystem of partners because we do need to do it at scale together to accelerate digital transformation for customers. We also invest in cutting-edge technologies that make need-based collaboration and co-innovation possible. It’s all about working together towards one common goal. Ultimately the companies that embrace digital transformation will be the ones that will thrive on disruption. 

Laurel: It’s clear that building a strong sustainability strategy and then following through on the implementation does take time, but addressing climate change requires immediate action. How does your team at Schneider Electric as a whole work to balance those long-term commitments and act with urgency in the short term? It sounds like that internal and external innovation opportunity really could play a role here. 

Nadège: Absolutely. You’re absolutely right. We already have many of the technologies that will take us to net-zero. For example, 70% of CO2 emissions can be removed with existing technologies. By deploying electrification and digital solutions, we can get to our net-zero goals much faster. We know it’s a gradual process and as you already discussed previously, we do need to accelerate the adoption of it. By taking an incremental stepwise approach, we can drive near-term impact while laying the foundation for long-term decarbonization goals. 

Building on the same example of PG&E, which I referenced earlier; through our collaboration, piece by piece progressively, we are building the backbone of a sustainable, digitized, and reliable energy future in California with the deployment of EcoStruxure DERMS. As grid reliability and flexibility become more important, DERMS enable us to keep pace with 21st-century grid demands as they evolve. 

Another critical component of moving fast is embracing open systems and platforms, creating an interoperable ecosystem. By adopting open standards, you empower a wide range of experts to collaborate together, including startups, large organizations, senior decision-makers, and those on the ground. This future-proof investment ensures flexible and scalable solutions that avoid expensive upgrades and obsolescence in the future. That is why at Innovation at the Edge we’re creating a win-win partnership to push market adoption of the innovative technology available today, while laying the foundation of an even more innovative tomorrow. Innovation at the Edge today provides the space to nurture those ideas, collaborate together, iterate, learn, and grow at pace.

Laurel: What’s your strategy for investing in, and then adopting those disruptive technologies and business models, especially when you’re trying to build that kind of innovation for tomorrow? 

Nadège: I strongly believe innovation is a key driver of the energy transition. It’s very hard to create the right conditions for consistent innovation, as we discuss short-term and long-term. I want to quote again the famous book from Clayton Christensen, The Innovator’s Dilemma, about how big organizations can get so good at what they are already doing that they struggle to adapt as the market changes. And we are in this dilemma. So we do need to stay ahead. Leaders need to grasp disruptive technology, put customers first, foster innovation, and tackle emerging challenges head on. The phrase “that’s no longer how we do it” really resonates with me as I look at the role of innovation in the energy space.

At Schneider, innovation is more than just a buzzword. It’s our strategy for navigating the energy transition. We are investing in truly new and disruptive ideas, tech, and business models, taking the risk and the challenge. We complement our current offering constantly, and we include the new prosumer business that we’re building, and this is pivotal to accelerate the energy transition. We foster open innovation through investment and incubation of cutting-edge technology in energy management, electrical mobility, industrial automation, cybersecurity, artificial intelligence, sustainability, and other topics that will help to go through this innovation. I also can quote some joint ventures that we are creating with partners like GreenStruxure or AlphaStruxure. Those are offering energy-as-a-service solutions, so a new business model enabling organizations to leverage existing technology to achieve decarbonization at scale. As an example, GreenStruxure is helping Bimbo Bakeries move closer to net-zero with micro-grid systems at six of their locations. This will provide 20% of Bimbo Bakeries USA’s energy usage and save an estimated 1,700 tons of CO2 emissions per year.

Laurel: Yeah, that’s certainly remarkable. Following up on that, how does Schneider Electric define prosumer and how does that audience actually fit into Schneider Electric’s strategy when you’re trying to develop these new models? 

Nadège: Prosumer is my favorite word. Let’s redefine it again. Everybody’s speaking of prosumer, but what is prosumer? Prosumer refers to consumers that are actively involved in energy management; producing and consuming their own energy using technologies like solar panels, EV chargers, EV batteries, and EV storage. This is all digitally enabled. So everybody now, the customers, industrial customers, want to understand their energy. So becoming a prosumer comes with perks like lower energy bills. Fantastic, right? Increased independence, clean energy use, and potential compensation from utility providers. It’s beneficial to all of us; it’s beneficial to our planet, it’s beneficial to the decarbonization of the world. Imagine a future where buildings and homes generate their own energy from renewable sources, use what’s needed, and feed the excess back to the grid. This is a fantastic opportunity, and the interest in this is massive.

To give you some figures; in 2019 we saw 100 gigawatts of new solar PV capacities deployed globally, and by last year this number had nearly quadrupled. So transformation is happening now. Electric vehicles, as an example, their sales have been soaring too, with a projected 14 million sales by 2023, six times the 2019 number. These technologies are already making a dent in emissions and the energy crisis. 

However, the journey to become a prosumer is complex. It’s all about scale and adoption, and it involves challenges with asset integration, grid modernization, regulatory compliance. So we are all part of this ecosystem, and it takes a lot of leadership to make it happen. So at Innovation at the Edge, we’re creating an ecosystem of solutions to streamline the prosumer journey from education and management to purchasing, installation, management, and maintenance of these new distributed resources. What we are doing, we are bringing together internal innovations that we already have in-house at Schneider Electric, like micro-grid, EV charging solutions, battery storage, and more with external innovation from portfolio companies. I can quote companies like Qmerit, EnergySage, EV Connect, Uplight, and AutoGrid, and we deliver end-to-end solutions from grid to prosumer. 

I want to insist one more time, it’s very important to accelerate and to be part of this accelerated adoption. These efforts are not just about strengthening our business, they’re about simplifying the energy ecosystem and moving the industry toward greater sustainability. It’s a collaborative journey that’s shaping the future of energy, and I’m very excited about this. 

Laurel: Focusing on that kind of urgency, innovation in large companies can be hampered by bureaucracy and go slow. What are some best practices for innovation without all of those delays? 

Nadège: Schneider Electric, we are not strangers to innovation, specifically in the energy management and industrial automation space. But to really push the envelope, we look beyond our walls for fresh ideas and expertise. And this is where SE Ventures comes in. It’s our one-billion-euro venture capital fund, from which we make bold bets and bring disruptive ideas to life by supporting and investing in startups that complement our current offering and explore future business. So based in Silicon Valley, but with a global reach, SE Ventures leverages our market knowledge and customer proximity to drive near-term value and commercial relationships with our businesses, customers, and partners. 

We also focus on partnership and incubation. So through partnerships with startups, we accelerate time to market. We accelerate the R&D roadmap and explore new products, new markets with startups. When it comes to incubation, we seek out game-changing ideas and entrepreneurs. We are providing mentorship, resources, and market insight at every stage of their journey. As an example, we also invested in funds like E14, the fund that started out at MIT Media Lab, to gain early insight into disruptive trends and technology. It’s very important to be early-stage here. 

So SE Ventures has successfully today developed multiple unicorns in our portfolio. We’re working with several other high-growth companies, targeted to become future unicorns in key strategic areas. That is totally consistent with Schneider’s mission. 

It’s all about striking a balance to ensure that our relationship with startups are mutually beneficial, knowing when to provide guidance and resources when they need it, but also when to step back and allow them to thrive independently. 

Laurel: With that future lens on, what kind of trends or developments in the energy industry are you seeing, and how are you preparing for them? Are you getting a lot of that kind of excitement from those startups and venture fund ideas? 

Nadège: Yeah, absolutely. There are multiple trends. You need to listen to startups, to innovators, to people coming up with bold ideas. I want to highlight a couple of those. The energy industry is set to see major shifts. We know it, and we want to be part of it. We discussed prosumers. Prosumer is something very important. A lot of people now understand their body, doing exercises, monitoring it; tomorrow, people will all monitor their energy. Those are prosumers. We believe that prosumers, that’s individuals and businesses, they’re central to the energy transition. And this is a key focal point for us.

Another trend that we also discuss is digital and also AI. AI has the potential to be transformative as we build the new energy landscape. One example is AI-powered virtual power plants, or what we call VPP, that can optimize a large portfolio of distributed energy resources to ensure greater grid resiliency. Increasingly, AI can be at the heart of the modern electrical grid. So at Schneider Electric, we are watching those trends very carefully. We are listening to the external world, to our customers, and we are showing that we are positioning our solution and global hubs to best serve the needs of our customers. 

Laurel: Lastly, as a woman in a leadership position, could you tell us how you’ve navigated your career so far, and how others in the industry can create a more diverse and inclusive environment within their companies and teams? 

Nadège: An inclusive environment starts with us as leaders. Establishing a culture where we value differences, different opinions, believe in equal opportunity for everyone, and foster a sense of belonging, is something very important in this environment. It’s also important for organizations to create commitments around diversity, equity, and inclusion, and communicate them publicly so it drives accountability, and report on the progress and how we make it happen. 

I was truly fortunate to have started and grown my career at a company like Schneider Electric where I was surrounded by people who empowered me to be my best self. This is something that should drive all women to be the best of themselves. It wasn’t always easy. I have learned how important it is to have a voice and to be bold, to speak up for what you are passionate about, and to use that passion to drive impact. These are values I also work to instill in my own teenage daughters, and I’m thrilled to see them finding their own passion within STEM. So the next generation is the driving force in shaping a more sustainable world, and it’s crucial that we focus on leaving the planet a better and more equal place where they can thrive.

Laurel: Words to the wise. Thank you so much Nadege for joining us today on the Business Lab. 

Nadège: Thank you. 

Laurel: That was Nadège Petit, the chief innovation officer at Schneider Electric, who I spoke with from Cambridge, Massachusetts, the home of MIT and MIT Technology Review. 


That’s it for this episode of Business Lab. I’m your host, Laurel Ruma. I’m the global director of Insights, the custom publishing division of MIT Technology Review. We were founded in 1899 at the Massachusetts Institute of Technology, and you can find us in print, on the web, and at events each year around the world. For more information about us and the show, please check out our website at technologyreview.com. 

This show is available wherever you get your podcasts. If you enjoyed this episode, we hope you’ll take a moment to rate and review us. Business Lab is a production of MIT Technology Review. This episode was produced by Giro Studios. Thanks for listening. 

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Unlocking secure, private AI with confidential computing

All of a sudden, it seems that AI is everywhere, from executive assistant chatbots to AI code assistants.

But despite the proliferation of AI in the zeitgeist, many organizations are proceeding with caution. This is due to the perception of the security quagmires AI presents. For the emerging technology to reach its full potential, data must be secured through every stage of the AI lifecycle, including model training, fine-tuning, and inferencing.

This is where confidential computing comes into play. Vikas Bhatia, head of product for Azure Confidential Computing at Microsoft, explains the significance of this architectural innovation: “AI is being used to provide solutions for a lot of highly sensitive data, whether that’s personal data, company data, or multiparty data,” he says. “Confidential computing is an emerging technology that protects that data when it is in memory and in use. We see a future where model creators who need to protect their IP will leverage confidential computing to safeguard their models and to protect their customer data.”

Understanding confidential computing

“The tech industry has done a great job in ensuring that data stays protected at rest and in transit using encryption,” Bhatia says. “Bad actors can steal a laptop and remove its hard drive but won’t be able to get anything out of it if the data is encrypted by security features like BitLocker. Similarly, nobody can run away with data in the cloud. And data in transit is secure thanks to HTTPS and TLS, which have long been industry standards.”

But data in use, when data is in memory and being operated upon, has typically been harder to secure. Confidential computing addresses this critical gap—what Bhatia calls the “missing third leg of the three-legged data protection stool”—via a hardware-based root of trust.

Essentially, confidential computing ensures the only thing customers need to trust is the data running inside of a trusted execution environment (TEE) and the underlying hardware. “The concept of a TEE is basically an enclave, or I like to use the word ‘box.’ Everything inside that box is trusted, anything outside it is not,” explains Bhatia.

Until recently, confidential computing only worked on central processing units (CPUs). However, NVIDIA has recently brought confidential computing capabilities to the H100 Tensor Core GPU and Microsoft has made this technology available in Azure. This has the potential to protect the entire confidential AI lifecycle—including model weights, training data, and inference workloads.

“Historically, devices such as GPUs were controlled by the host operating system, which, in turn, was controlled by the cloud service provider,” notes Krishnaprasad Hande, technical program manager at Microsoft. “So, in order to meet confidential computing requirements, we needed technological improvements to reduce trust in the host operating system, i.e., its ability to observe or tamper with application workloads when the GPU is assigned to a confidential virtual machine, while retaining sufficient control to monitor and manage the device. NVIDIA and Microsoft have worked together to achieve this.”

Attestation mechanisms are another key component of confidential computing. Attestation allows users to verify the integrity and authenticity of the TEE, and the user code within it, ensuring the environment hasn’t been tampered with. “Customers can validate that trust by running an attestation report themselves against the CPU and the GPU to validate the state of their environment,” says Bhatia.
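
As a rough illustration of how an application might act on an attestation report, here is a hypothetical Python sketch. The endpoint URL, claim names, and expected values are invented for the example; they are not the Azure Attestation API, and a real check would also verify the report’s signature against the hardware vendor’s certificate chain.

```python
# Hypothetical attestation check; the URL and claim names are invented
# for illustration and do not reflect a real attestation service's API.
import json
import urllib.request

ATTESTATION_URL = "https://attestation.example.com/report"  # hypothetical endpoint

def fetch_attestation_report() -> dict:
    """Ask the (hypothetical) attestation service for a signed report
    describing the CPU and GPU TEE this workload is running in."""
    with urllib.request.urlopen(ATTESTATION_URL) as resp:
        return json.load(resp)

def is_environment_trusted(report: dict, expected_measurement: str) -> bool:
    """Gate the workload on the report's claims: the right TEE type, the
    code measurement we expect, and debug mode disabled. A production
    check would also validate the report's signature chain."""
    claims = report.get("claims", {})
    return (
        claims.get("tee_type") == "confidential-vm-with-gpu"
        and claims.get("code_measurement") == expected_measurement
        and claims.get("debug_mode") is False
    )
```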

Additionally, secure key management systems play a critical role in confidential computing ecosystems. “We’ve extended our Azure Key Vault with Managed HSM service which runs inside a TEE,” says Bhatia. “The keys get securely released inside that TEE such that the data can be decrypted.”
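
The toy sketch below ties the two ideas together: a key is released only to a workload whose attestation claims satisfy a release policy, and decryption then happens inside the TEE. The `ToyKeyVault` class, its policy, and the claim names are illustrative stand-ins, not the Azure Key Vault Managed HSM API.

```python
# Toy illustration of attestation-gated key release; not the Azure Key
# Vault API. Assumes the third-party `cryptography` package.
from cryptography.fernet import Fernet

class ToyKeyVault:
    """Releases a data key only to workloads whose attestation claims
    satisfy the key's release policy."""

    def __init__(self) -> None:
        self._keys = {"customer-data-key": Fernet.generate_key()}
        self._release_policy = {"tee_type": "confidential-vm-with-gpu",
                                "debug_mode": False}

    def release_key(self, key_name: str, attestation_claims: dict) -> bytes:
        for claim, required in self._release_policy.items():
            if attestation_claims.get(claim) != required:
                raise PermissionError("attestation claims do not satisfy the release policy")
        return self._keys[key_name]

# Inside the TEE: present attestation claims, receive the key, use it.
vault = ToyKeyVault()
key = vault.release_key("customer-data-key",
                        {"tee_type": "confidential-vm-with-gpu", "debug_mode": False})
ciphertext = Fernet(key).encrypt(b"sensitive training record")  # normally stored encrypted
plaintext = Fernet(key).decrypt(ciphertext)                     # decrypted only inside the TEE
```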

Confidential computing use cases and benefits

GPU-accelerated confidential computing has far-reaching implications for AI in enterprise contexts. It also addresses privacy issues that apply to any analysis of sensitive data in the public cloud. This is of particular concern to organizations trying to gain insights from multiparty data while maintaining utmost privacy.

Another of the key advantages of Microsoft’s confidential computing offering is that it requires no code changes on the part of the customer, facilitating seamless adoption. “The confidential computing environment we’re building does not require customers to change a single line of code,” notes Bhatia. “They can redeploy from a non-confidential environment to a confidential environment. It’s as simple as choosing a particular VM size that supports confidential computing capabilities.”

Some industries and use cases that stand to benefit from confidential computing advancements include:

  • Governments and sovereign entities dealing with sensitive data and intellectual property.
  • Healthcare organizations using AI for drug discovery while preserving doctor-patient confidentiality.
  • Banks and financial firms using AI to detect fraud and money laundering through shared analysis without revealing sensitive customer information.
  • Manufacturers optimizing supply chains by securely sharing data with partners.

Further, Bhatia says confidential computing helps facilitate data “clean rooms” for secure analysis in contexts like advertising. “We see a lot of sensitivity around use cases such as advertising and the way customers’ data is being handled and shared with third parties,” he says. “So, in these multiparty computation scenarios, or ‘data clean rooms,’ multiple parties can merge in their data sets, and no single party gets access to the combined data set. Only the code that is authorized will get access.”
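
The principle behind a clean room can be illustrated with a toy example: each party submits only masked values, and only the authorized aggregate is revealed. The pairwise-masking scheme below is a software stand-in chosen for illustration; it is not how a confidential-computing clean room is implemented, where hardware isolation plays that role, and the party values are invented.

```python
# Toy clean-room aggregation using pairwise masking; an illustrative
# stand-in for the hardware isolation a confidential clean room provides.
import secrets

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def make_pairwise_masks(n_parties: int) -> list[list[int]]:
    """masks[i][j] is party i's mask toward party j, with masks[j][i]
    chosen so each pair cancels out in the final sum."""
    masks = [[0] * n_parties for _ in range(n_parties)]
    for i in range(n_parties):
        for j in range(i + 1, n_parties):
            r = secrets.randbelow(PRIME)
            masks[i][j] = r
            masks[j][i] = (-r) % PRIME
    return masks

party_values = [1200, 860, 410]  # e.g., each party's conversion counts (invented)
masks = make_pairwise_masks(len(party_values))

# Each party submits only a masked value; individually these look random.
submissions = [(v + sum(masks[i])) % PRIME for i, v in enumerate(party_values)]

# The aggregate computation sums the submissions; the masks cancel, so only
# the authorized result (here, a total) is revealed, never any single input.
total = sum(submissions) % PRIME
assert total == sum(party_values)
```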

The current state—and expected future—of confidential computing

Although large language models (LLMs) have captured attention in recent months, enterprises have found early success with a scaled-down approach: small language models (SLMs), which are more efficient and less resource-intensive for many use cases. “We can see some targeted SLM models that can run in early confidential GPUs,” notes Bhatia.

This is just the start. Microsoft envisions a future that will support larger models and expanded AI scenarios—a progression that could see AI in the enterprise become less of a boardroom buzzword and more of an everyday reality driving business outcomes. “We’re starting with SLMs and adding in capabilities that allow larger models to run using multiple GPUs and multi-node communication. Over time, [the goal is] that even the largest models the world might come up with could run in a confidential environment,” says Bhatia.

Bringing this to fruition will be a collaborative effort. Partnerships among major players like Microsoft and NVIDIA have already propelled significant advancements, and more are on the horizon. Organizations like the Confidential Computing Consortium will also be instrumental in advancing the underpinning technologies needed to make widespread and secure use of enterprise AI a reality.

“We’re seeing a lot of the critical pieces fall into place right now,” says Bhatia. “We don’t question today why something is HTTPS. That’s the world we’re moving toward [with confidential computing], but it’s not going to happen overnight. It’s certainly a journey, and one that NVIDIA and Microsoft are committed to.”

Microsoft Azure customers can start on this journey today with Azure confidential VMs with NVIDIA H100 GPUs.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.

Housetraining robot dogs: How generative AI might change consumer IoT

As technology goes, the internet of things (IoT) is old: internet-connected devices outnumbered people on Earth around 2008 or 2009, according to a contemporary Cisco report. IoT has grown rapidly since then. By the early 2020s, researchers’ estimates of the number of connected devices ranged anywhere from the low tens of billions to more than 50 billion.

Currently, though, IoT is seeing unusually intense new interest for a long-established technology, even one still experiencing market growth. A sure sign of this buzz is the appearance of acronyms, such as AIoT and GenAIoT, or “artificial intelligence of things” and “generative artificial intelligence of things.”

What is going on? Why now? Examining potential changes to consumer IoT could provide some answers, specifically in the vast range of areas where the technology finds home and personal uses, from smart home controls through smartwatches and other wearables to VR gaming, to name just a handful. The underlying technological changes sparking interest in this area mirror those in IoT as a whole.

Rapid advances converging at the edge

IoT is much more than a huge collection of “things,” such as automated sensing devices and the attached actuators that take limited actions. These devices, of course, play a key role. A recent IDC report estimated that all edge devices, many of them IoT devices, account for 20% of the world’s current data generation.

IoT, however, is much more than these devices: it is a huge technological ecosystem that encompasses and empowers them. This ecosystem is multi-layered, although no single agreed taxonomy exists.

Most analyses will include among the strata:

  • the physical devices themselves (sensors, actuators, and other machines with which these immediately interact);
  • the data generated by these devices;
  • the networking and communication technology used to gather the generated data, send it to other devices or central data stores, and receive information back; and
  • the software applications that draw on such information and other possible inputs, often to suggest or make decisions.

The inherent value of IoT lies not in the data itself but in the capacity to use it to understand what is happening in and around the devices and, in turn, to use these insights, where necessary, to recommend that humans take action or to direct connected devices to do so.
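
A minimal Python sketch of that chain, from device data to insight to action, is shown below; the device, field, and command names are invented for illustration.

```python
# Minimal sketch of the IoT value chain: data -> insight -> action.
# Device, field, and command names are invented for illustration.
from dataclasses import dataclass

@dataclass
class SensorReading:
    device_id: str
    temperature_c: float          # data generated at the edge

def derive_insight(reading: SensorReading, threshold_c: float = 30.0) -> str:
    """Analytics layer: turn raw data into a statement about what is
    happening in and around the device."""
    return "overheating" if reading.temperature_c > threshold_c else "normal"

def act_on_insight(insight: str) -> str:
    """Application layer: recommend or trigger an action, e.g. a command
    sent back down to a connected actuator."""
    return "command:start_cooling_fan" if insight == "overheating" else "command:none"

reading = SensorReading(device_id="greenhouse-7", temperature_c=33.5)
print(act_on_insight(derive_insight(reading)))   # -> command:start_cooling_fan
```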

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff.
