For the Planet: Sustainability Interview Series — Dr. Suku Nair, SMU AT&T Center For Virtualization
Forrester’s sustainability team studies the intersection of technology platforms and sustainability. We spoke with Dr. Suku Nair, one of the most well-respected computer scientists in the nation. He is currently the director of the SMU AT&T Center for Virtualization and a distinguished professor in Southern Methodist University’s Department of Electrical and Computer Engineering.
This fascinating and informative conversation covers edge computing, digital twins, and other topics. Stay tuned for our report, which will explore this balance in more detail later in the year. You can also find my previous interviews in this series.
Abhijit: My work reveals that sustainability is synonymous with optimization. Any process that you iteratively optimize yields more efficiency and thus more performance, which moves you toward sustainability. What is your view of this?
Dr. Nair: I approach sustainability and being green from an engineering perspective. Engineers always strive to achieve maximum results using less. When we speak about reliability, security, or any other design topic, we try to find ways to optimize performance through better design. There are many arguments for why network connectivity and security should be improved. The idea is that while you may have the resources today to pay your rent, tomorrow you might not. This was evident in many cases during the pandemic. Optimization is the foundation of engineering, and this is what excites and motivates me about sustainability. Engineers have thought about it throughout history. Even though many technologies were created for different purposes, I believe that optimization will improve some sustainability-related metrics.
Abhijit: Where do you see more room for optimization in the infrastructure stack? In a data center, for instance?
Dr. Nair: When we talk about digitization, the digital economy, and digital services, everything runs through data centers. The cloud is also nothing but data centers, which many people don’t realize. We see a trend of migrating assets and services into the cloud. Sometimes enterprises view this only as a cost-saving tool, but cloud migration has benefits beyond reducing opex and capex. Numerous studies have shown that enterprises can reduce their carbon emissions and energy consumption by moving to the public cloud.
A public cloud strategy may not be right for every company. Still, one study found that if every company moved to the cloud, the effect would be equivalent to removing 20 million cars from the streets. Not many people realize how efficient consolidated, optimized infrastructure can be.
That does not mean the story ends there, because the architecture is constantly evolving. We now have the on-premises-to-edge-to-cloud continuum. Services will still be hosted on servers or in a private cloud, but we will see increasing use of edge computing infrastructure alongside the core cloud.
How will this affect energy consumption? Edge computing reflects the elastic nature of technology. At first, you want everything moved to a central cloud infrastructure. Then people realize that, to achieve optimal latency and user experience, some services or portions of services must be hosted at the edge of the network. If lots of data is being stored and processed at the edge, you don’t need to send it to the cloud. Transmission is actually more energy-intensive than the actual processing. By keeping data local, you can provide a better computing experience due to the low latency, and you can also achieve better privacy, greater flexibility, and possibly lower energy consumption.
Another analogy is that the edge is becoming the coprocessor. In the beginning, there was the main CPU; if you needed to speed up a specific computational function, you used a coprocessor, a special-purpose processor. Now the cloud is the CPU, and the edge is the coprocessor. This allows selective optimization of different services while data comes directly from the customer.
This continuum must be carefully managed, because it has a strict energy balance. New applications will keep arriving, so new energy budgets must be considered, and we also need to find more optimization technologies.
Abhijit: We are currently looking at how different technologies can aid or adversely impact sustainability. What are your thoughts on the edge? Do you have more data?
Dr. Nair: Data collection is growing. One of the major discussions currently underway is sentient computing versus energy-efficient computing. Sentient refers to when you “sensorize” the entire world: smart cities, smart apps, smart transportation. Internet of Things (IoT) devices are being sown all over the ecosystem. It is then up to you to decide whether to collect all of this data and send it back to the cloud to be processed.
Connecting an IoT device to the cloud results in longer latency — for example, 100 milliseconds instead of the 10 milliseconds of an edge server. An IoT device communicating with the cloud also uses more power than one communicating with an edge server; in our example, the 10-millisecond edge path consumes roughly one-tenth of the energy. Many of these IoT devices are energy-constrained: they are charged once and cannot be replenished.
We must also ensure that these devices don’t stay on longer than necessary. You should schedule the sensors to switch on and off as needed. This kind of close control and manipulation can only be achieved if the control is within reach, at the edge.
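The edge-offload and duty-cycling ideas above can be combined into a back-of-envelope calculation. This is a minimal sketch: the per-message energy costs, message count, and duty cycle below are hypothetical placeholder figures (only the roughly one-tenth edge-versus-cloud ratio comes from the example in the interview):

```python
# Back-of-envelope sketch of IoT radio-energy savings from edge offload
# plus duty cycling. All constants are hypothetical placeholders.

CLOUD_ENERGY_PER_MSG_MJ = 10.0  # millijoules per message sent to the cloud
EDGE_ENERGY_PER_MSG_MJ = 1.0    # ~1/10th, matching the 100 ms vs. 10 ms example
MESSAGES_PER_DAY = 10_000

def daily_energy_mj(per_msg_mj: float, messages: int, duty_cycle: float = 1.0) -> float:
    """Energy spent on transmissions per day.

    duty_cycle is the fraction of time the sensor is awake and reporting;
    switching sensors off when not needed scales the message count down.
    """
    return per_msg_mj * messages * duty_cycle

cloud = daily_energy_mj(CLOUD_ENERGY_PER_MSG_MJ, MESSAGES_PER_DAY)
edge = daily_energy_mj(EDGE_ENERGY_PER_MSG_MJ, MESSAGES_PER_DAY)
edge_duty_cycled = daily_energy_mj(EDGE_ENERGY_PER_MSG_MJ, MESSAGES_PER_DAY, duty_cycle=0.25)

print(f"cloud: {cloud:.0f} mJ/day")                           # 100000 mJ/day
print(f"edge:  {edge:.0f} mJ/day")                            # 10000 mJ/day
print(f"edge + duty cycling: {edge_duty_cycled:.0f} mJ/day")  # 2500 mJ/day
```

Under these assumed numbers, moving the radio traffic to the edge and waking the sensor only a quarter of the time cuts transmission energy by a factor of 40, which is the kind of close control Dr. Nair describes.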
The push to “software-ize the stack” is evident if you look at the entire ecosystem. It means you can run different applications as software systems on general-purpose computing platforms. This results in consolidation and a smaller hardware footprint, which leads to less energy consumption and lower carbon emissions.
New ecosystems for the edge are being created, even from a business perspective. Data centers owned by a single party, such as AT&T or Equinix, can be large entities, but at the edge a data center can be small. In the beginning, the industry thought these micro data centers would be individually owned. It turns out that micro data centers can have multiple providers and vendors occupying the same space. The ecosystem is changing, and edge units will be shared by many different parties. This is all about minimizing resource consumption.
The back-end algorithms are therefore more complex and difficult to develop. However, once an algorithm is created as a single effort, it can be distributed across any number of units, and resource sharing becomes possible.
Abhijit: Let’s talk about virtualization. Virtualization has been a key initiative in making the data center more resilient, even after years of evolution. What do you think of virtualization tools and platforms? Is it at a point where it no longer needs to be part of the conversation? Are there any more studies in this field?
Dr. Nair: Take a look at the history and evolution of the SMU AT&T Center for Virtualization. We first started to talk about virtualization while working on telecom virtualization projects for AT&T. Then we realized it would be more interesting to expand the scope. We decided to make this a virtualization center, so we talk about enterprise virtualization, telecom virtualization, and user experience virtualization.
Let’s discuss the role of virtualization in sustainability.
One concept is now mainstream: digital twins. We should be able to virtualize any hardware artifact for all of the characteristics that are important or salient to us. Digital twins are used to virtualize entire naval warships.
We won’t virtualize every nut and bolt; instead, we decide the level of granularity we want to see and virtualize to that level.
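The chosen-granularity idea can be sketched in code. This is a purely illustrative example, not from the interview: the class, attributes, and threshold are hypothetical, and the point is only that a twin models the salient properties of a physical artifact rather than every nut and bolt:

```python
# Minimal sketch of a digital twin at a chosen granularity: we model only
# the attributes salient to our purpose (temperature, RPM), deliberately
# omitting individual fasteners. All names and values are hypothetical.

from dataclasses import dataclass

@dataclass
class EngineTwin:
    """Coarse-grained twin of a physical engine."""
    temperature_c: float
    rpm: float

    def update(self, sensor_temp_c: float, sensor_rpm: float) -> None:
        # Sync the twin's state from live sensor readings.
        self.temperature_c = sensor_temp_c
        self.rpm = sensor_rpm

    def overheating(self, limit_c: float = 95.0) -> bool:
        # A salient property we can query without touching the real engine.
        return self.temperature_c > limit_c

twin = EngineTwin(temperature_c=20.0, rpm=0.0)
twin.update(sensor_temp_c=101.5, sensor_rpm=2400.0)
print(twin.overheating())  # True
```

The design choice mirrors what Dr. Nair describes: the fidelity of the twin is set by which properties you need to study, not by the full complexity of the artifact.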
Let’s take flight simulators as an example of how useful this is. We work with companies that make flight simulators for the US Air Force.
Student pilots can spend many hours training in a simulator, learning various maneuvers and situations. A majority of commercial pilots have done at least 80% of their training in a simulator before they ever get into a cockpit. Although it may sound unsettling, they can simulate all possible scenarios and learn maneuvers to escape different danger zones. A simulator allows a pilot to crash several times without being hurt, which would be fatal if the pilot were actually flying the plane. We don’t realize how much carbon emission is avoided by not putting the pilot in a real aircraft while they train. This is an example of a digital twin of the cockpit.
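The avoided-emissions point can be made concrete with a rough calculation. The training hours, fuel-burn rate, simulator power draw, and grid intensity below are hypothetical illustrative assumptions, not figures from the interview; only the 80% simulator share comes from the text above:

```python
# Rough illustration of CO2 avoided by shifting pilot training hours from a
# real aircraft to a simulator. All numbers are hypothetical assumptions.

TRAINING_HOURS = 100            # assumed total training hours per student pilot
SIM_FRACTION = 0.8              # share done in the simulator (the "80%" above)
FUEL_BURN_KG_PER_HOUR = 600     # assumed fuel burn of a training aircraft
CO2_PER_KG_FUEL = 3.16          # kg CO2 per kg of jet fuel burned
SIM_KWH_PER_HOUR = 50           # assumed simulator electricity draw
GRID_CO2_KG_PER_KWH = 0.4      # assumed grid carbon intensity

sim_hours = TRAINING_HOURS * SIM_FRACTION
avoided_flight_co2 = sim_hours * FUEL_BURN_KG_PER_HOUR * CO2_PER_KG_FUEL
sim_co2 = sim_hours * SIM_KWH_PER_HOUR * GRID_CO2_KG_PER_KWH
net_avoided = avoided_flight_co2 - sim_co2

print(f"net CO2 avoided per pilot: {net_avoided:.0f} kg")  # 150080 kg
```

Even with the simulator’s own electricity use subtracted, the avoided flight emissions dominate under these assumptions, which is the intuition behind the digital-twin-in-the-cockpit example.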
Now, think about other design projects: cars, planes, the artifacts on your desk. You can create digital twins of the models and study their properties and characteristics before manufacturing them. This saves time and energy.
We also advocate virtualization in education. Entire physics and chemistry labs can be virtualized. Although a student might not be able to touch a test tube, she can experience the virtual lab and see how things interact. She can complete 80% of her work in the virtual lab and do the rest in a physical lab. That alone decreases chemical consumption.
I can offer two other examples: cybersecurity, and supervisory control and data acquisition (SCADA). To teach these skills, we sometimes need to purchase expensive equipment from companies like GE and Siemens, and we then use that equipment to find security holes and ways to defend against them. Virtualizing these large boxes lets us simulate them in all their detail, so students can play with them and learn about their security features. Virtualization plays a huge role here. Although it may sound futuristic, digital twins are already being used at some scale, from the design and development phase through the deployment cycle.
Is it ready to go mainstream? Yes, it’s definitely happening. Technologies like AI help automate many processes that don’t require human intervention.
Energy does not refer to just fuel or nuclear energy; it also includes human energy. Efficiency must be attained at all levels.
Abhijit: Thank you, Dr. Nair, for taking the time to share your insights on the work you do in this space.